Going On the Edge

Leti’s CEO talks about edge AI, FD-SOI and other topics.


Emmanuel Sabonnadière, chief executive of Leti, sat down with Semiconductor Engineering to talk about artificial intelligence (AI), edge computing and chip technologies. What follows are excerpts of that conversation.

SE: Where is AI going in the future?

Sabonnadière: I am a strong believer that edge AI will change our lives. Today’s microelectronics are organized with 80% of things in the cloud and 20% on the edge. Five years from now, it will be reversed. It will be 80% on the edge and only 20% in the cloud. There is some rationale telling us that it will go in this direction. It is a question of the privacy of data.

SE: We’ve heard a lot about the edge. What will drive it?

Sabonnadière: The big driver will probably be the healthcare business. The healthcare business is a grey zone today. This is where AI and edge AI will completely change the relationship you have with a system.

SE: Can you give us an example?

Sabonnadière: I have one specific case. This is diabetes. We can do a blood sugar measurement. We have a sensor, which is non-intrusive. We are able to measure it and report the blood sugar in comprehensive detail. We send the data into the AI system. The AI system makes 60% of the decisions. But even at 60%, the system knows what to decide and which insulin protocol you need to inject. Eventually, the AI system will be solid enough to decide on a permanent basis what needs to be injected. That’s how I see AI completely changing the story. It will be driven by healthcare in particular, because the demand is everywhere.
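The decision loop described here, in which the on-device system handles a majority of cases and defers the rest, can be sketched in a few lines. This is a minimal illustration only; the function name, thresholds, and correction rule are hypothetical stand-ins for a learned model, not a real clinical protocol.

```python
# Hypothetical sketch of the edge decision loop described above.
# All names and thresholds are illustrative assumptions, not medical advice.

def suggest_insulin_units(glucose_mg_dl: float) -> tuple[float, bool]:
    """Return (suggested insulin units, confident) for one sensor reading.

    A deployed system would run a trained model on-device; a simple rule
    stands in for it here. confident=False marks the cases the system
    defers to a human, mirroring the "makes 60% of the decisions" split.
    """
    if glucose_mg_dl < 70:        # low reading: never dose, defer to a human
        return 0.0, False
    if glucose_mg_dl <= 180:      # in range: no correction needed
        return 0.0, True
    if glucose_mg_dl <= 300:      # elevated: illustrative correction rule,
        # 1 unit per 50 mg/dL above 150
        return round((glucose_mg_dl - 150) / 50, 1), True
    return 0.0, False             # extreme reading: defer to a clinician

# A reading of 250 mg/dL yields a confident 2.0-unit suggestion
units, confident = suggest_insulin_units(250)
```

The key design point is the explicit confidence flag: the edge device acts autonomously only inside the envelope it can handle, and escalates everything else.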

SE: This could be a big driver in other areas of healthcare, right?

Sabonnadière: We have two big departments in Leti. One is in medicine. The other is more on wellness. In both, the success is linked to an AI system. You receive data. You decode the data and determine what it means for your health. Then, you will react. You may decide to lose weight. You may decide to sleep more. You may decide to take some medications or not. You will make decisions based on the AI.

SE: Google, Facebook and others are deploying a subset of AI, called machine learning, on the cloud. Most machine learning algorithms run as software on GPU-like processors, which consume a great deal of power. So we need to bring the processing closer to the systems in the so-called edge. Do we need specialized devices for the edge?

Sabonnadière: What we want is to have enough power inside to make sure the edge AI will run and collect data. That will deliver a solution through edge AI without any connection to the cloud. For that, you need to strike a balance between CMOS, which does the calculations, and the synapse, or neuromorphic computing. It requires low-power sensors that will collect the data. Those are 3D architectures we are designing either with Imec or at Leti. We believe you need less power in the CMOS and more connections between the CMOS and the neural network or the memories. For neural networks, there are two schools. One is the convolutional one. At Leti, we believe in spiking technology. This is probably the one with the most interesting future. It learns from the start. From there, it’s incremental and well controlled. We’ve demonstrated the technology. You probably need less CMOS density, but you have more in the memory in the neural network.
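The spiking approach mentioned here is commonly built from leaky integrate-and-fire (LIF) neurons, which accumulate input over time and emit discrete spikes rather than computing dense multiply-accumulates every cycle. A minimal sketch of one such neuron, with illustrative parameter values, looks like this:

```python
# Minimal leaky integrate-and-fire (LIF) neuron, the basic unit of the
# spiking approach. Parameter values (leak, threshold) are illustrative.

def lif_run(inputs, leak=0.9, threshold=1.0):
    """Integrate a sequence of input currents; emit a spike (1) whenever
    the membrane potential crosses `threshold`, then reset it to zero."""
    v = 0.0
    spikes = []
    for i in inputs:
        v = leak * v + i          # leaky integration: old charge decays
        if v >= threshold:
            spikes.append(1)      # fire
            v = 0.0               # reset after the spike
        else:
            spikes.append(0)
    return spikes

# A steady input of 0.4 charges the membrane until it fires periodically:
# lif_run([0.4] * 6) -> [0, 0, 1, 0, 0, 1]
```

Because a silent neuron does no work, power scales with spike activity rather than with network size, which is what makes the approach attractive for always-on edge devices.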

SE: What types of technologies do you need here?

Sabonnadière: That allows me to use an FD-SOI platform on the CMOS part. I’m not jumping on 7nm or 5nm finFETs. I prefer to stay with FD-SOI, probably at sub-10nm. Today, with FD-SOI, we are at 18nm with Samsung. Probably, 12nm is in the air. But we have to think about sub-10nm. Let’s say you have sub-10nm FD-SOI on your chip. Then, you add a well-designed and optimized neural net using spiking technology. And then you have one or two sensors. You cut the power on it. You could have an MCU at less than 1 watt. That will be big. You could integrate several different sensors, such as MEMS sensors, biosensors and all of these kinds of things.

SE: What are the challenges with all of this?

Sabonnadière: A neural network is a nightmare today. It’s not high quality. People are playing with some statistical methods. There are still things that need to be invented here.

SE: What’s the solution?

Sabonnadière: We need a lot of academics. I am pushing them to work in the cloud to fix the AI system with the right math, with large amounts of data and computing algorithms. Once that system is good, I can move it down to the edge. Then, it will be hosted by well-designed electronic devices and chips. This new AI system will re-learn with the local data, and with much less data.

SE: You will need different architectures for this, such as advanced package types and chiplets, right?

Sabonnadière: Absolutely.

SE: Sounds like there is a lot of work to do.

Sabonnadière: Yes. I’m pretty sure for the next 10 years we will talk about AI and edge AI.


Stanimir Valtchev says:

A TFET could also be part of the solution, as its operating power is less than one hundredth of what CMOS consumes…

Gilbert De Guzman says:

It will need specific technology development.

Mark LaPedus says:

TFETs were topical several years ago. See:

Tunnel FETs Emerge In Scaling Race

But TFETs have fallen off the roadmap. Too many challenges. Everyone is working on gate-all-around.
