Imec’s chief strategy officer explains how ICs are being used today, and how they will impact medical treatment in the future.
Jo De Boeck, chief strategy officer and executive vice president at imec, sat down with Semiconductor Engineering to talk about the intersection of medical and semiconductor technology, what’s changing in how chips are being used, and what will happen in the short term and the long term. What follows are excerpts of that discussion.
SE: Medical technology never advanced at the rate everybody thought it would, largely due to regulatory hurdles. Where are we now, and how much progress has been made behind the scenes?
De Boeck: Going back 15 years, we were walking around with early prototypes of what we then called silicon-based biosensors. Those are basically any devices with a surface that reacts when a biospecimen comes close. We discovered that resistors, capacitors, and transistors all were sensitive biosensors. We missed a few important things, though. First of all, we didn’t have good insight into why and how these devices would have an impact. We started from a technology push of what silicon could do, and we didn’t have a tradition of dialogue with the medical or life sciences fields. That has markedly improved over the past decade.
SE: In what way?
De Boeck: We have invested quite a bit in the dialogue with the clinical and research fields around this. That also includes pharma, because there’s a lot of need for process analytics and diagnostics there. The sheer power of integrated electronics as we know it, then and today, comes with high degrees of complexity, parallelism, and commoditized functions. So there are compute functions, imaging, and complex biosensing. And we now recognize that the key features for success in the broad life sciences and medical domains align with the advanced semiconductor roadmap — high complexity, parallelism, precision, and also lowering the cost and commoditizing things that typically sit in the laboratory — and then moving that out to doctors, specialists, and patients.
SE: How did massive compute power play into this?
De Boeck: A nice example of the impact of the massive power of integrated electronics and photonics is the roadmap of DNA sequencing, where the cost of doing that sequencing in a reasonable amount of time came down fast. Advanced silicon integration process technology definitely makes a difference, both on the devices that ‘read’ the DNA fragments and on the computing power needed to piece the read fragments back together.
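To make the "piecing the reads back together" step concrete, here is a minimal, purely illustrative sketch of greedy overlap assembly in Python. It is not imec’s pipeline or any sequencer vendor’s software, and the reads, function names, and parameters are hypothetical; production assemblers use far more sophisticated graph-based methods at vastly larger scale.

```python
# Illustrative only: greedy assembly of short DNA "reads" by repeatedly
# merging the pair with the longest suffix/prefix overlap.

def overlap(a: str, b: str, min_len: int = 3) -> int:
    """Length of the longest suffix of `a` matching a prefix of `b`."""
    start = 0
    while True:
        start = a.find(b[:min_len], start)  # candidate anchor position in `a`
        if start == -1:
            return 0
        if b.startswith(a[start:]):
            return len(a) - start
        start += 1

def greedy_assemble(reads: list[str]) -> str:
    """Merge reads pairwise, always taking the largest overlap first."""
    reads = list(reads)
    while len(reads) > 1:
        best_len, best_i, best_j = 0, 0, 1
        for i, a in enumerate(reads):
            for j, b in enumerate(reads):
                if i == j:
                    continue
                olen = overlap(a, b)
                if olen > best_len:
                    best_len, best_i, best_j = olen, i, j
        if best_len == 0:                       # no overlaps left; just concatenate
            reads[0] += reads.pop()
        else:                                   # merge j onto i, dropping the shared part
            merged = reads[best_i] + reads[best_j][best_len:]
            reads = [r for k, r in enumerate(reads) if k not in (best_i, best_j)]
            reads.append(merged)
    return reads[0]

# Four overlapping 10-base reads reassemble into a 19-base fragment.
print(greedy_assemble(["ATTAGACCTG", "CCTGCCGGAA", "AGACCTGCCG", "GCCGGAATAC"]))
# -> ATTAGACCTGCCGGAATAC
```

The real workload involves billions of reads, which is where the brute-force compute discussed next comes in.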
SE: The latter is really brute force computing, right? We are relying on hyperscale capacity to do things like drug discovery and vaccine development.
De Boeck: Yes, and brute force does help, but you need to make sure the software pipeline you have for analyzing the data can keep up. We are shifting parts of the problem to compute, and brute force is one requirement. Another is precision. There is something to be said for being precise about the types of measurements needed for life science and health care. One example in the life science discovery domain is the neural probe. People in the neuroscience community have been very active in their search for the new “microscope” to unravel the complexity of the brain. They are looking at the brain circuitry and what is happening in terms of connectivity inside the brain. It is very hard to understand what a chip is doing by measuring the activity of a single transistor, so you’re better off trying to find sub-structures in the circuitry. We’re more or less trying to do the same thing with the brain to understand how it functions, in order to complement the fantastic progress in understanding the molecular principles guiding brain function and neurodegenerative conditions. A neural probe was developed to simultaneously measure thousands of neurons. This was pioneered in the rodent brain and became the gold standard in neuroscience. It’s going to be hard to publish any paper that looks at brain circuitry and functionality without using this type of probe. We can be specific in terms of which neurons to measure and which connectivity to assess based on that measurement. The neuroscience community is still coming to terms with how to make the most of that new level of insight, but it’s going to be more specific, more precise, and give more detailed data. Silicon technology, guided by a leading neuroscience consortium, was there to kick-start this revolution.
Fig. 1: Neural probes, version 1 (top) and version 2 (bottom), which can monitor neural activity over a period of weeks. Source: imec
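As a purely illustrative aside (this is not imec’s or any lab’s analysis code), one simple way to look for the "sub-structures" mentioned above in thousands of simultaneously recorded neurons is to correlate binned spike counts across channels. The data, the injected "sub-circuit," and the threshold below are all synthetic assumptions.

```python
# Illustrative only: spot a correlated group of neurons in multi-channel data.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 50 neurons, 10,000 time bins of binned spike counts.
n_neurons, n_bins = 50, 10_000
spikes = rng.poisson(0.1, size=(n_neurons, n_bins)).astype(float)

# Inject a shared drive into neurons 0-9 so they form a correlated "sub-circuit".
spikes[:10] += rng.poisson(0.3, size=n_bins)

# Pairwise correlation matrix: a crude stand-in for functional connectivity.
corr = np.corrcoef(spikes)

# Neurons whose activity tracks neuron 0; should recover roughly neurons 0-9.
print(np.where(corr[0] > 0.5)[0])
```

Real analysis pipelines add spike sorting, latency analysis, and statistical controls, but the basic move of reducing thousands of channels to a connectivity estimate is the same.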
SE: Any surprises when you really started analyzing the brain?
De Boeck: It’s interesting when engineers start making that comparison, making it sound like the brain is actually a computer, with memory and storage. In the brain, everything functions together. A very good example of where the neural probe currently is very useful is in understanding how we process novelty. The brain is processing images, but somehow it has to store what it has seen. We’re very good at recognizing what we have seen. We have a neuro-electronics research facility, which we started with our colleagues at the Institute of Biotechnology here in Flanders, Belgium, and there is a group there looking at novelty. Place cells play a key role in how we recognize where we’ve been and where we are. The brain is processing all of this on the spot, and then during a rest period, it’s re-processing that information — sort of like dreaming it up again. It’s strengthening the connections by redoing the learning process, strengthening those synapses and connections throughout the brain. Such technology-assisted research will find new ways of verifying many hypotheses around brain function and degeneration. This allows the field to make leaps in progress in understanding how the brain works and how pathologies can impact those circuits. There’s a connection between molecular disorders in the brain and function, and that connection is through the circuitry. There’s a lot we still have to discover about the interplay of molecular activity and circuit formation supporting the actual functioning of the brain.
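The "replay strengthens connections" idea can be caricatured with a toy Hebbian update, sketched below. This is a didactic illustration only, not a model used by the researchers mentioned here; the pattern, network size, and learning rate are arbitrary assumptions.

```python
# Toy Hebbian caricature of "replay": links between co-active units are
# strengthened each time a stored activity pattern is replayed during rest.
import numpy as np

n = 8                                                        # tiny circuit of 8 units
weights = np.zeros((n, n))                                   # synaptic strengths
pattern = np.array([1, 1, 1, 0, 0, 0, 0, 0], dtype=float)    # what was "seen"

learning_rate = 0.1
for _ in range(20):                                          # each pass is one replay event
    weights += learning_rate * np.outer(pattern, pattern)    # co-active units wire together
    np.fill_diagonal(weights, 0.0)                           # no self-connections

print(weights[:3, :3])    # strong links among the co-active units
print(weights[3:, 3:])    # untouched links among the silent units
```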
SE: How much progress have you made?
De Boeck: This is still in research. But to connect the molecular side to the circuitry, we’re looking at brain cells on a multi-electrode chip surface, using the complexity we can bring in the chip both to connect to cells and to program stem cells, for instance. The idea is that we can build a circuit of neurons on the silicon chip surface. Using electroporation through the electrodes and injecting biological vectors through microfluidics, we can program regular cells to become stem cells again. Then, with the right stimuli, we can program those induced pluripotent stem cells to become neurons of a certain type. We have a project running with experts in cell biology to master the process of building, or mimicking, the neuronal circuitry of a human brain on a chip. The idea is that you could make circuits at will, and then see how a circuit with a potential disorder or altered connectivity will function. Moving from the molecular level to the circuit level is a great step forward. We currently are doing this on a flat silicon chip surface, but we will venture into 3D soon. It’s far from commercial at this point, but it helps the scientists and the pathologists come closer to understanding these conditions.
Fig. 2: Ingestible sensor. Source: imec
SE: Isn’t that pretty long-range?
De Boeck: Yes, and with technology that comes this close to humans, it’s always difficult. We’ve had 15 years of R&D on wearables. Now we’re moving into ingestibles and implantables, which has been a very engineering-intensive effort. You need a smart design with extremely low power consumption to make measurements on and through the skin for things like biomarkers for certain pathologies and conditions. We started that program before wearables were available. Today, wearables at least give you an indication of health status, and one of our recent startups is working on a very promising product for continuous glucose monitoring, which is one of the holy grails.
SE: That’s a big one for wearables, right?
De Boeck: Yes, and the key there is that it’s one of the very few measurements that allow the patient to be in the loop. If you prick yourself and take a measurement, then as a patient you know what to do. You don’t need to call a doctor or bring in an expert. But there are very few of those situations. Blood pressure is notoriously difficult to measure precisely without a mercury column. I can have an indication that my blood pressure is okay and the measurements are relatively good, but then my blood pressure medication is prescribed and I just take what I need to take. That’s not a closed loop.
SE: Because you’re only getting a reading at one very specific point in time?
De Boeck: Yes, and this is something the medical profession is still debating. You really need to have continuous blood pressure, but it’s hard to obtain those measurements. It requires high specificity and continuous measurement.
SE: There has been talk about a lab on a chip for years. Is there any progress there?
De Boeck: You can do a lot on a chip, but that hasn’t yet replaced the need for the lab. You need diagnostics, high precision, and reasonable turnaround — in some cases a couple of days is okay, but in others it is not. You also need low contamination, high specificity, and a small sampling size. This means that you need to have the microfluidics under control. You need to have all of the measurement capabilities that you have in a lab miniaturized to an extreme level. And you need cartridges that are cheap. Our teams have been very active in building the technology for microfluidic processors. You can imagine how it functions — the necessary mixing, separating, and analyzing functionality that you would have in large lab equipment coming into a cartridge, which could have a relevant price point. We have a start-up working on productizing this technology, and the ability to do a full blood panel or a rapid COVID-like test for any early phase of a disease is a very appealing market proposition. Getting this technology into consumers’ hands will still be a challenge, but a lot of the technology has evolved to the point where we can see the parallel with commoditizing complex compute and communication functions, once the precision, the cost, and the use cases come together.
SE: Microfluidics has been discussed for quite some time. What else can it be used for?
De Boeck: The early work was on micro-cooling. Micro-cooling is still part of our current research program, in the overall setting of the semiconductor roadmap. It’s part of 3D integration technology, and we have an active program for implementing micro-cooling with microfluidics. You need to have a volume of coolant to cool the surface. But a microfluidic processor used to analyze biospecimens requires another type of microfluidics. It’s a different technology. One would need 3D printing to form channels and a 3D buildup, whereas the other would use surface technologies that could be made with lithography tools, or nanoimprint on a large scale. You need microfluidics to reach a certain precision level so that implants can start using it. Then it becomes manufacturable at a larger scale, and the price goes down.
SE: This may be necessary for something like Google Glass, where you’re wearing it next to your head and you want it to generate very little heat. But does any chip in your body need to be cooled, and cooled evenly?
De Boeck: In the brain it would be detrimental to raise the temperature. In those solutions you have to go really, really low power, which is one of the tracks they all follow. So low-power design is extremely important. Cooling in the body is another thing, and maybe we could use the body’s cooling system or the body’s fluids, although we don’t have any research program there. With the sheer power you want on a next-generation smart AR/VR glass, you also may want some sort of active cooling to be part of it, but the better approach is to avoid the heating and go low-power at the system level. System technology co-optimization will be needed. If you ask any of the algorithm or software people, or any of the chip design or technology engineers, separately what they would want to do to make future AR/VR become reality, their disconnected individual solutions would be a disaster for whoever is wearing it.
SE: You bring up an interesting idea here, which is co-optimization. It’s no longer just one thing that’s going to solve the problem. You have to design the software, the hardware and the application together.
De Boeck: When we first started making biosensors, we made a mistake, because the result was way too expensive. So we started talking to the clinicians, the end users, and the pharma companies to understand their process and what they do. When they deal with a patient and want to get some results back, what is it that they do? And how important is it to do that fast and with precision? That’s just one parameter. There may be a zillion of those. Then we go to the drawing board, and in the end you’re still making a chip. But the chip has a specific application in mind. Many of our technologists in a clean room don’t worry about designing anything. They’re focused on how thin a conformal layer in a nano-device should be, for example. We still have challenges at the materials level, but we also have to understand how the way we design will impact choices in technology, and vice versa. When I make transistors smaller or in a different shape, I may have new design choices that I didn’t have before. This may very well trickle up to the algorithmic level, and the algorithm defines the application. All these elements need to go hand in hand. In the future, chips will not be made in isolation. And that’s very much the way we innovate for health-related topics, as well.
SE: For years we’ve designed for functionality, basically to get the chip working. Now we’re starting to design for data throughput and processing where it makes sense. This is a fairly radical shift. The next challenge is to optimize these devices in ways we’ve never even thought about before, and that will vary by market. So what are the pharma companies and researchers in health care really asking for?
De Boeck: That’s correct. It used to be PPAC — power, performance, area/cost. Currently, we’re adding ecology and sustainability. And you want to understand how the circuits are made, but PPAC doesn’t say much about which algorithm you’re using. And so power and performance will depend on the choices you make at the algorithmic level. With health care, our medication in the future will be very personalized. That is a major shift. Today we are taking medication that works for a small portion of a large test community, and which doesn’t hurt the rest. It’s almost medieval. If you go to personalized medicine, you will need very flexible production, as well as high throughput because of cost. But you also will need high precision, and maybe a control mechanism that will be extremely challenging for manufacturing. So we may go to smaller bioreactors that manufacture a personalized medication. There may be a dozen sensors required around such reactors. That requires process analytics, which today is done on a sample from the bioreactor, and if it doesn’t fit the specs you discard the whole vessel’s contents. That won’t be an option in the future, and as a result we’ll need to raise the level of process analytics and control in pharmaceutical and biotechnology production. And for cell therapy, it will be critical, because if you inject one wrong cell, the patient may die. There’s a lot of work to be done in testing and supporting that vision of individualized, personalized therapy.
SE: Let’s dig under the surface here. We’ve been able to take AI, machine and deep learning, and optimize it for a specific use. What we have not been able to do is say, “It’s optimized now, and it will continue to optimize if your body changes.”
De Boeck: We’re looking at processes that are still distant from humans. When somebody makes a diagnosis, they have to make a fast decision to deliver personalized therapy, whether that’s medication or something else. When you’re looking at cancer, which is a very complex disease, in many cases the tumor evolves. You’re shooting at it with precise weapons, like a very specific targeted medication. That tumor may duck and resurface, so you need to follow the patient through remission to find that hidden enemy that may be lingering somewhere. In many cases we need to shorten the loop between the medical check-ups and the patient-oriented action, as with blood sugar monitoring or a pacemaker. You want to be able to monitor a patient’s response to a certain therapy, and to be able to change that therapy as needed. That’s coming. But it requires a very, very detailed interplay of diagnostics, the constantly improving insights into optimal personalized therapy, and ways of administering it.
SE: What happens in terms of medical training? A doctor is used to making diagnoses using all sorts of analog skills. Now you’re taking an analog world and infusing some digital processing in there. Should the understanding of these new tools and their capabilities be part of the curriculum for medical students?
De Boeck: Yes. When a doctor walks in to see a patient, they typically ask questions like, “Did you sleep? What was your diet? Are you under stress?” That’s going to be detailed much more precisely in the doctor’s digital data sheet before a patient walks in. The doctor will still have the dialogue — there is always going to be the human factor — but they can simultaneously benefit from the actual measurements. When something is very clearly diagnosed, you need to understand the impact of therapies, such as the effect of chemotherapy or radiation on other organs. General practitioners are not experts in every single organ, and specialists may not always have a holistic view of the patient. Medical practice is evolving rapidly by making a systemic, holistic view a reality. And that’s where AI and data, and making it more science than art, will come in. We still will need many highly skilled medical professionals, but they will be trained differently and will find technology to assist and suggest, even more than today.
SE: We’re getting faster, more accurate, and much more granular than what we’ve been able to do in the past, right? That’s really what the technology offers.
De Boeck: Yes, but more granular with a view of impact on the whole system.
SE: So are you looking at whole-body simulation, like we do for chips, on a massive scale?
De Boeck: Yes. That’s where the concept of the digital twin comes in.
SE: Going back to the pacemaker example you gave, today it’s extremely difficult to replace a battery. We’ve had energy scavenging technology capable of providing enough energy for a full duty cycle for years. Why hasn’t that been adopted? We’re still seeing surgeries where people pull out the batteries and put new ones in.
De Boeck: We had an energy scavenging program ourselves for about a decade. We gave up on that partly because energy scavenging inside a human body is very difficult and could not outperform battery technology. The reliability was a tough issue to tackle. Vibration is a very good way of harvesting, but unfortunately not a very good one in the human body. One cannot easily get the power and reliability to the level at which one would be comfortable implanting it. The technology may still evolve, but we don’t put a lot of emphasis on it. External powering has become quite relevant, and we’re looking at new ways of doing that, too. We also have a smart pill program where we double down on miniaturizing the ultra-low-power wearable technology. Right now, power is an issue because the size of the pill will basically be the size of the battery, so we need to address that. Beyond that, the next big challenge will be changing practices in the clinical context, which is extremely hard.
SE: We obviously need to get smaller chips with much lower power. These are fairly pricey units. Will we get to the point where we have a basic platform with interchangeable chiplets, or will each design continue to be unique?
De Boeck: We’ve tried to build a sensor platform, as many others have, where you integrate as much functionality as you can to make it a multi-purpose vehicle in order to bring the cost down. We have many sensor functionalities, with some sensor fusion on a processor core. We’ve seen it adopted in certain smart applications. The hard part will be building a roadmap where the reuse of many of the components, or chiplets, could make the market accessible. There needs to be a high gain, or profit, or return on what you measure. You need to be clinically convinced that what you measure leads directly to something that will help a person, whether that’s life-threatening, life-changing, or life-supporting. This means it could be a companion device for a very personalized drug. Just imagine the drug you need to take is so costly that reimbursement schemes are prohibitive. But if you can build a platform that helps these expensive drugs be tested first — like an organ-on-chip type of approach, so when the drug is administered it has a clear benefit — that will be really important to proving that a personalized drug is doing what it needs to do.