Are Digital Twins Something For EDA To Pursue?

Part one: Defining the digital twins concept; the trouble with models; the issue with the ecosystem.


‘Digital twins’ are one of the new, fashionable concepts for system developers, but do they fit with EDA? How many different types of engines do these twins run on – abstract simulation, signal-based RTL simulation, emulation, prototyping, actual silicon? And what should the use models for digital twinning be – reproducing silicon bugs in emulation, or optimizing an implementation using data gathered safely in virtual and FPGA-based prototyping? Semiconductor Engineering sat down to discuss these issues with Wade Smith, applications engineering manager for high-frequency electromagnetics at Ansys; Dave Kelf, chief marketing officer at Breker; Frank Schirrmeister, senior group director of product management at Cadence; Jean-Marie Brunet, senior director of marketing for emulation at Mentor, a Siemens Business; Raik Brinkmann, CEO at OneSpin; and Marc Serughetti, senior director, product marketing, automotive verification solutions at Synopsys. Mike Gianfagna, vice president of marketing at eSilicon, also weighed in after the fact. What follows are excerpts of that discussion.

SE: How does the industry look at the concept of digital twins today? Why is it necessary, and how should we implement it?

Smith: The way I imagine digital twins is this: let’s consider a 5G system. If you ponder it a bit, it’s not only the chip. It’s the digital baseband with its modulation systems, all the way back to the server sections (the edge compute sections and the big cloud systems), out through the front-end antenna systems and the environmental systems. It’s the concept of trying to look at that whole thing, to monitor it as one big system, and to control that 5G system through, potentially, a digital network. Just looking at the antenna portion: if you have a base station talking to microcells around an urban environment, what’s the best placement of the microcells? How do you know they’re transferring data correctly? How do you know if one goes out and the others don’t? Given that it costs a lot of money to send a technician up to fix an antenna on a base station, how do you do predictive maintenance on that antenna? That type of environment, a full digital twin of a 5G communication system, is where I see how big this can get, and that’s not even including multiphysics.

Brunet: The timing [of this] is very good, because we just announced something in that space, targeted first at automotive, which is a pre-silicon verification environment. The digital twin concept is very real at Siemens, and we see it as an opportunity as soon as you have any system of systems: anything that senses something, makes decisions or computes, and actuates. It doesn’t have to be in the automotive space. We started this before we were acquired by Siemens, and we’ve continued to acquire companies like TASS for the PreScan software. We have all the ingredients for a full digital twin story.

Brinkmann: We’re an IC integrity verification company. IC integrity means functional correctness, functional safety, and trusted security. These three things go together to make a system what it’s supposed to be, on the functional side and in other dimensions. When I think of a digital twin, what comes to mind is a car. When we look at the way we are doing these chips today, differentiation will come through customization; that’s the next big trend. IP configurability has been happening all over the place, but what’s next is customization. Customization goes with the digital twin because if you have that high degree of customization, you have a very high degree of variance in the market. If you’re deploying systems, every one will be different. So when you look at safety, for example, or at security specifically, inevitably there will be a point where that property is breached at some design component, and what you need to do is understand which of the systems in the field are actually impacted by it. The digital twin is not just a model of a generic car or a generic thing; it’s the image of ONE physical car in your system, and in order to track all that you need to start with technology that allows you to trace it.
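Brinkmann’s traceability point can be made concrete with a minimal sketch, assuming a simple per-unit registry; the Python below is purely illustrative, and every class and field name is hypothetical rather than any vendor’s API.

    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class Component:
        """One design component at a specific revision, e.g. an IP block."""
        name: str
        revision: str

    @dataclass
    class UnitTwin:
        """Digital twin record for ONE physical unit in the field."""
        serial: str
        bom: set = field(default_factory=set)  # exact components this unit shipped with

    class FleetRegistry:
        """Maps every fielded unit to its exact, customized configuration."""
        def __init__(self):
            self.units = []

        def register(self, twin):
            self.units.append(twin)

        def impacted_by(self, affected):
            """Serial numbers of all units containing the breached component."""
            return [u.serial for u in self.units if affected in u.bom]

    # Hypothetical example: two cars with different customizations; a security
    # issue is found in revision "B" of the crypto block, so only car-002 is hit.
    registry = FleetRegistry()
    registry.register(UnitTwin("car-001", {Component("crypto", "A"), Component("adas", "3.1")}))
    registry.register(UnitTwin("car-002", {Component("crypto", "B"), Component("adas", "3.1")}))
    print(registry.impacted_by(Component("crypto", "B")))  # ['car-002']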

Schirrmeister: Virtual twins, in my mind, are right in the trajectory of EDA, but should we still call it EDA? It’s just system design. That’s why, for me, with this notion of ‘Is it for EDA?’ I think EDA is part of it. If you look at what we do with intelligent system design, it starts with core EDA and the core verification engines, expands into software and into system design activities, which include multiphysics, and then it all goes into this pervasive intelligence piece, which includes AI for our tools but also enabling AI. Now, digital twins: to introduce Marc here to my right (we used to be colleagues), the first time I saw the value of a digital twin in a big way was a press release with Mazda in the automotive space about virtual prototyping. They didn’t want their test drivers exposed to the extreme corner cases a car needs to be able to deal with. That’s the first time I really realized we need a complete digital representation with which you can do, in the virtual world, the things you really don’t want to do in real life. I agree with Jean-Marie: it’s here to stay. It’s not just the latest trend. What’s fascinating is that it’s elevating EDA at the core into new areas, and I don’t feel anybody can do it by themselves, so there’s an ecosystem of people; I’m making lots of new friends. On the software side, I’m working with people who can do full-system simulation, albeit at a much higher abstraction, but then they need the detail, so we’ll plug one of our emulators into it at the lower level and have mixed-fidelity approaches and so forth. So I think it’s here to stay, and it has different applications: safety, as Raik and Wade said earlier, and then even pure verification, where the digital twin we have in emulation, in system-level environments, or in prototyping helps users take a problem back from silicon, go back to the front end, debug it, and figure out what went wrong. So it’s at different levels.
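Schirrmeister’s ‘mixed fidelity’ idea can be pictured with a small sketch, assuming the rest of the system talks to a block through one interface; this Python is an invented illustration, not Cadence’s tooling, with a plain class standing in for the emulator session.

    class AbstractDSP:
        """Fast, loosely timed functional model: fine for full-system runs."""
        def process(self, sample):
            return sample * 2  # idealized behavior, no implementation detail

    class DetailedDSP:
        """Stand-in for a high-fidelity engine (think: an emulator session).
        Slower, but faithful where the abstract model cuts corners."""
        def process(self, sample):
            return min(sample * 2, 255)  # imagine RTL-accurate saturation here

    def run_system(dsp, stimulus):
        """The surrounding system doesn't care which fidelity it talks to."""
        return [dsp.process(s) for s in stimulus]

    stimulus = [10, 100, 200]
    print(run_system(AbstractDSP(), stimulus))  # quick exploration: [20, 200, 400]
    print(run_system(DetailedDSP(), stimulus))  # detailed check:    [20, 200, 255]
    # The mismatch on the last sample is the kind of corner case the
    # higher-fidelity engine is plugged in to expose.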

Serughetti: There’s a lot I would simply repeat from what has been said, but I’d like to add a couple of things. The first thing I like in the title is ‘digital twins,’ twins being plural, because there’s no concept of one digital twin that will serve everybody. I think that’s very important. Whether we talk about functional safety, electromagnetics, or software, each will need its own definition as well. The key in digital twins is not really the digital twin itself; it’s the question you’re trying to answer with it, and I think that’s what’s important. That’s what drives what the digital twin is. The value of the digital twin for the industry is relatively clear: it’s about shifting development to the left, doing things earlier and better without having to create a real prototype that can put people in danger when you talk about safety, or, as you go toward larger systems, things you cannot actually build. That, to me, is the digital twin. To really think about a digital twin, the first thing you have to do is think about the design or verification question you are trying to answer with it. Digital twins have existed for a long time in individual domains, or sometimes multiple domains, but with the trends we see, all the systems are becoming interconnected and communicating with each other. The problem of designing and verifying the system has become even bigger, and as a result it’s natural for the digital twin to be there and to continue.

Kelf: Between all these definitions, we’ve got it pretty well covered. I want to go back to what Wade said about 5G systems. One big customer of Breker is Huawei, and the first time I tripped over digital twins was with some of their big base stations and the basebands running on them. What they were trying to do was model not just the platform device itself, but the entire environment around it, and in the wireless world that’s quite a meaty job. In fact, it’s harder to model the environment than it is to model the thing itself, but you absolutely need it. They were producing essentially a virtual platform of this baseband with a complete environment around it, and the environment would model all kinds of atmospheric effects, etc. Then, when something went wrong in the field, they would tweak this environment to see if they could duplicate what was happening, and find the bug really quickly rather than sending someone out to the Sahara Desert with a toolkit to try to figure out what’s going on. What’s interesting, and what they were talking to us about, was how you actually model digital twins. Twins is the operative word; there are multiple different twins. What they were doing was looking at the Portable Stimulus side, of course, and trying to determine if they could use Portable Stimulus as a specification model, use that to model a twin, then drive that into the virtual platform implementation and so on, and use that to set up an environment around it. How you model it and what you model, the environment as well as the actual platform itself, is absolutely critical.
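Kelf’s debug loop, tweaking the modeled environment until a field failure reproduces, can be sketched the same way; the link-budget numbers and threshold below are invented for illustration and are not Huawei’s or Breker’s actual models.

    def link_margin_db(tx_power_db, attenuation_db):
        """Toy link-budget model standing in for the twin's environment."""
        return tx_power_db - attenuation_db

    def reproduce_field_failure():
        """Sweep an atmospheric parameter until the twin drops the link too."""
        for attenuation_db in range(90, 121):  # hypothetical rain-fade range, in dB
            margin = link_margin_db(tx_power_db=110.0, attenuation_db=attenuation_db)
            if margin <= 3.0:                  # invented receiver sensitivity threshold
                return attenuation_db          # environment that reproduces the dropout
        return None

    # Once the symptom reproduces (here at 107 dB), debugging can continue in
    # the twin instead of in the Sahara Desert.
    print(reproduce_field_failure())  # 107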

Gianfagna: At eSilicon, we see a scenario in our IP business that has elements of a digital twin strategy. Whenever we build a new, high-performance and complex IP block, a new SerDes for example, we will tape out at least one, sometimes several test chips to validate the performance of the design over operating conditions and stress factors. These test chips become the “silicon source of truth” for us. We also deliver them to customers for their validation of the performance of our IP in the context of their system. We recently conducted a webinar that describes how we built a high-precision test board to allow customers to test our SerDes in their system. The digital twin concept comes in when someone tapes out a chip that uses our IP. During the bring-up phase, there are often anomalies to be debugged. If a piece of IP is suspected, we can go back to the test chip, set it up to emulate the conditions of the actual chip and take a closer look at what’s going on. This helps to isolate potential issues. We do this with partner IP as well as our own, by the way. This process helps a lot in the bring-up/debug phase of a new complex ASIC.

Smith: I like the digital twins comment that was made, because the digital twin has been a giant push for Ansys. It links a lot of the multiphysics together, but it also links a lot with our PLM software and all the data management and model management that comes into this. I’ve heard we sometimes call it a ‘digital family,’ because I think we’ve been doing digital twins for a long time. Back in a previous life, we used HFSS to analyze a multi-band antenna. That was a digital twin, because we were going to build it, shove it in an anechoic chamber, and measure it.


Schirrmeister: It’s interesting that, as was said, it’s really the plural that matters, because there’s a wide spectrum. Whenever I talk to somebody about digital twins, I always have to level-set on what we actually mean. It has real applications, like for health in a machine, such as the health of an airplane. There are cool demonstrations of a digital twin of the ADAS piece where you basically see virtually how it drives, and it’s better to drive through red lights and kill people virtually than to do it in real life while you are verifying the system. But then it goes all the way down to the digital twin of the chip, where things like formal technologies and Portable Stimulus play a role, because it’s a digital twin for verification. Moving back and forth, I replicate the data of the real thing (the chip) in my verification environment earlier on, same principle, shifting left as Marc said.

SE: How much of this is really embraced today?

Brunet: We have actually worked a lot on this, first in the automotive space. The digital twin concept, I think, is very much embraced, although at the OEM level in automotive they don’t have a clue how to do it yet. I think the biggest issue in this space is models. In the IC domain we understand model abstraction and simulation very well, but in this domain, a system of systems running from the OEM to the ECU to the chip guys, the chip guys understand models while the others don’t understand them very well. A model doesn’t mean the same thing to them, and they don’t exchange models. So digital twins is a wonderful concept, and it’s being embraced, but it’s an ecosystem challenge: they do not exchange models at all.

Serughetti: Today, they do exchange models.

Brunet: They are going towards…

Serughetti: I think for many years you’ve been able to see OEMs driving the requirement for models through the supply chain.

Brinkmann: You have to have some sort of model that you can exchange between these upstream companies.

Serughetti: Yes. It goes back to different domains at different levels of the digital twin, I think. If you start looking at virtual prototyping types of technology, that has been going on for many years already between the semiconductor companies and the Tier 1s. If you look from the OEM toward the Tier 1, and you think about Simulink-type tools, those have been there for many years. There are RFQs coming down from the OEMs that don’t leave you any choice: you have to deliver models. But what we see is, again, within a domain, and the problem is how you start bringing all of these together.

Brinkmann: One interesting point is what’s changing. We have been using virtual models for a long time, but the game changer is that the lifetime of the virtual model now extends to the lifetime of the actual deployed system. That’s the big difference, and for EDA it’s important because our tools will be used over the lifetime of that product. It’s not just a design tool; it’s a lifetime thing. It’s like product lifecycle management, and verification, as an example, becomes a lifecycle activity. That’s a game changer for us in the EDA industry, because we can deploy our solutions for a much longer term than just the time our customers spend designing the product.


