How does the digital twin translate to different industries?
There is a new buzzword in town, “digital twins.” I have been using it for a while now in the context of system-on-chip (SoC) verification, as well as a little more broadly when it comes to security issues for data in general. There are some differences in emphasis across different vertical domains, based on when digital twins are used during the life cycle, which use models are desired and what scope of data is to be consumed. Depending on the combination of these, the fidelity, capacity and speed of the underlying tools really matter. Let’s try to decipher this situation.
About a year ago, I had a discussion with Brian Bailey on this topic (“Digital Twins Deciphered”) and pointed him to the definitions of “twin” and “thread” as found in an aerospace article by David Grasso at Capgemini. One can probably up-level this to a digital twin being “the virtual representation of a physical object or system across its life cycle.”
At Cadence, our Intelligent System Design strategy is targeting eight vertical domains—consumer, hyperscale, mobile, communications, automotive, aerospace and defense, industrial and health. Looking into each vertical in more detail, the prominent examples vary widely, but also show some patterns.
While at first sight it seems like there are a lot of different types of digital twins, on closer examination they have a lot in common. At the core, there is the physical object, its virtual digital counterpart and the data that connect them. There is overlap in use models, such as predictive maintenance for humans and machines alike. The object’s environment can become a digital twin as well, like the hospital operations around a patient or the air environment around an airplane. Given that the scope in terms of system complexity and the amount of data to be managed vary widely, the model of the object requires very different levels of fidelity.
For instance, within the network, a 5G endpoint like a cell phone will be represented by a much smaller dataset and a much more abstract simulation for load balancing. In contrast, detailed thermal and power analysis requires dynamic data from fine-grained descriptions of the chip within its system environment, such as emulation of the Verilog description of an SoC, to drive bottom-up power analysis using technology information at the .lib level.
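To make the fidelity contrast concrete, here is a minimal, purely illustrative Python sketch of two twin models of the same device. All class and field names are invented for this example and do not correspond to any Cadence tool or standard API: the abstract model carries just enough state for network-level load balancing, while the detailed one accumulates per-block power from the kind of switching activity an emulation run would produce.

```python
# Illustrative sketch only: two hypothetical fidelity levels for a twin of the
# same device. Names and numbers are invented for this example.

from dataclasses import dataclass


@dataclass
class AbstractEndpointTwin:
    """Coarse model: enough for network-level load balancing."""
    avg_throughput_mbps: float
    duty_cycle: float  # fraction of time the radio is active

    def offered_load(self) -> float:
        # Load the endpoint presents to its cell, in Mbps
        return self.avg_throughput_mbps * self.duty_cycle


@dataclass
class DetailedPowerTwin:
    """Fine-grained model: per-block power from activity data, e.g. from emulation."""
    leakage_mw: dict[str, float]            # static power per block (from .lib-level data)
    dynamic_mw_per_toggle: dict[str, float] # energy cost per toggle, per block
    toggles: dict[str, int]                 # switching activity per block in this window

    def window_power_mw(self) -> float:
        return sum(
            self.leakage_mw[b] + self.dynamic_mw_per_toggle[b] * self.toggles[b]
            for b in self.leakage_mw
        )


if __name__ == "__main__":
    phone = AbstractEndpointTwin(avg_throughput_mbps=150.0, duty_cycle=0.2)
    print(f"Offered load: {phone.offered_load():.1f} Mbps")

    soc = DetailedPowerTwin(
        leakage_mw={"cpu": 12.0, "modem": 8.0},
        dynamic_mw_per_toggle={"cpu": 0.002, "modem": 0.003},
        toggles={"cpu": 5000, "modem": 3000},
    )
    print(f"Window power: {soc.window_power_mw():.1f} mW")
```

The point of the sketch is simply that both objects describe the same phone, but one is a handful of numbers suitable for simulating thousands of endpoints, while the other needs detailed activity data that only a fine-grained representation can supply.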
Combining both, for example trying to utilize a detailed representation of an SoC that is primarily used for verification and software development within a digital twin environment that tries to predict aspects of a system over years of its life cycle, is not practical. The intended use models are simply too different.
While hybrid use cases, like verifying a new SoC within its bigger system environment, surely will emerge in the future, the right combination of model fidelities is crucial for proper execution. I have described some of this in the past in “Digital Twins For Hardware/Software Co-Development” and in a paper at GOMACTech 2019 (“System Emulation and Digital Twins in Aerospace Applications”).
And how does EDA fit in? I was part of an interesting roundtable with Ann Mutschler at DAC 2019 called “Are Digital Twins Something for EDA to Pursue?” My answer was a clear “Yes, they are.” Many of the relevant aspects sit within core EDA already, while others are adjacent growth areas for EDA.
For one, a lot of the design representations used during development—see the graphic associated with this blog post for some of the verification tools—are the foundation for more abstract models used later during the product life cycle. They can also be “re-activated” to reproduce issues seen during bring-up, or even after shipment to debug defects that are found in the field.
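As a hedged illustration of what such “re-activation” could look like in principle, the sketch below replays a captured field trace against the same behavioral model that was used during verification. The trace format, function names and the toy model are all invented for this example; a real flow would drive the actual RTL in simulation or emulation rather than a Python stand-in.

```python
# Hypothetical sketch: replay a stimulus log captured in the field against the
# model used during verification, to reproduce an issue on the desk.
# File format, names, and the toy model are invented for illustration.

import json


def load_field_trace(path: str) -> list[dict]:
    """Read a captured trace: a list of input transactions recorded in the field."""
    with open(path) as f:
        return json.load(f)


class SimpleCounterModel:
    """Stand-in for a design model; a real flow would drive RTL instead."""
    def __init__(self) -> None:
        self.count = 0

    def apply(self, txn: dict) -> int:
        if txn["op"] == "incr":
            self.count += txn.get("amount", 1)
        elif txn["op"] == "reset":
            self.count = 0
        return self.count


def replay(trace: list[dict]) -> list[int]:
    """Drive the model with the recorded transactions and collect its responses."""
    model = SimpleCounterModel()
    return [model.apply(txn) for txn in trace]


if __name__ == "__main__":
    # In place of a real field log (which load_field_trace would read), use an inline trace.
    trace = [{"op": "incr"}, {"op": "incr", "amount": 3}, {"op": "reset"}, {"op": "incr"}]
    observed = replay(trace)
    expected = [1, 4, 0, 1]
    print("reproduced field behavior" if observed == expected else "mismatch", observed)
```

The value of keeping the verification-time representation around is exactly this: the same model, driven by data collected later in the life cycle, becomes a debug and analysis vehicle long after tape-out.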
It’s a brave new parallel world of digital twins. Many aspects still need to be figured out. Combining AI with all of this is a natural next step. Many aspects of privacy, security and safety need to be worked through, otherwise we risk experiencing the “Revenge of the Digital Twins” that I wrote about a while back. The upside in productivity and new ways to optimize products during their life cycle, and to make them safer with predictive maintenance, however, clearly outweighs the risks.