SPONSOR BLOG

Making Sense Of EDA And Digital Twins

How does the digital twin translate to different industries?


There is a new buzzword in town: "digital twins." I have been using it for a while now in the context of system-on-chip (SoC) verification, as well as a little more broadly when it comes to security issues for data in general. There are some differences in emphasis across vertical domains, based on when digital twins are used during the life cycle, which use models are desired and what scope of data is to be consumed. Depending on the combination of these, the fidelity, capacity and speed of the underlying tools really matter. Let's try to decipher this situation.

About a year ago, I had a discussion with Brian Bailey on this topic ("Digital Twins Deciphered") and pointed him to the definitions of "twin" and "thread" found in an aerospace article by David Grasso at Capgemini. One can probably up-level this to a digital twin being "the virtual representation of a physical object or system across its life cycle."

At Cadence, our Intelligent System Design strategy targets eight vertical domains: consumer, hyperscale, mobile, communications, automotive, aerospace and defense, industrial, and health. Looking into each vertical in more detail, the prominent examples vary widely but also show some patterns.

  • In Health applications, digital twins seem to be focused on operational aspects like hospital processes and digital data as it relates to a patient, but also on modeling of the human body to optimize surgeries (Blue Brain Project, Living Heart Project). This relates back to the dataset about a patient: understanding specific blood pressure patterns, for instance, is obviously much more critical than what I already monitor on a personal level, like my sleep, my pulse during the day and during workouts, and my step count.
  • In Industrial applications, the focus is the industrial internet of things, often referred to as "Industry 4.0" (see my Embedded World write-up). As in health applications, digital representations of the objects themselves are used, as well as of the processes to develop them (like an industrial fab). Also important is managing the data needed to optimize the production process and, later, the device's operation, which allows predictive maintenance. The analogies to health applications are pretty obvious.
  • In Aerospace and Defense applications, the focus shifts a little toward the data collected during the actual life of a physical object and its application to a digital model of that same object, which allows the optimization of its use and, as in industrial applications, can be used predictively to estimate when parts need to be replaced. Safety is another aspect, particularly for certain dangerous tests that can be done virtually in a safer fashion. Even a flight training simulator can be considered a digital twin.
  • Similarly, in Automotive, the application of digital twins spans from development and production to optimization (think sensors in racecars) throughout the life cycle, in which car manufacturers collect data about your car to know when a part might fail and when you are due for your next inspection.
  • In Communications, the networks for fixed and wireless connections are replicated as virtual models. Load balancing can be optimized by simulating the number of end users like you and me as we use our devices for video, audio, apps, etc.
  • In Mobile, device development itself is heavily supported by digital twins, specifically with virtual prototypes for software development and detailed modeling to understand 3D, power and thermal aspects.
  • In Hyperscale applications for data centers, again the thermal and power aspects within racks, for instance, can be modeled, analyzed and optimized. In addition, the data center topology can be virtualized and modeled in a digital twin, including the IT, the racks, the cooling and the power distribution.
  • Consumer applications are perhaps the most fun digital twins when taking into account the consumer portion of the IoT, allowing gamification of step counting, virtual avatars, etc. A virtual Michael Jackson in a Cirque du Soleil production qualifies as well, as do TV hosts that are literally joined by their digital representations. Thinking big, digital twin models can extend to full cities, like this example of Singapore.
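Predictive maintenance comes up in several of these verticals, for machines and humans alike. As a minimal sketch of the idea (the sensor, the units and the failure threshold here are all invented for illustration), a twin can fit a trend to a degradation signal and extrapolate to estimate when a part will cross its limit:

```python
# Minimal predictive-maintenance sketch: fit a least-squares linear trend
# to a degradation signal and extrapolate to a (hypothetical) threshold.

def hours_until_threshold(readings, threshold):
    """readings: list of (hour, value) samples. Returns the estimated hours
    from the last sample until the linear trend crosses the threshold,
    or None if the signal is not trending toward it."""
    n = len(readings)
    mean_t = sum(t for t, _ in readings) / n
    mean_v = sum(v for _, v in readings) / n
    num = sum((t - mean_t) * (v - mean_v) for t, v in readings)
    den = sum((t - mean_t) ** 2 for t, _ in readings)
    if den == 0:
        return None
    slope = num / den
    if slope <= 0:
        return None  # not degrading toward the threshold
    intercept = mean_v - slope * mean_t
    t_fail = (threshold - intercept) / slope
    return max(0.0, t_fail - readings[-1][0])

# Vibration amplitude drifting upward by 0.5 per hour from 1.0.
samples = [(h, 1.0 + 0.5 * h) for h in range(10)]
print(hours_until_threshold(samples, threshold=10.0))  # → 9.0
```

Real deployments would of course use far richer models, but the pattern is the same: field data feeds a model, and the model answers "when should this part be replaced?"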

While at first sight there seem to be many different types of digital twins, on closer examination they have a lot in common. At the core, there is the physical object, its virtual digital counterpart, and the data that connect them. There is overlap in use models, such as predictive maintenance for humans and machines alike. The object's environment can itself become a digital twin, like the hospital operation dealing with a patient or the air environment dealing with an airplane. Given that the scope in terms of system complexity and the amount of data to be managed vary widely, the model of the object needs very different representations in terms of fidelity.
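That common core, a physical object, its virtual counterpart and the data connecting them, can be sketched as a simple data structure. All names here are illustrative, not any product's API:

```python
from dataclasses import dataclass, field

# Illustrative sketch of the common core of a digital twin:
# a physical object, its virtual model, and the data stream linking them.

@dataclass
class DigitalTwin:
    asset_id: str      # identifies the physical object
    model_state: dict  # current state of the virtual counterpart
    telemetry: list = field(default_factory=list)  # data connecting the two

    def ingest(self, reading: dict) -> None:
        """Feed a field measurement into the twin and update the model."""
        self.telemetry.append(reading)
        self.model_state.update(reading)  # trivial stand-in for a real model

twin = DigitalTwin("pump-07", model_state={"temp_c": 20.0})
twin.ingest({"temp_c": 23.5})
print(twin.model_state["temp_c"])  # → 23.5
```

What differs across the verticals is not this shape but the fidelity of `model_state` and the volume of `telemetry`.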

For instance, a 5G endpoint like a cell phone will be represented by a much smaller dataset and a much more abstract simulation within the network for load balancing. In contrast, detailed thermal and power analysis requires dynamic data from detailed descriptions of a chip within its system environment, like emulation of Verilog descriptions of an SoC, to drive detailed bottom-up power analysis from technology information at the .lib level.

Combining both, such as trying to utilize a detailed representation of an SoC that is primarily used for verification and software development, within a digital twin environment that tries to predict aspects of a system over years of its life cycle, is not practical. The intended use model is simply too different.

While hybrid use cases, like verifying a new SoC within its bigger system environment, surely will emerge in the future, the right combination of model fidelity is crucial for proper execution. I have described some of this in the past in "Digital Twins For Hardware/Software Co-Development" and in a paper at GOMACTech 2019 ("System Emulation and Digital Twins in Aerospace Applications").

And how does EDA fit in? I was part of an interesting roundtable with Ann Mutschler at DAC 2019 called "Are Digital Twins Something for EDA to Pursue?" My answer was a clear "yes." There are lots of aspects within core EDA already, while others are adjacent growth areas for EDA.

For one, many of the design representations used during development (see the graphic associated with this blog post for some of the verification tools) are the foundation for more abstract models used later in the product life cycle. They can also be "re-activated" during bring-up, or even after shipment, to reproduce defects found in the field.

It's a brave new parallel world of digital twins. Many aspects still need to be figured out. Combining AI with all of this is a natural next step. Many questions of privacy, security and safety need to be worked through; otherwise we risk experiencing the "Revenge of the Digital Twins" that I wrote about a while back. The upside in productivity and in new ways to optimize products during their life cycle, and to make them safer with predictive maintenance, however, clearly outweighs the risk.


