Digital Twins Gaining Traction In Complex Designs

Improvements are still needed in the integration of data and tools, but faster multi-physics simulations are becoming essential for context-aware optimization and reliability.

The integration of heterogeneous chiplets into an advanced package, coupled with the increasing digitalization of multiple industry segments, is pushing digital twins to the forefront of design.

The challenge in these complex assemblies is figuring out the potential tradeoffs between different chiplets and different assembly approaches, and doing it quickly enough to still hit market windows. This includes the obvious power, performance, and area/cost tradeoffs, but it also increasingly includes mechanical engineering, the impact of various manufacturing processes on different materials, and the need to both zoom in and zoom out on different segments of a design.

“Think of the transformation that the cell phone has brought into our lives, as has the present-day migration to EVs,” said Sherry Hess, senior product management group director for multi-physics system analysis products at Cadence. “These products are not only feats of electronic engineering, but of mechanical engineering, too. Electronics find themselves in new and novel forms, whether a foldable phone or a flying car. Here, engineering domains must co-exist and collaborate to bring about the best end-product possible.”

All of the top EDA companies have embraced multi-physics as a way of leveraging their tools beyond just chips, with a heavy focus on large-scale simulations. “What about drop testing, aerodynamics and aero-acoustic effects? These largely CFD and/or mechanical multi-physics phenomena also must be accounted for,” Hess said. “Then, how does the drop testing impact the electrical performance? The world of electronics and its vast array of end products is pushing us beyond pure electrical engineering to be more broadly minded and develop not only heterogeneous products, but heterogeneous engineering teams, too.”

Connecting physics processes and data together within the user ecosystem is known as chaining. “In automotive design, chaining in a welding-to-crash simulation means that we can do the very fine simulation of the welding process, addressing different parameters, and take into account the result of the simulation into crash in order to see if the weld will fail in a crash,” said Emmanuel Leroy, chief product and technology officer for the ESI Group at Keysight. “By doing this we can adjust and refine the type of weld we want to use, the process parameters, the number of spot welds. This is breaking silos in OEMs organizations, and it includes concurrent engineering concepts such as how we do manufacturing and engineering together.”
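
The mechanics of chaining can be sketched in a few lines of code. The example below is purely illustrative; the function names, data fields, and thresholds are invented rather than taken from any ESI/Keysight tool. What it shows is the essential move: the end state of the fine-grained welding simulation is handed to the downstream crash simulation as its starting point.

```python
from dataclasses import dataclass

# Hypothetical sketch of "chaining": the output of a fine-grained welding
# simulation becomes an input to the downstream crash simulation. All names
# and numbers here are invented for illustration.

@dataclass
class WeldResult:
    weld_id: int
    residual_stress_mpa: float   # residual stress left behind by the welding process
    nugget_diameter_mm: float    # weld nugget size resulting from the process parameters

def simulate_welding(process_params: dict) -> list[WeldResult]:
    """Stand-in for the detailed welding-process simulation."""
    return [
        WeldResult(weld_id=i,
                   residual_stress_mpa=180.0 + 5.0 * i,
                   nugget_diameter_mm=5.5)
        for i in range(process_params["num_spot_welds"])
    ]

def simulate_crash(weld_states: list[WeldResult], impact_speed_kph: float) -> dict:
    """Stand-in for the crash simulation, seeded with the per-weld state."""
    # A weld weakened by high residual stress is flagged as a likely failure point.
    failures = [w.weld_id for w in weld_states if w.residual_stress_mpa > 200.0]
    return {"impact_speed_kph": impact_speed_kph, "failed_welds": failures}

# Chain the two domains: welding output feeds the crash model, so process
# parameters (spot-weld count, weld current) can be adjusted and re-checked.
welds = simulate_welding({"num_spot_welds": 8, "current_ka": 9.0})
report = simulate_crash(welds, impact_speed_kph=56.0)
print(report)
```

In a production flow the two stand-ins would be full solvers, but the handoff of per-weld state is the part that breaks the silo between manufacturing and crash engineering.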

A number of industries, like automotive, have extensively used digital twins to build mechanical systems. “What’s changing is that people are transforming all of this into products that use electronics,” said Marc Serughetti, vice president, product management and applications engineering at Synopsys. “They use electronics because it’s more efficient from an energy standpoint, it’s safer, or it has the ability to be upgraded. All of those products are evolving toward electronics, and more importantly, toward software-defined products. That gives them the ability to change the capabilities, and to bring in new capabilities along with this entirely new business model. That’s the trend in the market — software-defined products. So thinking about the digital twin in that context, if all of this is done with electronics, why don’t we have a digital twin part of the electronics? Then, how you bring those things in becomes really important. All of this pertains to a market that does control systems, in which you’re looking at something in the physical world and trying to control it. If you want to validate or understand what’s happening, it means you have to simulate the electronics in the context of the system.”

Digital twins also open the door to architectural exploration, whereby engineering teams can examine multiple things simultaneously to see how they behave together, and what happens if something is changed in one part of the architecture.

“The last time we had a physical representation in EDA was Rubylith, but since then everything’s virtual and digital,” said Neil Hand, product marketing director at Siemens EDA. “We’ve gone from domain-specific digital twins to more inclusive ones, and we’ve linked in manufacturing and gotten bigger and bigger. Now we’re starting to say we’ve got a chip-level digital twin, then we’ve got the 3D-IC, which is starting to bring mechanical and thermal into it, and we’re going to get to the product level. You can start to look at the tradeoffs. You take a best-case scenario today in which someone wants to do a custom semi for a unique application. You’re a systems house, you’ve got all the money in the world, you’re patient. You have an architecture, you’re writing up a spec, giving it to your IC group or to a different IC group. You’re plodding along, going down the V, and everyone’s doing their implementation. Then you’re putting it all back together and hoping it works. Once we have a more connected, fully interoperable digital twin, you can start to make tradeoffs.”


Fig. 1: Product lifecycle management using a digital twin approach. Source: Siemens EDA

This is a massive shift, enabled by more and better data, coupled with significantly more compute power. “This is going to be an evolution — but a revolution at the same time — because now we have to bring the electronics in, and suddenly the electronics are 50% of the product,” Serughetti said. “You can’t ignore that part, and connecting those two worlds in some type of system is essential. We find this in automotive. Today, everybody’s talking about automotive, but it also happens in aerospace and defense, as well as industrial. In an electronic product, there are three things you’re doing. You’re controlling something, you’re sending information, or you’re communicating that information to somebody through a UI. Those are the basic functions, and we have to do this in a safe way and in a secure way. How do you validate all those things, especially the electronic parts, in that context?”

Digital twins and shift left
EDA companies agree that digital twins are necessary for progress in multiple industry sectors. “You have to have digital twins,” said John Ferguson, director of product management at Siemens EDA. “You can’t get there blindly without it anymore, and you can’t do it without a shift left approach. They go hand in hand. When it comes to system design — particularly in 3D, and the aspects of 3D where you have multi-physics involved — then it gets especially tricky, because you have interdependencies among everything.”

The basic concepts here are not new. “We’ve had this concept of digital twin for some time, but they’ve really been scoped as individual aspects that can be treated independently, and now we can’t do that anymore,” Ferguson said. “Thermal impacts stress. Stress and thermal impact electrical behavior. But electrical behavior impacts thermal. That puts an extra onus on digital twins, and you can’t avoid having to do this view of the world where you understand the intricate interplays and account for them as you go. This also implies the shift left aspect of it. It’s not just a ‘one and done’ anymore. You have to think of it all together. That’s the nut we’re trying to crack here.”
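
That interplay is essentially a coupled-solve loop. The toy Python sketch below, with coefficients invented purely for illustration, shows its shape: power drives temperature, temperature drives stress and leakage, leakage feeds back into power, and the loop repeats until the values stop changing.

```python
# Toy fixed-point iteration over the coupled loop described above: power raises
# temperature, temperature raises stress and leakage, leakage raises power.
# The coefficients are invented for illustration, not calibrated to anything.

def thermal_solve(power_w: float) -> float:
    """Junction temperature rises with dissipated power (toy linear model)."""
    return 25.0 + 0.8 * power_w

def stress_solve(temp_c: float) -> float:
    """Thermo-mechanical stress grows with the temperature swing (toy model)."""
    return 2.0 * (temp_c - 25.0)

def electrical_solve(temp_c: float) -> float:
    """Leakage makes total power climb with temperature (toy model)."""
    return 50.0 + 0.15 * (temp_c - 25.0)

power = 50.0
for iteration in range(100):
    temp = thermal_solve(power)
    stress = stress_solve(temp)
    new_power = electrical_solve(temp)
    if abs(new_power - power) < 1e-6:   # converged: the three views agree
        break
    power = new_power

print(f"converged after {iteration} iterations: "
      f"{power:.2f} W, {temp:.1f} C, {stress:.1f} MPa (toy units)")
```

Real tools replace each one-line model with a full field solver, which is why running this loop repeatedly, and early in the flow, is the hard part.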

Cadence’s Hess gave the example of the latest push for higher-performing HBMs and AI data center expansion. “These high-bandwidth memories are growing from several layers to 12. These layered electronics are powered, and power creates heat. Heat needs to be understood, and thus thermal integrity issues uncovered along the way must be addressed. But the electronic-thermal issues are just the first domino in a chain of interdependencies. What about the thermal stress and/or warpage that can be caused by the powering of these stacked devices? And how does that then lead to mechanical stress and even material fatiguing as the temperature cycles high and low through the use of the electronic device? This is just one example in a long list of many.”

Digital twins also will be essential for the adoption of chiplets, where different chiplets can be swapped in and out to determine the impact on the behavior of a multi-chiplet system.

“New multi-chiplet designs are being developed now for the next generation of AI accelerators, CPUs, and networking chips,” said Tony Chan Carusone, CTO at Alphawave Semi. “These designs integrate the latest CMOS logic technologies with memory and connectivity chiplets, and sometimes additional peripheral chiplets, all within the same package. They push the limits of thermal heat dissipation, signal integrity, power integrity, mechanical reliability, and logic performance. Each factor can interact with and affect the others, making it challenging to optimize a design. For instance, an improvement in signal integrity might compromise mechanical stability, or repartitioning logic across chiplets for performance gains might lead to localized heating issues, affecting reliability.”

What’s important to note is that digital twins are part of a dynamic process. “It’s not a single step,” noted Lang Lin, principal product manager at Ansys. “In a chip simulation, we get all the data from the simulation. But for the manufacturing process, it’s different. You have your substrate ready and you start to solder another one on top of it, but you increase the temperature to probably 300 degrees to attach the two chiplets to each other. Then you cool it down. Then there’s an annealing process, down to perhaps minus 40 degrees. Then the next step comes in, you put more die on top of the other. Digital twin is a great concept to emulate each step one-by-one. It has to capture the status at the end of each step and use that as the initial condition for the next step, and that challenges our traditional simulation tools.”
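
A minimal sketch of that step-by-step idea might look like the following, where the state at the end of each assembly step becomes the initial condition for the next. The state fields, temperatures, and warpage model are invented for illustration and are not tied to any particular Ansys tool or flow.

```python
from copy import deepcopy

def run_step(state: dict, step: dict) -> dict:
    """Apply one process step, starting from the state the previous step left behind."""
    new_state = deepcopy(state)
    delta_t = step["temp_c"] - state["temp_c"]
    new_state["temp_c"] = step["temp_c"]
    # Accumulate warpage with every thermal excursion (toy model).
    new_state["warpage_um"] += 0.02 * abs(delta_t)
    new_state["dies_attached"] += step.get("dies_added", 0)
    return new_state

# Illustrative assembly sequence: reflow, cool-down, anneal, attach the next die.
process = [
    {"name": "solder reflow",   "temp_c": 300, "dies_added": 1},
    {"name": "cool-down",       "temp_c": 25},
    {"name": "anneal",          "temp_c": -40},
    {"name": "attach next die", "temp_c": 300, "dies_added": 1},
]

state = {"temp_c": 25, "warpage_um": 0.0, "dies_attached": 1}
for step in process:
    state = run_step(state, step)   # end state becomes the next initial condition
    print(f"{step['name']:>16}: {state}")
```

The point is not the arithmetic but the bookkeeping: each step must inherit the accumulated thermal and mechanical history rather than starting from a clean slate, which is exactly what stresses traditional one-shot simulation tools.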

This is very different from the traditional simulation for a mechanical process, like a car or an airplane. “You have the mechanical part assembled together and the engine starts working, so you’re going to see if there is any warpage or if there are any mechanical failures,” Lin said. “But that’s the old days. In the old days we were dealing with more than millimeter size. It was a meter, or many meters. Now we’re moving down to nanometer sizes, and that calls for novel mechanical modeling approaches and new material science to build a solution. The state-of-the-art mechanical simulation right now can zoom into the micrometer scale. Let’s say two dies connect to each other. Maybe the bump pitch is about 40 micrometers. You model the bumps, or micro-bumps — thousands or millions of them. You could see the connection problems by building the whole model in micrometers. In the next five years, the problem goes inside the die, where you need to see the structure of Metal-1, Metal-2, Via-3, for example, and here’s my tiny transistor. It’s that level of nanometer scale. We work closely with foundries to enable that kind of mechanical simulation. This is purely at the cutting edge. At the end of the day, you expect a structural model of vias, TSVs, and wires that could be seen from your tiny little simulation engine, or the GUI. You will see those structures.”

EDA tooling for digital twins/system co-design
Making all of this work requires some changes in EDA tooling. How well connected are today’s EDA tools when it comes to realizing the full capabilities of digital twins and system co-design?

Synopsys’ Serughetti said the first thing to look at in this context is what is needed, and what is the problem to be solved. “If I had a super-fast RTL simulation that allows me to run Android or another software stack, and execute that in 10 seconds to boot Android, how great would that be? Unfortunately, it’s not the reality,” he said. “We’ve been in simulation for many years, and we all know that there are two paradigms — abstraction and performance. The two don’t like each other very much. If it’s too accurate, it’s not fast enough. If it is fast, it’s not accurate enough. If you look at the type of technology, that’s going to range from an RTL simulation that has its own purpose, emulation, all the way through to a virtual ECU, where the hardware can almost be abstracted to just receiving a CAN message. That’s what I care about. It’s a representation of the hardware where the CAN, which is very abstracted, can run really fast. But for the engineer who’s trying to see if there’s a problem in the CAN driver or the CAN interface, that’s not going to be good enough. That’s not accurate enough.”

The solution is better data integration. “There’s a slew of technology that exists in this area that is starting to be connected,” Serughetti said. “As such, use cases are becoming extremely important. You’re going to have people who look at performance and power validation. How is this doing? You’re not doing this on a very highly abstracted model. You need things like an emulation platform to do that type of validation — to run enough software and be fast enough. And another engineer, who’s testing the application software in that context, doesn’t really care about the underlying hardware. So they can go to a software-abstracted representation of the electronics.”

To make all of this happen, the tools need to work together, too.

“This is the crux of it, because these things are so intertwined,” said Siemens’ Ferguson. “We don’t have a way to solve it once. You have to iterate through. We have to do a multi-solve approach every time, which makes it very challenging and very difficult. How do you do that? How do you do that in a way that’s consistent across all aspects of what you need to look at, at once? It’s quite daunting.”

Ferguson sees the semiconductor design ecosystem doing a better job of recognizing what is still outstanding, but the integration needs improvement. “How do we tie them together? And how do we get to a point where everybody agrees on how to tie those things together? If everybody has a solution, but every solution gives a different answer, then we’re not in a good place. Still, all of the EDA providers are putting together their solutions. But it’s difficult to understand what is the right answer.”

Typically, chipmakers compare the results to a product that has been in the market for some period of time. “But just because you’ve been using it for a while and you haven’t seen gross failure from it, doesn’t mean it’s accurate,” Ferguson said. “It’s a very tricky situation at times. You have to decide what the golden is. You can do measurements on silicon in lots of different ways. You can measure temperatures on the silicon, you can do things to measure stresses on silicon, you can measure the electrical behaviors on silicon. But all of those have inaccuracies built in, because of how your chip behaves and fluctuates across the wafer. You might get a bad lot. How do you know that’s the one to go off of, versus a different chip on the same wafer or in a different batch of wafers? It’s very tricky. We’re all in the situation where this is something relatively new, so how do we know? We all may get probably not super different answers, but different nonetheless. How do you decide which one you’re going to rely on? I don’t know how we solve that aspect of things. Ultimately, it comes down to your manufacturer. What did they decide is their reference or golden? You’ve got to trust they’re doing the right work, and whatever tools meet their requirements of accuracy and are certified, go with that, but still be prepared. There might be something that we all missed along the way, and we go back to the drawing board one more time.”

The big picture
The chip industry recognizes the value of digital twins, particularly at advanced nodes and in heterogeneous assemblies tied to specific domains. The challenge now is optimizing all of the tools to work together faster but with the same or better accuracy than in the past.

“We see from customers that it’s more collaboration between simulation and AI,” said Keysight’s Leroy. “We talk a lot about hybrid AI and hybrid digital twins and accelerating simulation, but the democratization of simulation is really coming from a mix of AI and simulation. We don’t want to start from zero like in the past, where you take big data, you put in a lot of AI, then you don’t need any physics. You want to have smart data, so you need the physics, and you augment that with data and AI to make the right decision.”

This all happens at the confluence of electrical, mechanical, and CFD. Alphawave’s Chan Carusone believes that with the full realization of digital twin technology, it will be possible to take a holistic view of complex chiplet designs, enabling the co-optimization of cost, power, performance, and reliability. But between now and then, there is a lot of work required to make this a reality.

Related Reading
Toward A Software-Defined Hardware World
New approaches to software-defined hardware involve a rethinking of model-based systems engineering.
Can Models Created With AI Be Trusted?
Evaluating the true cost and benefit of AI can be difficult, especially within the semiconductor industry.


