EDA Looks Beyond Chips

System design, large-scale simulations, and AI/ML could open multi-trillion-dollar markets for tools, methodologies, and services.


Large EDA companies are looking at huge new opportunities that reach well beyond semiconductors, combining large-scale multi-physics simulations with methodologies and tools that were developed for chips.

Top EDA executives have been talking about expanding into adjacent markets for more than a decade, but the broader markets were largely closed to them. In fact, the only significant step in that space happened in the reverse direction, when Siemens bought Mentor Graphics in 2016 for $4.5 billion. Three things have changed fundamentally since then:

  • More leading-edge designs are domain-specific and heterogeneous, requiring a combination of skills, tools, and methodologies that very few large companies have in-house. As a result, EDA vendors are now in direct contact with those companies, as well as their suppliers, which are increasingly leveraging approaches that have proved successful in chip design and manufacturing, but on a much larger scale.
  • Increasing global competition, particularly from China, is forcing companies to dig deeper into data analytics to optimize their operations, adding more sensors at strategic places in manufacturing, leveraging AI/ML to identify patterns and anomalies in massive data sets to improve quality and yield, and restructuring their internal organizations around data.
  • Increasing digitalization of new and existing industry segments now requires tighter integration and co-design of hardware, software, and packages — and of systems of systems — to achieve optimal performance per watt. This has been talked about for years in chips, with limited success. But as tools and methodologies are aimed at new markets, they are becoming an important selling point.

In the past few years, Synopsys, Cadence, Siemens EDA, Ansys, and Keysight have been actively developing new tools, or acquiring companies with the necessary expertise, to provide multi-physics analysis and simulation — either on-premises, in the cloud, or some combination of both — with an increasing focus on machine learning to improve results and shorten the time it takes to obtain them. These investments have significantly broadened their tool chains, which now include everything from place-and-route in the context of different physical effects, to weighing tradeoffs between different chiplets and communication schemes, to modeling thermal gradients and mechanical stresses under real workloads running on prototypes.

If they are successful at tapping the total available market (TAM) opportunity, it could fundamentally change the EDA industry. Global semiconductor revenue is expected to reach $1 trillion by 2030, according to McKinsey & Co. In comparison, the electronic systems market is expected to reach $3 trillion, Omdia estimates. And industry experts believe that’s just a fraction of the total opportunity, which could be orders of magnitude larger.

Fig. 1: Compared to chips, the total available market for electronic systems will be at least three times larger, and the opportunity may be much bigger. Source: Cadence/industry data

“This is true disruption,” said Anirudh Devgan, president and CEO of Cadence. “On the IC side, we have digital twins and we have a lot of history of verification. It has to have 99% coverage and 99% accuracy. Otherwise, the chip is not going to work. So we want to bring that kind of spirit to designing digital twins for systems for cars and planes. Right now the coverage is much lower, like 20% or 30%. We can take it much higher, close to 99%, and that can really revolutionize how things are designed.”
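Coverage, in this context, is just the fraction of a defined scenario space that simulation runs have actually exercised. A minimal, hypothetical sketch of that arithmetic (the scenario bins and runs below are invented for illustration):

```python
# A minimal sketch of the coverage metric Devgan invokes: the share of
# defined scenario bins a simulation campaign has actually exercised.
# The bins and runs are invented for illustration.
from itertools import product

# Scenario space: every (speed, weather, road) combination must be hit.
bins = set(product(["low", "mid", "high"],
                   ["dry", "rain", "snow"],
                   ["urban", "highway"]))

simulated_runs = {
    ("low", "dry", "urban"), ("mid", "dry", "highway"),
    ("high", "rain", "highway"), ("low", "snow", "urban"),
}

coverage = 100 * len(bins & simulated_runs) / len(bins)
print(f"{coverage:.0f}% scenario coverage")  # ~22%, the gap Devgan cites
```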

It can also impact what is designed. “We’re also trying to do that in biology with our OpenEye [Scientific Software] acquisition,” said Devgan. “That’s especially exciting with CFD, because this new kind of algorithm can make it more accurate. Accuracy in simulation is super-critical, and that can expand coverage. We look at this as a three-layer cake. The middle layer is principal simulation, whether that’s CFD simulation of transistors or molecules, and the accuracy of that physical simulation is critical. That’s based on physics or chemistry or biology. But at the same time, there are two other layers. That simulation can run on accelerated computing. Now, with GPUs and custom silicon, that’s a whole new world. And then, on top of it, is AI orchestration. AI cannot replace principal simulation, but it can really augment it with neuro-physics, and a combination of AI and physics, to really do optimization.”
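Devgan's three-layer cake can be made concrete with a toy example. The sketch below is illustrative only, not anything Cadence ships: a cheap coarse-grid heat-diffusion solver stands in for the principal simulation, and a linear correction fitted against a handful of expensive fine-grid runs stands in for the AI layer that augments, rather than replaces, the physics.

```python
# Illustrative only: a coarse physics solver (the "principal simulation")
# augmented by a correction model learned from a few fine-grid runs.
import numpy as np

def heat_step(u, alpha=0.1):
    """One explicit finite-difference step of 1-D heat diffusion."""
    u = u.copy()
    u[1:-1] += alpha * (u[2:] - 2 * u[1:-1] + u[:-2])
    return u

def simulate(u0, steps):
    u = u0
    for _ in range(steps):
        u = heat_step(u)
    return u

rng = np.random.default_rng(0)
nc, nf = 16, 128  # coarse (cheap) and fine (expensive) grid sizes
r = nf // nc

# Training pairs: coarse result vs. fine result sampled back onto the coarse grid.
X, Y = [], []
for _ in range(50):
    u0c = rng.normal(size=nc)
    u0f = np.interp(np.linspace(0, 1, nf), np.linspace(0, 1, nc), u0c)
    X.append(simulate(u0c, 20))
    Y.append(simulate(u0f, 20 * r**2)[::r])  # r^2 more steps for explicit-scheme stability

# The "AI" layer: a least-squares map nudging coarse output toward fine output.
W, *_ = np.linalg.lstsq(np.array(X), np.array(Y), rcond=None)

u0 = rng.normal(size=nc)
corrected = simulate(u0, 20) @ W  # cheap physics plus learned correction
```

The physics still does the heavy lifting; the learned map only nudges the cheap answer toward the expensive one, which is the division of labor Devgan describes.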

So why did this take so long? “All these other domains have done their own DA — not EDA,  but what is essentially physics design automation,” said Aart de Geus, executive chairman at Synopsys. “Ansys talks about itself as a simulation company, which is anything that you put on a computer to try to predict the behavior of the next step in some form or other. And so many of these domains have gradually come closer together. But there’s a very good reason why they didn’t intermingle too much. Intermingling is really complex. And if you don’t really need it, that’s okay. If you have physical parts that you optimize, and then you can describe them by their behavior or by their characteristics, don’t do a simulation. It’s way too slow compared to just using the data on the part itself.”

De Geus said that in the 1990s Synopsys looked at PCB design and decided against developing tools in that area because there was very little impact on chip design. “Once you come closer together, and it becomes smaller and smaller, the ‘in-between’ becomes a big consideration for what you do on the chip,” he said. “This is why multi-die is a seminal moment. Suddenly, when you have a chip and you put another neighbor in the apartment above a high rise, and the people below are cooking, the heat in the apartment goes up and the guys above feel it. Proximity only matters if it dynamically impacts what you do. If it’s static, all you need is an equation, not the details.”
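The apartment analogy reduces to a simple thermal resistance ladder. A back-of-envelope sketch with invented numbers shows why the stacked arrangement matters dynamically: the top die heats up when the bottom die is busy, even if its own power never changes.

```python
# Back-of-envelope version of the "apartment" analogy, with invented numbers:
# in a two-die stack, the bottom die's heat must exit through the top die's
# thermal path, so the die above warms up when the die below is "cooking."
T_AMBIENT = 25.0          # degC
R_TOP_TO_AMBIENT = 0.5    # K/W, top die to heatsink (illustrative)
R_BOTTOM_TO_TOP = 2.0     # K/W, die-to-die bond layer (illustrative)

def stack_temps(p_bottom, p_top):
    """Steady-state junction temperatures for a series thermal ladder."""
    # All heat exits upward, so both dies' power crosses the top resistance.
    t_top = T_AMBIENT + (p_bottom + p_top) * R_TOP_TO_AMBIENT
    # The bottom die additionally sees the die-to-die resistance.
    t_bottom = t_top + p_bottom * R_BOTTOM_TO_TOP
    return t_bottom, t_top

print(stack_temps(p_bottom=10.0, p_top=2.0))  # neighbor cooking: top die at 31 degC
print(stack_temps(p_bottom=0.0, p_top=2.0))   # neighbor idle: top die at 26 degC
```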

Wally Rhines, who as CEO of Mentor Graphics oversaw the company’s purchase by Siemens, agreed. “All the EDA companies want to do system-level verification,” he said. “It used to be that you could verify a chip, and that was tough enough, and so people didn’t do much multi-die verification. But we had decades of thinking that people were going to simulate whole printed circuit boards. It never happened, mostly because there weren’t models available, but also because you could get the product out without doing it. But if you look back at the analysis side of printed circuit board design over the last four or five years, the category of analysis tools has steadily risen, and it has been the principal driver of growth in that segment. Thermal analysis and EMI are two areas everyone talks a lot about in regard to system verification. And now, with multi-die packaging, all the major companies are providing tools to help simulate that kind of behavior.”

Alongside these changes, more devices, tools, and flows are connected to the internet and to each other, and development cycles are becoming shorter and more domain-specific. “There’s a tremendous amount of innovation happening, and the result of that is products are a lot more complex,” said Niels Faché, vice president and general manager for Keysight’s design and simulation portfolio. “When you think about smart devices, smart cities, smart automotive, smart defense, you name it, they’re all connected — and the requirements are more numerous and much more challenging than before. There are more bytes, smaller parts, and different technologies and materials. That means you cannot continue to develop products the same old way, where you might have relied more on physical prototypes and iterations. You really have to shift left and look at products in a virtual domain, as well as all the processes and workflows associated with them. You need a virtual representation, and that’s a mega trend we need to be aligned with.”

The federated approach
Integrating these different pieces will be challenging, but EDA is well positioned both to integrate its tools and methodologies into those of large systems companies and to take the lead on large simulations and analyses.

“We are talking to the automotive folks, to aerospace and avionics, and military to some extent,” said Martin Barnasconi, technical director for model-based systems engineering at NXP Semiconductors, and technical committee chair at Accellera. “All of these industries are running out of steam with their current approaches for how to integrate software, processors, hardware, and the physical domain, moving up from device level to component level to basically a full airplane or car or piece of military equipment. They invented standards in the last two decades, and in the semiconductor domain we have our own ecosystem with ACL (Access Control List) and ESL (Electronic System Level) standards. Somehow, we need to bring these worlds together to address systems solutions, and systems-of-systems solutions, to tackle these issues in a much more structured, top-down way. They all have their own standards, but they also see a challenge for how they are going to deal with software and processor content and connect their ecosystems to the cloud. Not everything will run on a single CPU in a single farm. The challenge is huge, but the opportunity is great.”

Fig. 2: Simulation technologies and standards in use today in different industry segments. Source: Accellera

Underlying these opportunities are vast improvements in compute power, which allow significantly higher fidelity in the simulations used in computational fluid dynamics. “It used to take weeks, if not months, to generate a suitable high-quality grid for a complex configuration,” said Parviz Moin, professor of mechanical engineering at Stanford University and director of the Center for Turbulence Research. “Imagine that you have a gas turbine engine and the combustor in it. There are holes and bolts and all kinds of complications. But now these quality mesh generations can be done in a matter of minutes, so you can run these calculations in a cost-effective way.”

In a recent presentation, Moin showed a slide of flames shooting out of the back of an engine, which at first glance looked amorphous. But once measurements were overlaid on those flames, it became apparent that with enough compute power they can be modeled, analyzed, and subsequently used to determine variations in thrust. These kinds of applications are why all of the big EDA companies are now heavily invested in cloud-based multi-physics simulation technology.

Fig. 3: Mapping differences in combustion for future simulations. Source: Stanford University/Parviz Moin

“As customer problems become increasingly more complicated, the size of these challenges requires simulation software that can scale to unprecedented levels,” said John Lee, general manager and vice president of the Ansys Electronics, Semiconductors, and Optics Business Unit, in a recent presentation. “In some cases, we are enabling customers to run transient simulations with 28 trillion calculated values.”

Enabling that requires advanced packaging, and advanced packaging, in turn, demands the same kind of multi-physics simulation on a much larger scale. Lee pointed to three key challenges for 3D-ICs — multi-physics, multi-scale, and multi-organization. “Multi-physics is the ability to accurately simulate multiple interlinked physical phenomena,” he said. “For example, circuit activity, power consumption, thermal conduction, and air cooling are all tightly interconnected and must be approached as concurrent multi-physics simulations. The latest silicon process technologies, along with the densities of 3D-ICs, have introduced novel physical challenges that chip designers have not dealt with in the past — for example, detailed thermal analysis and the thermo-mechanical stress and warpage of a 3D assembly. This is a significant reliability issue that did not exist with monolithic designs.”
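The coupling Lee describes is easy to see in miniature. In this hedged sketch (constants invented, not Ansys’ solver), leakage power rises with temperature while temperature rises with power, so the two physics must be relaxed together until they agree:

```python
# Illustrative sketch of why coupled physics must be iterated together:
# leakage power grows with temperature, and temperature grows with power,
# so neither can be solved in isolation. All constants are invented.
def leakage_power(temp_c, p_dynamic=5.0):
    """Total power in watts; leakage grows roughly exponentially with temperature."""
    return p_dynamic + 0.5 * 1.05 ** (temp_c - 25.0)

def die_temperature(power_w, t_ambient=25.0, r_theta=3.0):
    """Steady-state die temperature for a simple thermal resistance model."""
    return t_ambient + r_theta * power_w

# Gauss-Seidel-style relaxation until the two physics agree.
temp = 25.0
for i in range(100):
    power = leakage_power(temp)
    new_temp = die_temperature(power)
    if abs(new_temp - temp) < 1e-6:
        break
    temp = new_temp

print(f"converged after {i} iterations: {power:.2f} W at {temp:.1f} degC")
```

Production tools couple full 3-D thermal fields and airflow rather than two scalar equations, but the iterate-until-consistent structure is the point.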

Multi-scale is as much about the organizational structure of flows and people as it is about physical dimensions. “The advent of 3D-IC has introduced multi-skilled challenges as the lines between three traditionally distinct design functions have blurred,” Lee said. “Designers must now deal concurrently with device IP and chip design at the nanometer scale, interposer and package design at the millimeter scale, and system design at the centimeter scale and up. Delivering a 3D-IC design flow and simulation flow that spans this many orders of magnitude poses big challenges for both the quantity and quality of simulation results. Advanced mathematical techniques, such as reduced order models, AI, ML, and SigmaDVD, are needed to help manage the tremendous scale and volume of data involved. And beyond simple scale, the very nature of the physical challenges also is new. For example, thermal conduction tends to smooth out across small regions of the chip. But as we look across interposers, the temperature gradients can induce severe mechanical challenges. So we need to look at scale, and also quality and quantity.”
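Of the techniques Lee lists, reduced-order models are the most mechanical to illustrate. Below is a minimal sketch of one standard construction, proper orthogonal decomposition (a truncated SVD of solution snapshots), using synthetic data in place of real solver output:

```python
# A minimal sketch of one reduced-order-model technique: proper orthogonal
# decomposition (POD), i.e. a truncated SVD of field snapshots. The data
# here is synthetic; real inputs would come from a thermal or stress solver.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 10_000)  # 10k-node "thermal field"

# 200 snapshots that secretly live in a two-dimensional subspace.
snapshots = np.stack([
    a * np.sin(np.pi * x) + b * np.sin(2 * np.pi * x)
    for a, b in rng.normal(size=(200, 2))
])

# POD basis: singular vectors ordered by energy content.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
k = int(np.searchsorted(energy, 0.999)) + 1  # modes capturing 99.9% of energy
basis = Vt[:k]                               # shape (k, 10_000)

# A new field is now k coefficients instead of 10k node values.
field = 0.3 * np.sin(np.pi * x) - 0.7 * np.sin(2 * np.pi * x)
coeffs = basis @ field
reconstruction_error = np.linalg.norm(field - basis.T @ coeffs)
print(k, reconstruction_error)               # k == 2, error near zero
```

Once the basis is built, each new field is carried around as a handful of coefficients rather than a value per node, which is how simulation data at 3D-IC scale stays tractable.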

The bigger challenge, however, may be more organizational and business-related. “The ambition is to move anything from FMI (Functional Mock-up Interface) and hypervisor interfaces, which would be appropriate, all the way through to automated drive systems,” said Mark Burton, director of engineering at Qualcomm and vice chair of the Accellera PWG. “All of us are able to import simulations of different levels of abstraction, whether it be physical or computational components, and have those working together in a reasonable fashion.”

But to really make this work requires data sharing on a massive scale. In highly competitive markets like automotive, military, industrial, and aerospace, that data could be worth billions of dollars. “There are two different planes of activity,” Burton said. “There is the, ‘How do I connect things together, and who owns that connection and the means by which we are going to communicate?’ This is the leading edge of where we are with the working group. The view at the moment is that we don’t want to build yet another interconnection standard. What we want to do is identify the interface that is common to all of them. The other side of the coin is, ‘What data are you going to transmit?’ There’s the data itself. That’s one aspect. It’s like the music, rather than the physical record. But it’s also the constructs around that data. If I’m sending you a video frame, I need to specify what format that video is going to be in for you to understand how to receive and process it.”
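A toy sketch of Burton’s two planes of activity: a common stepping interface (how components connect) and a declared payload format (what crosses it). The classes and interface below are invented for illustration; a real federation would sit on something like FMI rather than this hand-rolled master.

```python
# Toy illustration of the two "planes" Burton describes: a shared stepping
# interface, and a declared payload format. All names are invented.
from dataclasses import dataclass
from typing import Protocol

@dataclass
class Frame:
    """The agreed payload format -- 'the music, rather than the physical record.'"""
    timestamp: float
    format: str
    payload: bytes

class Component(Protocol):
    def do_step(self, t: float, dt: float, inputs: list) -> list:
        ...

class Camera:
    def do_step(self, t, dt, inputs):
        # Emit a frame whose format is declared so the receiver can decode it.
        return [Frame(t, "video/raw-rgb24", b"\x00" * 64)]  # fake pixels

class Perception:
    def do_step(self, t, dt, inputs):
        for f in inputs:
            assert f.format == "video/raw-rgb24"  # the format contract
        return []

# Minimal co-simulation master: fixed step, relay outputs to the next stage.
def run(components, stop=0.1, dt=0.01):
    t, frames = 0.0, []
    while t < stop:
        for c in components:
            frames = c.do_step(t, dt, frames)
        t += dt

run([Camera(), Perception()])
```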

Digital twins, and beyond
One of the big focus areas across all of these market segments is digital twins, which the top EDA executives insist are nothing new when it comes to optimizing semiconductor designs and ensuring that any changes will work as expected. But the recent spate of acquisitions in multi-physics simulation, along with the build-up of expertise in mechanical engineering, machine learning, and in-circuit monitoring, points to a much broader push and greatly enhanced tool capabilities.

“There was a vision about seven years ago about how these pieces fit together,” said Mike Ellow, executive vice president of Siemens EDA. “The market is moving from a build-up of the levels of hierarchy — where you integrate your way up the stack, make compromises, but you can still get your systems completed — to where software is now differentiating more of the value for what a lot of these industries are creating. That can be a car, autonomous vehicles in aerospace and defense, tractors, heavy machinery, or even medical. But the semiconductor is central to how that whole thing evolves. And one of the interesting things in the semiconductor industry is we can no longer deliver the silicon and wash our hands of it. You make changes in the software and really optimize your platform, and then the silicon has to match it better than it did in the past, when there were standard hardware platforms and compromises were made on the software. Now, software is the differentiator, so it’s the other way around.”

When Siemens acquired Mentor Graphics, it already had product lifecycle management (PLM), mechanical computer-aided design (MCAD), and computer-aided engineering (CAE) tools, but it lacked the rest of the design flow and simulation. Today, all of the large EDA players are in active acquisition mode as they position themselves for the next big shift. And that shift includes much larger models and simulations, often involving the cloud, where any change can automatically adjust other parts of the design, saving huge amounts of time and improving the efficiency and performance of the design itself.

“There’s no such thing as a single digital twin of your vehicle where you can see how the battery ages, how it deals with certain weather conditions, and how it behaves when the car drives through water,” said Keysight’s Faché. “What happens if it crashes? Can all of that be captured in that digital twin and be applied to a variety of conditions? You can’t imagine there is such a digital twin. But you could have a digital twin that helps you predict how your battery is going to age based on how you’re driving. And you can have a digital twin of the car that tells you how it’s going to behave in a collision. These are going to be very context-dependent for how you are exercising that physical system, and you will have a representation of that.”
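A hedged sketch of the narrow, context-dependent twin Faché describes: an empirical battery capacity-fade model driven by how the car is actually used. The functional form is a common textbook choice, and every coefficient is invented.

```python
# Illustrative only: a narrow digital twin that predicts battery capacity
# fade from logged usage, not a universal model of the whole vehicle.
import math

def capacity_after(cycles: float, avg_temp_c: float, avg_dod: float) -> float:
    """Remaining capacity fraction from an empirical square-root fade law.

    cycles     -- equivalent full charge/discharge cycles logged by the car
    avg_temp_c -- average cell temperature (fade accelerates when hot)
    avg_dod    -- average depth of discharge, 0..1
    """
    # Arrhenius-style temperature acceleration (coefficients are made up).
    accel = math.exp(0.05 * (avg_temp_c - 25.0))
    fade = 0.002 * avg_dod * accel * math.sqrt(cycles)
    return max(0.0, 1.0 - fade)

# Same pack, two owners: gentle commuting vs. hot-climate deep cycling.
print(capacity_after(cycles=500, avg_temp_c=20.0, avg_dod=0.4))  # ~0.99
print(capacity_after(cycles=500, avg_temp_c=40.0, avg_dod=0.9))  # ~0.91
```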

What’s the rush?
This shift is underway already, but it is accelerating and widening. More industries are using virtual design, and AI/ML is making all of this happen faster and more efficiently.

“We’re really at the beginning here in terms of how software-defined systems are going to change everything,” said Ravi Subramanian, general manager of Synopsys’ Systems Design Group. “The world’s GDP is $101 trillion. That’s the value of all goods and services provided in the world. Right now, about $35 trillion worth of products are being designed, but that hasn’t become physical yet. And more and more products are starting out in design virtually. So how much of those goods will become electronically powered or software-driven?”

The answer may be difficult to pin down because of the speed at which all of this is changing, Subramanian said. “Because there are so many moving pieces, we have to be deliberate in saying, ‘Okay, what is it that we will do to make sure we provide the fabric to bring the other domains in and then be fluent about use cases?’ In automotive, OEMs are hiring thousands of software engineers, but they’re also seeing that it’s a complete culture shift. There are people who are saying, ‘You can’t touch software,’ because they’re coming from a classical car mentality. But with software growing so rapidly, we have many cases where customers are saying, ‘We’re overwhelmed.’ It’s a tremendous challenge, and that’s even before all the updates and all the permutations.”

Conclusion
How quickly different industry and market segments digitalize their operations and shift their focus, both technologically and organizationally, will vary greatly from one company to the next. But with AI/ML lighting a fire under many companies, and with growing global competition pushing them to integrate and connect more pieces of their systems to improve performance, reduce power, and lower overall system cost, all indicators point to the need for better tooling and methodologies, and to some rapid, fundamental shifts in EDA as that industry segment rushes to capitalize on the opportunity.

To be sure, this is a massive change with a lot of uncertainty. It requires faster tool development, more flexibility, more integration of siloed engineering flows, and better sharing of information across an extended supply chain. But if it’s executed successfully, this could prove to be the biggest bonanza the EDA industry has ever seen.

Related Reading
Chip Industry Silos Are Crimping Advances
Development teams constantly wrestle with new technologies and tools, but often it’s the associated corporate structures that cause the greatest challenges.
Integrating Digital Twins In Semiconductor Operations
The industry must collaborate to develop a common understanding of digital twin technology across various hierarchical levels.


