Managing EDA’s Rapid Growth Expectations

EDA is growing quickly, fueled by many changes in the chip industry. But can it keep up and continue to satisfy the needs of all its customers?

The EDA industry has been doing very well recently, but how long this run will continue is a matter of debate. EDA is an industry ripe for disruption due to rapid changes in chip architectures, end markets, and a long list of new technologies. In addition, recent geopolitical tensions are bringing a lot more attention to this small sector upon which the whole semiconductor industry rests.

Despite those changes, EDA companies have managed to stay remarkably relevant amid this swirl. Over the past two years, the semiconductor industry, as measured by the SOX index, has remained relatively flat. But at the same time, public EDA companies’ stock prices have risen about 60%, and large conglomerates that have EDA divisions claim their EDA growth rate is higher than the market.

“The pace of change in semiconductors has always kept us a young industry,” says Neil Hand, IC product marketing director at Siemens EDA. “One of the reasons EDA really is healthy is that we are able to adapt to the massive amount of change in the marketplace. We’ve always been forced to adapt to change. Some industries get complacent. They get stagnant, and they start to be slow to adapt. I don’t think you can say that about EDA.”

One of the large questions hanging over the industry, though, is whether EDA can adapt fast enough to meet all of the demands being placed on it, or whether it is being stretched to the point where new companies can make inroads. This is largely a function of the amount and pace of change across the chip industry, including AI in the tooling suites, the increasing importance of hardware/software co-design solutions, and much more. At the same time, system companies are going through a number of mega trends, such as domain-specific computation, autonomous driving, and increased attention on security, among others.

“Beginning at the RTL level, most of the necessary tools are available and doing their job very well,” says Andy Heinig, head of department for efficient electronics at Fraunhofer IIS’ Engineering of Adaptive Systems Division. “Some gaps do exist with system-level tools. But the whole market is saturated, and nobody wants to spend enormous budgets on the development of new approaches. Only really new concepts and approaches will allow the gap in system-level tools to be closed in the future.”

But all aspects of the flow require continuous investment. “Our core algorithms are improving, and with every new technology node we have to make some changes in the way we optimize designs,” says Vinay Patwardhan, product management group director in the Digital & Signoff Group at Cadence. “That is because of new constraints or new effects coming in with each node. On top of that, we have to do the system-level co-design and co-optimization. The same optimization algorithms, which were previously working to get the best post route results, have to look past the chip and take into account both the package effects and the system-level effects. Similarly, system simulations have to take in the die-level effects. We have to expand all EDA algorithms to become more aware of not just the localized effects, but the global spread of everything it touches.”
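Patwardhan's point can be sketched abstractly: an objective function that once scored only die-level results now has to fold in package- and system-level effects. The following is a minimal, hypothetical illustration; the function names, parameters, and weights are invented for this sketch and do not come from any real tool.

```python
# Hypothetical sketch: a die-only optimization objective extended with
# package- and system-level penalty terms. All names and weights are
# illustrative, not from any actual EDA tool.

def die_cost(wirelength, timing_slack):
    # Traditional die-only objective: minimize wirelength and penalize
    # negative slack.
    return wirelength + 10.0 * max(0.0, -timing_slack)

def system_aware_cost(wirelength, timing_slack,
                      package_ir_drop, bump_noise):
    # Same die-level objective, plus penalties for effects that used to
    # be the packaging team's problem.
    return (die_cost(wirelength, timing_slack)
            + 2.0 * package_ir_drop
            + 5.0 * bump_noise)

baseline = die_cost(1200.0, -0.05)
extended = system_aware_cost(1200.0, -0.05, 1.5, 0.8)
```

The design choice mirrors the quote: the die-level term is unchanged, so existing optimization machinery still works, but the total cost now "looks past the chip."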

Industry evolution
In the early days of EDA, there was a huge amount of research conducted in major universities around the world, and hundreds of startup companies. It then went through a massive consolidation phase. “There always have been these two streams of providers within EDA, the point tool, best of breed, and the full solution provider,” says Marc Swinnen, director of product marketing at Ansys. “In the ’90s, Cadence was famous for providing the full flow and would buy point tools from startups to augment their technology, and those were continuously refreshed with new startups. That model has gone by the wayside since there are very few startups in EDA today. The full-flow provider became quite popular in the early 2000s, and that proved to be very successful for them. It provided a single point of contact and that sounded attractive, especially to the purchasing agent.”

Economic and technology moats have made it difficult for startups. “Many of the startups in the ’80s were router startups,” says Rob Aitken, a Synopsys fellow. “If you want to write a router today, it’s really hard. The design rules are so complicated, and access to them is so restricted, that you can’t just put a couple of grad students in a room and have them conjure something up. The areas where there’s freedom for a small group of creative people to innovate are small. You can’t get five grad students together and have them write something that competes with a Cadence or a Synopsys tool. There’s not that much room for them to play in the EDA space, because a lot of that has already been defined.”

While research continues, it has evolved. “I can use the number of submissions to DAC as a guide,” says Joerg Henkel, chair for embedded systems at Karlsruhe Institute of Technology and general chair for DAC 2023. “I have been on the technical committee for many years and seen a steady increase in research papers. This is mainly driven by certain topics like security, designing secure systems or figuring out issues with side-channels and IP. We also see a lot of papers about architectures for machine learning. There are new technologies, such as non-volatile memory technologies that have lots of papers. Design automation is still one of the two biggest channels, but it is more about new technologies, like machine learning technologies, that have made their way into traditional design methodologies, and that is something of a renaissance. We have even had an increase in high-level synthesis papers.”

Still, the recent geopolitical situation may provide an incentive for change. “That has been less R and more D over the past few years,” says John Park, product management group director in the Custom IC & PCB Group at Cadence. “But with the CHIPS Act, this is billions of dollars of funding, and it’s creating a number of startups in packaging and chiplet development, and in on-shoring OSAT capabilities. Programs that are DARPA-funded tend to have academia in almost everything. All the big universities are investing a lot more in innovation for 3D, and 3D heterogeneous integration. As a package designer, I have never seen this level of innovation and investment and research going on at any point in history. While a lot is going to the semiconductor side, I estimate about 18% to 20% will go to multi-die packaging.”

Similar kinds of investment are happening in other places as well. “What the CHIPS Act, and the European equivalent, does is create an incentive for there to be more research, and that opens up more avenues,” says Siemens’ Hand. “The nature of research is that the more things that are going on, the more likely there will be a breakthrough discovery. If you’re only looking at two or three things, it’s hard to have a breakthrough. If you’re looking at 100 things, someone is going to make the breakthrough. That becomes really interesting, and it’s exciting that you’re starting to see a decentralization of the R&D efforts, which is going to increase the probability that we’ll start to see more breakthroughs.”

Could it lead to an industry disruption? “Maybe, but if any industry gets set enough in its ways of doing things, it becomes harder and harder to disrupt,” says Synopsys’ Aitken. “But the disruption, when it shows up, is more and more disruptive. Essentially, somebody could disrupt the industry by rethinking a problem and saying, ‘We don’t actually have to do it this way.’”

The general consensus is this is unlikely in the near future. “Research still happens at an algorithm level, or at the device physics level,” says Cadence’s Patwardhan. “I don’t see something revolutionary that somebody is doing to completely change how things are done today. It’s more incremental to what existing methodologies are, and they’re adding pieces of it, adding to it to make it stronger, faster. But a whole new revolutionary way to do things? I’ve seen less of that.”

Some believe that AI may make it happen. “When you look at DAC since AI papers started, AI is going through all kinds of topics,” says Henkel. “There is a large percentage of papers across different topics that have AI in them. So yes, researchers are learning it, how to apply it, how to make it beneficial. We were surprised at the maturity of the papers from physical design. It now seems to be an obvious thing to apply machine learning to physical design.”

Convergence
There is a convergence happening in the industry that is bringing what used to be separate domains together. “With the advent of 3D, we’re seeing a major sea change in the organization of the EDA industry, because 3D-IC has thrown the door open to so many additional physical effects that were siloed before,” says Ansys’ Swinnen. “You had a chip team, you had a packaging team, you had a PCB team, and they could be in different countries. They never had to talk to each other. But now with 3D-IC, all this has to be scrunched together. And no single company has all the technology you need to do a 3D-IC.”

This is echoed by others in the industry. “As you move into 3D and the multi-die space, you wind up having to consider more and more effects,” says Aitken. “You go into more and more domains, and you get to the point where, no matter where you’re at, no matter how broad your company’s tool suite, there’s something right on the boundary of it so that you’re still going to have to work with somebody else.”

This is the convergence of system and chip design. “It means design flows, and design methodologies, are much more advanced, sophisticated, complex than they were in the past,” says Park. “Where package designers never had to worry about doing formal signoff of DRC and LVS, now they have to. That is the systems world entering the die design world. Die designers, the designers of monolithic ASICs, never knew what signal integrity was. Now they are having to validate, in a multi-chiplet system, if they are compliant with the communication standard, and that requires signal integrity tools. There are lots of those types of examples, but what it means is this convergence of expertise and tools from the systems world and the ASIC design world coming together. The number of tools grows, the expertise grows, people are trying to figure out who does what in these types of design flows.”

3D-IC is not the only point of convergence. “An increasing number of things come into consideration for optimization,” says Hand. “To manage that you start to deal with layers of abstraction. When you have layers of abstractions, it allows localized decisions to be made, and be informed. You can do system-level design that has a rich set of data, and you’re making informed decisions, and you can do incredibly detailed physical-level design. They are still separate activities, even though they are linked and interconnected. What you’re going to see is not a move toward a single monolithic entity, for a couple of reasons. One is it doesn’t work. No one person can do everything. History has shown that in many ways. You do need to have connected flows.”

Exactly what information has to flow to make that work has not yet been decided. “There have always been compute-based solutions, or scenario-based solutions, but with AI techniques we have an opportunity to build a model,” says Patwardhan. “To use the model effectively, we have the opportunity to reference the model multiple times and take into consideration a whole lot of parameters at the same time, using physical as well as logical parameters, which was not possible before. There is an incredible opportunity for AI to play in this space, just given the amount of data, the number of variables, and the complexity of the problem. There is work going on in that direction, and over the next year or so there will be a lot of ideas discussed in public and private forums.”
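The surrogate-model idea Patwardhan describes, training a cheap model once on expensive simulation results and then querying it many times across many parameters, can be sketched in a few lines. Everything here is a stand-in: the "simulator," its parameters, and the nearest-neighbor lookup are illustrative choices for the sketch, not any vendor's actual approach.

```python
# Hypothetical surrogate-model sketch: run the expensive simulation on a
# coarse grid once, then answer thousands of queries from the cached
# samples instead of re-simulating each candidate.
import itertools

def expensive_sim(wire_width, buffer_count):
    # Stand-in for a slow physical/logical co-simulation; the cost
    # surface is invented for illustration.
    return 3.0 * wire_width + 0.5 * buffer_count + 1.0

# Build the model once from 121 simulation runs on an 11x11 grid.
samples = [(w, b, expensive_sim(w, b))
           for w, b in itertools.product(range(11), range(11))]

def surrogate(wire_width, buffer_count):
    # Cheap nearest-neighbor lookup, queried many times in place of
    # the simulator.
    w, b, cost = min(samples,
                     key=lambda s: (s[0] - wire_width) ** 2
                                   + (s[1] - buffer_count) ** 2)
    return cost

# Sweep 2,500 candidate points using only the original 121 sim runs.
best = min(((w / 10, b / 10) for w in range(50) for b in range(50)),
           key=lambda p: surrogate(*p))
```

The trade-off this illustrates is the one in the quote: once the model exists, referencing it is cheap enough to explore far more of the parameter space than direct simulation would allow.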

The impact of some types of convergence is not yet fully understood. “We know we’ll have to care about thermal, and we know we have to care to some extent about mechanical,” says Aitken. “How much do we have to care about mechanical? How will that change with different materials? We are moving into an era of cooperation, because people have to get chips out the door, and they have to combine tools from conventional EDA vendors with tools from people who historically haven’t done EDA.”

But this presents some challenges. “Data privacy is still on top of everyone’s mind,” says Patwardhan. “That means our customer’s mind, as well as ours. But there are more trusted ways to have collaborations today, both in technology terms as well as personal trust-wise. They’ve had longer relationships with some of these customers. The trust is building. We have more secure systems, so access to data and data privacy will be there, but we have found effective ways and solutions to work with our partners.”

To make this work, abstractions are the natural means of connection. “No one tool can do everything, no one company can do everything,” says Hand. “What you see, as more and more complexity creeps into the flows, is we have to find ways to deal with that complexity, to allow ways of abstracting that complexity, for people to work with it. At the same time, we have to allow that complexity to be managed in a localized fashion. When you look at system-level design, you need to consider the physical domain, the fluid dynamics, aerodynamics, electronics. You are never simulating the whole thing. You’re simulating abstractions of those objects. You’re doing design at an abstract level, and then pushing down and decomposing to get to the implementation itself.”

This impacts both the companies in the EDA industry and their customers and partners. “Not that long ago, our largest customers had CAD design teams, and they were responsible for creating streamlined flows utilizing various EDA tool vendors,” says Park. “They made a flow and stitched together best-in-class tools to create optimized flows. And we’re going back to that. We’re going back to companies hiring experts in design methodologies to figure out how to build these flows, because that’s a competitive advantage in these complex times. If you’ve got a more optimized design flow than your competitor, you’re going to have shorter turnaround time. You’re probably going to get to market more quickly, probably with a better product. There’s more value placed in design flows than perhaps at any time in history, because of the complexity.”

But that is not the case for all companies. “We also see a lot of companies looking not for the best-in-class tool, but the most integrated tool,” adds Park. “That’s another paradigm shift for some people. Instead of going out and finding the best-in-class point tool from five different vendors, and figuring out how to stitch them together, they’re going toward more of a single-vendor flow where three of the five tools are best-in-class, but two of them aren’t. But they’re so tightly integrated with the flow that there’s an advantage with going with that particular methodology.”

There are a lot of things changing. “We know that the model has broken down to some extent,” says Aitken. “What we don’t know, is how much it’s broken down, and where the boundaries of the future, or the near future, are going to be. There’s an opportunity for some level of abstraction of the functionality of a design. There is an additional set of abstractions that we need to conjure up for the rectangle of GDS in the future. Everybody knew that there were packages, and that packages had power delivery issues and heat removal, and so on. But it didn’t matter so much because the abstraction level of a die was good enough. That is no longer true.”

“As an industry, we are spoiled for choice at the moment,” says Hand when discussing where EDA is investing. “We are going to be pressed a lot more into driving the third letter of our acronym, which is automation. We have to try to close some of the resource gaps that our customers are going to be seeing in the coming years.”

Conclusion
The EDA industry is healthy and growing. Macro trends, both within the industry and within its customers’ industries, mean that the growth is likely to continue for a significant period of time. But the amount and pace of that growth is creating some instability within the industry, and that may cause companies to react in different ways than they have in the past.


