Will AI Disrupt EDA?

It’s been decades since there was a disruption within EDA, but AI could change the semiconductor development flow and force changes in chip design.


Generative AI has disrupted search and is transforming the computing landscape, and now it is threatening to disrupt EDA. But despite the buzz and the broad pronouncements of radical changes ahead, it remains unclear where it will have an impact and how deep any changes will be.

EDA has two primary roles — automation and optimization. Many of the optimization problems are NP-hard, which means optimal solutions generally cannot be found in polynomial time, especially as design sizes grow. Over time, heuristics have been developed that deliver “good enough” results in a reasonable amount of time. And while it’s conceivable that AI can deliver something comparable, or even closer to the optimum, it could turn out to be more evolutionary than disruptive for design.
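A toy example helps make that “good enough in reasonable time” tradeoff concrete. The sketch below is purely illustrative and not drawn from any commercial tool: it uses simulated annealing, one of the classic heuristics in physical design, to reduce total wirelength in a made-up one-dimensional cell placement. All names, the cost model, and the parameters are assumptions chosen for brevity.

```python
import math
import random

# Toy placement: assign N cells to N slots in a single row, minimizing total
# wirelength over a list of two-pin nets. Finding the true optimum is a hard
# combinatorial problem; simulated annealing settles for "good enough."

def wirelength(placement, nets):
    # placement[cell] -> slot index; each net is a (cell_a, cell_b) pair
    return sum(abs(placement[a] - placement[b]) for a, b in nets)

def anneal(num_cells, nets, steps=20000, t_start=10.0, t_end=0.01):
    placement = list(range(num_cells))           # start from an arbitrary order
    random.shuffle(placement)
    cost = wirelength(placement, nets)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)    # geometric cooling
        i, j = random.sample(range(num_cells), 2)
        placement[i], placement[j] = placement[j], placement[i]   # swap two cells
        new_cost = wirelength(placement, nets)
        # Always accept improvements; accept some uphill moves while t is high
        if new_cost <= cost or random.random() < math.exp((cost - new_cost) / t):
            cost = new_cost
        else:
            placement[i], placement[j] = placement[j], placement[i]  # undo swap
    return placement, cost

if __name__ == "__main__":
    random.seed(0)
    cells = 50
    nets = [(random.randrange(cells), random.randrange(cells)) for _ in range(120)]
    _, final_cost = anneal(cells, nets)
    print("heuristic wirelength:", final_cost)
```

The result is not guaranteed to be optimal, but it arrives in seconds rather than in the exponential time an exhaustive search would require, which is the bargain production tools have been making for decades.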

Disruptive innovation normally reshapes a market. A hypothetical question might be, “If EDA could deliver the optimum result in zero time, how would the semiconductor industry be impacted?” Time-to-market would accelerate, and designs would have slightly better PPA. But whether that would be enough to cause a significant increase in the number of design starts or lead to new markets being developed is unclear.

Under those hypothetical conditions, design creation and verification would still be limiters. Generative AI may be able to improve on that, and there are encouraging signs it can. If design and verification time are significantly reduced, that almost certainly would create new markets.

EDA has been disrupted in the past, and the problem with disruptions is that they are often not obvious until they have happened. “In some cases, people know a disruption is coming, like Kodak knew about digital photography, but they just could not bring it to market,” says Prith Banerjee, CTO at Ansys. “There are three types of innovation horizons. Horizon one innovation is shorter term. What features should the next version of the tool have? We know what features because they exist in the market. You are selling to the market, you’re watching competition — 70% to 80% of investment within large companies is in horizon one.”

“Horizon two” involves adjacencies. “For example, you are selling a product designed to be on-premise and want to go to the cloud,” adds Banerjee. “Innovation is needed, but we will figure it out and we will be successful.”

A number of compute-based disruptions fall into this horizon two category. “Computers used to have very small memory, and then you started to have larger memory,” says James Scapa, founder and CEO of Altair. “We changed how one of our tools worked, and that innovation was disruptive in that market. Essentially, we brought all of the models into memory. That change meant we were about 30 times faster than our competitors. A similar change happened with HPC. The business model associated with cloud computing is going to be one of the big changes that comes into the EDA world. And the business models that go along with that are going to be somewhat disruptive. It is important to appreciate computing developments, understand where computing is going, and know how to leverage it.”
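The in-memory change Scapa describes is an architectural pattern rather than an algorithm. A minimal, hypothetical sketch of the idea follows: load every model file from disk once at startup and serve all later queries from RAM instead of re-reading files per query. The class name, paths, and JSON schema here are invented for illustration and are not Altair's implementation.

```python
import json
from pathlib import Path

# Hypothetical illustration of the in-memory pattern: one pass over disk at
# startup, then every lookup is a dictionary access in RAM rather than a
# repeated file read. Directory layout and schema are made up.

class ModelStore:
    def __init__(self, model_dir):
        self._models = {
            p.stem: json.loads(p.read_text())
            for p in Path(model_dir).glob("*.json")
        }

    def lookup(self, name):
        # Returns the cached model, or None if it was never loaded.
        return self._models.get(name)

# Usage (assuming a models/ directory of JSON files exists):
#   store = ModelStore("models/")
#   cell = store.lookup("nand2")
```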

Another transformation of this type is still in progress. “Think about parallelism,” says Jan Rabaey, professor emeritus in the graduate school at the EECS department at UC Berkeley and CTO of the Systems Technology Co-Optimization Division at imec. “People used to say that parallel computing is a really bad idea because we don’t know how to compile into it. Instead, we should take a single processor and make it as fast as possible. Then, power issues popped up and we cannot make them go faster anymore. So suddenly, parallelism was a good idea, and that was a disruption.”

The remaining 10% of investment goes into the “horizon three” type of innovation. “This is not part of your current R&D, and not targeted toward an existing market,” says Ansys’ Banerjee. “A classic example is when Apple came up with the iPhone. That was disruptive. Amazon came up with AWS, their web services. That was disruptive. How does a large company do disruptive innovation, because it is not an accident? It takes a process, and you need to tap into the places that make innovation happen. It’s in academia, it’s in the startups. You should constantly monitor what’s going on with startups, and then have a central R&D team to try to invent some of the things yourself. But that central team doesn’t have to invent everything. Part of it you do organically, part of it is bringing technology into your company.”

Looking back, we can see disruptions that have happened within EDA. “If I go back to the 1980s, we saw a set of ideas that were born initially both from academia and startups that changed the way we did design,” says UC Berkeley’s Rabaey. “EDA started driving design using standard cells. When you first looked at it, it appeared to be a really bad idea. It is very constrained. You put the cells in rows and things like that. But it made automation possible. That basically led to logic synthesis, where we can start thinking about logical functions, optimize them, have a set of tools that help us to translate a high-level description into something, and it is automated. We take this for granted today. There are other areas — simulation, verification, behavioral synthesis — these are areas that ultimately created some form of disruption.”

Over the last 20 years there has been almost no disruption within EDA, because the industry has largely been on a linear path. But that is changing rapidly as Moore’s Law slows and designs shift from planar, monolithic chips to multiple chiplets stacked in a package.

“Disruptive changes are easier when the status quo is bad,” says Chuck Alpert, R&D automotive fellow at Cadence. “Consider design teams. They may know something is going wrong. Perhaps the engineering budget is out of control, or they’re trying to do a new design and don’t have the engineering skills. They must do something disruptive. Today, we see a design complexity explosion. There’s a lack of scaling. There are things that design teams can run into that force innovation. These are cases where the status quo is bad, or on the way down. For EDA companies, it might happen when you are not the market leader. You are behind and have to do something disruptive to catch up. Or maybe you have been the market leader, but the code base is written in COBOL and nobody knows it anymore. You are going to have to make a change because the trend is going to decline, and you are in an innovate-or-die situation.”

Opportunities for innovation are out there, especially in an innovative culture. “The emergence of AI and large language models is capable of a lot of transformation, as is cloud computing for rapid scaling,” says Altair’s Scapa. “Business models — not just technology — are part of how you disrupt. It is really hard for startups in EDA because of the excessive dominance of just two companies. They have been acquiring and eliminating startups and competition for a very long time. This has impeded innovation.”

By looking into the future, some pressures can be identified and dealt with. “What is the disruptive cycle?” asks Rabaey. “There are a number of them on the horizon. The nice thing about roadmaps is you can identify problems that might arise 10 years into the future. That’s where academics are good — looking at those roadmaps and identifying a new paradigm that can arise as a result. For example, scaling is going to last for another 5, maybe 10 years. What do we do about it? Disruption is something you don’t choose. The only time to take the disruption path is when you hit a wall, when you suddenly figure out, ‘I cannot go forward any longer.’ We have to rethink the way we do design. One possibility is to start thinking about the third dimension, where you are layering different technologies on top of each other. The easy way forward is to map the old architecture onto it. But that’s not going to give you a lot of gain. You have to rethink how you use this.”

Sometimes change is forced from the outside. “Design is moving from chips to systems,” says Banerjee. “If the goal is to design an EV, my requirement is not just an RTL input. My design requirement is an electric vehicle, which will go from 0 to 60 in one second with a range of 500 miles, and it has to be level five. Those are my requirements. The EDA industry is focused on designing chips. You have to design the power electronics, which is a power electronics simulation, combined with a battery, combined with the motor design, and then the aerodynamics for the workload. It’s a multi-physics world out there, which is incredibly complicated. Then you have the software, which has to be written and automatically compiled from the system-level specification and then verified.”

AI within EDA
EDA companies have been quick to adopt some forms of AI within their tools. “Reinforcement learning is being used to solve optimization problems,” says Stelios Diamantidis, distinguished architect of artificial intelligence at Synopsys. “People now use reinforcement learning to experiment, collect data, build better metrics to drive optimization, and automate those optimizations, as well. The technology itself can be applied to additional problems. We started out with optimization of physical layouts and floor plans, clocks in some topologies, DTCO, and other physical kinds of applications. We have since applied the principle to problems like verification, where reordering tests or changing seeds can help you accelerate coverage or pursue bugs, and to test, where reordering vectors can help you achieve manufacturing test coverage faster.”
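Diamantidis is describing reinforcement learning inside production tools, which involves far more machinery than fits in a few lines. As a simplified stand-in for the reordering idea, the sketch below uses a plain greedy heuristic (pick next the test that adds the most not-yet-covered points) to show why changing test order can reach a coverage target with fewer runs. The test names and coverage sets are invented, and this is not how any commercial tool is implemented.

```python
# Greedy coverage-driven test reordering: a minimal sketch, not RL and not a
# real tool flow, just an illustration of why order matters for coverage.

def reorder_by_marginal_coverage(tests):
    """tests: dict mapping test name -> set of coverage points it hits."""
    remaining = dict(tests)
    covered, order = set(), []
    while remaining:
        # Choose the test that adds the most not-yet-covered points.
        best = max(remaining, key=lambda t: len(remaining[t] - covered))
        if not remaining[best] - covered:
            order.extend(remaining)        # leftover tests add nothing new
            break
        covered |= remaining.pop(best)
        order.append(best)
    return order, covered

if __name__ == "__main__":
    suite = {
        "t_smoke":   {1, 2, 3},
        "t_alu":     {3, 4, 5, 6},
        "t_fpu":     {6, 7},
        "t_regress": {1, 2},
    }
    order, covered = reorder_by_marginal_coverage(suite)
    print(order)   # ['t_alu', 't_smoke', 't_fpu', 't_regress']
```

Running the highest-value tests first means the coverage target is often reached before the tail of redundant tests ever executes, which is the same economic argument, in miniature, that the production RL-based flows are making.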

But AI is unlikely to replace existing EDA tools. “I contend that we have good EDA products that are in use by our customers, so the status quo is good,” says Alpert. “If we decide to make a new product with AI, we are going to pay a huge price. Maybe we would get some benefit in the long term. If we take the entire product teams and say, let’s start over and build something new, it’s going to be very painful. Eventually, you might get there, but you’re going to pay a huge cost in the meantime.”

The EDA industry is all about maintaining continuity, making sure it delivers to its customers the tools needed to get their next product out. “We have to protect our $2 billion business,” says Banerjee. “A startup starts with nothing. But it remains difficult for customers to embrace new technologies to solve their problems. That is the challenge, not just for EDA, but in general across industries, which is why I’m seeing horizon three — the vision, working with startups, and then acquiring the startup that has proven those kinds of technologies.”

Alpert agrees. “Disruptive technologies are hard for almost any industry, not just EDA. They can put some resources on it, but not too much. Or they can wait for somebody else to be innovative and buy it. That’s another strategy.”

But where have the startups gone? “In the last 10 or 20 years, the ecosystem that existed has collapsed,” says Rabaey. “There was a time when you had a vibrant research space in EDA. All of the top universities were working on tools. You won’t find them anymore. They don’t exist. Perhaps you have a great idea, and academics can publish the paper, but they don’t build that product. The role of startups was really important for that, and in the ’90s it was a vibrant world. It was all these small operations that came up with ideas and tried them out. That has collapsed, as well. But the ecosystem might arise again.”

GenAI’s impact
Investments are pouring into GenAI, but in EDA there is far less activity. “GenAI is real, and is going to deliver real results for us,” says Scapa. “But there’s a lot of hype, and the amount of investment is out of alignment with the return that we’re seeing today. There’s going to be a fall-off, and then a typical slow rise, because GenAI is the real deal. We are doing interesting things with traditional machine learning, as well, which also has significant potential.”

But the real potential of GenAI in EDA appears to be somewhat tangential. “EDA doesn’t create design,” says Rabaey. “But it is driven by design considerations. AI will become a disruptive part of the design process. AI will become a design tool that will help us to explore the big space of options that are out there.”

The second wave of generative AI is addressing the automation problem. “Pretty specifically, some of the key industrial challenges,” says Synopsys’ Diamantidis. “This has more to do with economics, geopolitical pressures, availability of talent, and the ability to do more with less. In the second wave, we are able to take data or design environments. We are able to train models at very large scales with this data. And then we are able to contextualize them toward different tasks that are specific to designer activities. We are certainly solving the human-computer interface problem. We can now explore immense complexity.”

Perhaps the biggest ROI for GenAI is productivity. “One of the things that we work on is coaching people through the development process, helping them to improve their problem solving by leveraging generative AI,” says Erik Berg, senior principal engineer at Microsoft. “Where does that data come from? I believe that the richest source of data that we have available is in our engineers’ heads. The tools that I’m building are not only providing solutions to our engineers, but also scraping results and other data from their heads at the same time.”

This is being seen in many parts of the design community. “GenAI can definitely help non-expert users get better,” says Vidya Chhabria, assistant professor at Arizona State University. “It can help non-expert users ask the right kind of questions — more intellectual questions. It can help a non-expert user get up to speed with new designs and new EDA tools. And perhaps it can help expert users become more productive or be able to work faster.”

But will any of this cause a disruption? “Despite all these technologies, it still takes like four years to get the chip into a slot,” says Diamantidis. “I’m talking about gathering the requirements, architectural exploration into design entry, verification, test insertion, preparing instrumentation for both silicon diagnostics and data mining — the whole enchilada. It takes a lot of people, a lot of money, and a lot of time, which means that it’s not really changing the fundamentals or the economics of the semiconductor space.”

Conclusion
Disruption is hard to predict and often not recognized until after it has happened. Many people are watching the progression of technology, the changing design practices, and the shifting landscape from chips to systems, and nearly everyone believes that all forms of AI are likely to be useful in solving those problems. But looking at the landscape today, nothing yet appears to be truly disruptive.

Editor’s Note: All comments were extracted from two panels held at the recent Design Automation Conference: “Why Is EDA Playing Catchup to Disruptive Technologies Like AI?” and “Generative AI for Chip Design – Game Changer or Damp Squib?”

Further Reading
Verification Tools Straining To Keep Up
Widening gap in the verification flow will require improvements in tools. AI may play a bigger role.

 


