Stuck In A Rut

Has EDA failed to innovate, or has the semiconductor industry shown an unwillingness to adopt new tools and methodologies? As always, the answer is somewhere in between.


In the DVCon panel session on open-source verification, the first part of which has been published along with this blog, you will read a fiery debate between the panelists about the EDA industry's ability to innovate. On one side is the accusation that there has been no real innovation since 1988. On the other, the claim that fantastic advances have been made that enable billion-gate chips to be designed, verified, and manufactured.

In reality, both views are correct, and the fault does not necessarily lie with the EDA companies alone. They, like the companies they serve, are risk-averse. Semiconductor companies change as little as possible between designs. Incremental design lets them concentrate on the pieces that are new or different, fairly confident that the pieces being re-used will not present too many challenges. New tools and methodologies also present risk, so those change very slowly as well.

Startups are not so constrained. Their role is to shake things up: to try completely different designs or architectures, and perhaps even new tools or methodologies where they exist. The hope is that if they get it right, the gains will be larger than if they had played it safe. Many of them fail. Because they do not have a legacy to build upon, this is the only way they can get a jump on the large, established companies. Even so, few of them innovate by using new EDA tools or methodologies.

The same is true within EDA companies, which provide the tools and flows their customers tell them they want, and which solve new problems only when they have no choice. While I was employed in the EDA industry, I led a few attempts to create things that were completely new. Some of them met with an element of success; the majority failed.

There were several reasons. Sometimes we didn't understand the problem well enough and simply got it wrong: the gains would not have been worth our customers' time or expense to adopt the solution. Other times I feel the gains really were there, but customers were too risk-averse to give it a try. And in some cases the solution was ahead of its time; a few of those have succeeded 20 years later.

But another problem often got in the way: the business side. The EDA sales force is a high-touch, high-cost channel driven by commission. Salespeople sell what is easy to sell and chase the large deals that move them quickly toward their bonus. That means that most of the time they are not interested in bringing new tools to market; it is not worth their time and effort.

This is where EDA startups come in. A startup's success sometimes comes not from the technology itself, but from demonstrating that the market exists. They have worked out how to make the initial penetrating sales and persuaded enough customers that the gains are real. That takes a different kind of sales force. Once the market has been demonstrated, it is easier for the large EDA companies to take it from there.

There are times when I do not think the industry ended up making the right set of choices. I am not going to point fingers, but verification is one of those areas, and there has been very little useful innovation in it for the past 20 years. Even before that, some of the innovation was, in my view, misinformed or done for the wrong reasons. The industry has been trying to make the best of a big mess. Verification is inefficient, ineffective, costly, time-consuming, and a drain on the industry.

It has been built on top of an overly complex set of languages and methodologies that put flexibility as the top priority. It has reached the point where developing something new requires building too much from scratch, yet can only hope to replace a small part of the overall problem. The rest must still be integrated with the remaining pieces of the methodology, limiting any real gains.

I have seen new languages, tools, and methodologies developed over that period that should have been successful because they were much better than what exists today. The semiconductor industry is partly to blame, because it has failed to change, to make the initial purchases needed to bring those new technologies into the mainstream.

Is open source the way to fix the situation, or will it just add more patches to a boat that is taking on water? The opportunity is that with RISC-V, enough people share the same problem that they can usefully talk about solving it collectively. All of them could gain if each makes a small contribution. Unfortunately, someone first has to come up with a brand-new way to approach the problem, and that will take real innovation. Until verification breaks out of the existing mold, it will not make any real advance; it will just barely keep up with the size of the problem.


Theodore Wilson says:

Thanks, Brian, for another excellent article.

I think there has been steady innovation, particularly when I compare the bug density in the smaller chips I started with to the bug density in more recent SoCs. So something is going right.

I suspect teams that adopt CI/CD flows from software teams, while informing those flows with all the data EDA tools provide, might find they are producing higher-quality products at a lower cost, and that they can finally afford to experiment and pivot quickly.

BillM says:

One of the biggest killers of something new is the internal sales force that focuses on the easiest path to commissions, bonuses, and trips. Why expend more effort on a low-'payback' product? The problem is that this kills new products that could significantly increase sales for the company. Startups MUST sell that product, since it is the only trick in their bag. Imagine what could have been imagined and developed at these large companies IF their own sales force were compelled to sell new products.

Mike Thompson says:

“I have seen new languages, tools, and methodologies developed over that period that should have been successful because they were much better than what exists today.”

Can you share your top three from this list?

DrZ says:

EDA and semiconductor companies are always looking for something “new”, but they are picky. First, the path to the “new” has to be as easy as possible for everyone in the adopting company: easy to understand, integrate, market, and sell. Unfortunately, this often boils down to obvious innovations in the same market and for the same users. Second, the “new” should preferably be a topic unrelated to anyone’s perceived domain of expertise, to protect careers. So very little of the “new” is left to pick from, if any at all. Making money or expanding into new markets with the “new” is mostly only a distant third requirement, but the first to justify inaction.
