The Value Of Innovation

Successful innovation does not happen in a vacuum. Someone must see a considerable gain for it to get traction. Otherwise, it was just an interesting idea.


This week’s Design Automation Conference is all about what is new in the industry, both the challenges and the opportunities.

By the time this blog goes live, I will have moderated a panel about why EDA has not been open to disruption. While preparing for it, a number of thoughts came to mind. First, we have to remember that EDA is a business whose role is to support the semiconductor industry. EDA companies need to be profitable, and they need to provide what their customers demand. Of course, this is the perfect setup to be disrupted, according to Clayton Christensen, author of the popular book The Innovator’s Dilemma.

What the semiconductor industry has demanded is the ability to fill a chip with functionality at a somewhat fixed cost in terms of manpower. For that, it was willing to pay a certain level of ‘tax’. That tax rate stayed fairly constant for several decades, and so EDA revenue grew at approximately the same rate as the development-cost side of the semiconductor industry. Apart from a few periods where the tools fell behind, EDA vendors have generally succeeded in providing the necessary productivity gains.

Disruption requires a significant change, and that can come either from the semiconductor industry or from new EDA technologies being developed.

Disruptions to EDA

Along the way, various scaling trends broke down, such as single-core processor performance and memory access rates. These created unintended disruptions for designers. The entrenched industry would have been much happier seeing processor performance increase with every node. Instead, existing processor vendors were forced to invest in homogeneous multi-processor chips, while the rest of the industry used it as an opportunity to invest in heterogeneous multi-processor systems integrated into an SoC. But this and other such changes did not disrupt EDA, because they only called for incremental shifts.

I cannot think of any change in the design community that disrupted EDA. There have been a few attempts, such as the introduction of asynchronous design, which to some minds failed because EDA did not properly support it and no startup rose to fill the void. There has been continuous pressure to make analog design more predictable, or simpler, but nobody has been able to crack that problem. Manufacturing has supplied a long list of new issues and concerns with every node, and EDA has come up with the necessary incremental solutions.

Perhaps the best example of a disruption is the rise in popularity of the RISC-V processor. EDA has responded and is developing tools to help with the design and verification of those processors. Companies like Synopsys have started to support this even though they previously had their own proprietary ISA. Synopsys also purchased Imperas, one of the few startups that was developing tools for these cores.

Disruption within EDA

So, what about disruptive EDA innovations? There have not been many of those over the past couple of decades, and there is considerable argument about who is to blame. However, it seems to me that disruption was not necessary and potentially not wanted.

The EDA industry spent a lot of time and resources on the development of electronic system level (ESL) tools and flows. This was based on research that had started quite a long time before, and it promised improved productivity and the ability to handle more complex systems. It came at about the same time as the IP industry was taking off. This was a battle between top-down and bottom-up flows, and we all know which one succeeded. However, it may have only delayed the inevitable, because an increasing number of companies now want a more top-down focus.

Twenty years ago, verification changed from human-in-the-loop to model-in-the-loop. This was precipitated by constrained-random test pattern generation. It certainly disrupted the way in which verification happened in the industry, and it almost disrupted the EDA companies. A new entrant quickly took the lead and left big EDA quite a way behind. But the incumbents recognized what was happening and thwarted the upstart by creating a standard, SystemVerilog, that marginalized the startup's advantage. Exactly the same thing happened a couple of years ago with the creation of PSS. These were technical disruptions, but they did not cause a business disruption the way the change to RTL did.
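To make that shift concrete, here is a minimal sketch of the constrained-random, model-in-the-loop idea. It is written in Python purely for readability (the original disruption played out in dedicated verification languages), and the transaction fields, constraints, and device-under-test hook are all hypothetical.

```python
import random

def random_bus_transaction():
    """Generate one transaction subject to simple constraints."""
    # Constraint: word-aligned addresses inside a 4 KB window (illustrative).
    addr = random.randrange(0x1000, 0x2000, 4)
    # Constraint: bias toward writes (the weights are an arbitrary assumption).
    kind = random.choices(["write", "read"], weights=[3, 1])[0]
    data = random.getrandbits(32) if kind == "write" else None
    return {"addr": addr, "kind": kind, "data": data}

def reference_model(state, txn):
    """The 'model in the loop' that replaces a human judging correctness."""
    if txn["kind"] == "write":
        state[txn["addr"]] = txn["data"]
        return None
    return state.get(txn["addr"], 0)

# Drive the design and the model with the same random stream and compare.
state = {}
for _ in range(1000):
    txn = random_bus_transaction()
    expected = reference_model(state, txn)
    # actual = dut.apply(txn)    # hypothetical device-under-test interface
    # assert actual == expected  # checking is automatic, not human-in-the-loop
```

The point of the technique is that stimulus a human would never think to write gets generated automatically, while the reference model, rather than a person, decides what the correct response should be.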

Today, there is a lot of buzz around 2.5D and 3D design, but it is not clear if that will cause disruptions within EDA. It certainly will disrupt the packaging industry, and we see foundries and OSATs fighting over the newly created territory. It may also disrupt the IP industry, making silicon IP more attractive than soft IP, and with that will come a significant business change. But it is unlikely to disrupt EDA, which is cautiously developing the pieces of technology that may be required. I say cautiously because it is not yet clear if the semiconductor industry will demand everything that is being considered. For example, is place-and-route across vertically stacked dies needed in the near future?

A lot of people are talking about AI, and it is unclear at the moment if AI is better than the traditional algorithms that have been optimized over the past 30 years. Certainly, if AI could be trained on all the data that was used to make those optimizations, it would probably be better, but that data is no longer available. Only a very small fraction of it is, and to displace those algorithms AI has to be a lot better. When those tools are re-factored, which seems to happen every 10 or 15 years, I am sure their developers will look at adding AI. We also see a lot of AI being used to replace people in the loop for optimization flows. The big question here is, what end value do they provide? If you need more licenses, more compute power, and probably more time, how much improvement do you get for that? This is a transfer of costs from manufacturing to development, and while it may work for high-volume devices, it is less attractive for smaller production runs.
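A back-of-the-envelope calculation shows why volume decides whether that transfer pays off. Every number below is an assumption invented for the sketch, not industry data.

```python
# Hypothetical numbers, chosen only to show the shape of the trade-off.
extra_dev_cost = 500_000.0  # added licenses, compute, and engineer time ($)
unit_saving    = 0.02       # manufacturing cost saved per die ($),
                            # e.g. from a slightly smaller optimized die

break_even_volume = extra_dev_cost / unit_saving
print(f"Break-even at {break_even_volume:,.0f} units")  # 25,000,000 units

# A part shipping 100M units recovers the extra spend many times over;
# a 100k-unit run never does, which is why the trade favors high volume.
```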

Disruption happens for a reason: someone gets a considerable gain. This may come from a new market or a new product opportunity. Could a change in EDA create a new market? Possibly, if it could reduce development costs by 10X or more, but EDA itself accounts for only a tiny fraction of the total development cost. Would it create a market 10X the size it is today? I doubt it. Low-cost EDA has been tried before and failed. 10X better EDA may stand more chance.
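A rough budget split illustrates the difference between cheaper tools and better tools. The percentages below are assumptions made up for the sketch, not measured data.

```python
# Assumed split of a chip project's development budget (illustrative only).
budget = {
    "engineering_labor": 0.70,  # people dominate development cost
    "eda_tools":         0.08,  # the EDA 'tax'
    "ip_licensing":      0.12,
    "masks_and_other":   0.10,
}

# 10X cheaper tools only shave points off a small line item...
saving_cheaper = budget["eda_tools"] * (1 - 1 / 10)          # ~7% of budget

# ...while a 10X productivity gain attacks the dominant labor line.
saving_better = budget["engineering_labor"] * (1 - 1 / 10)   # ~63% of budget

print(f"10X cheaper tools save ~{saving_cheaper:.0%} of the budget")
print(f"10X better tools could save up to ~{saving_better:.0%}")
```

Under these assumed numbers, cutting tool prices barely moves the total, while a genuine productivity gain could, which is consistent with low-cost EDA failing where 10X better EDA might not.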



1 comment

Ron Lavallee says:

In my opinion, asynchronous design has not been disruptive to EDA not because it wasn’t supported, but because handshake asynchronous design is only incrementally better. EDA must have supported the attempt, because many PhDs worked on and improved this approach, but not enough to be disruptive. Chips were made, but disruption must be proven before it is disruptive. For many innovators, the innovation is the easy part!
