Time To Pay The Piper

EDA has been underpaid for so long that the necessary levels of investment have not been made. As a result, the tools that designers now need just haven't been thought about.


The Pied Piper of Hamelin is a German fable about a rat catcher who used his magic pipe to lure away rats. When the town refused to pay him, he used his pipe to lure away all of the town's children. I am not suggesting that exactly the same is true of the semiconductor industry and EDA, but I do not think semiconductor companies have paid enough, and they will now have to pay a hefty price.

Don’t get me wrong, this is not the fault of the semiconductor industry. The EDA industry did most of this to themselves. EDA companies were so interested in growth at any cost, by stealing accounts from each other, that they discounted way below the fair value of the tools. They trained the semiconductor companies to want and expect bargain basement pricing.

But this had unexpected knock-on effects. First, it made the environment for EDA startups less attractive. Startups face an uphill battle as it is, but having to compete on lowball pricing made things worse. Most ran out of money and got bought by the larger EDA companies for way less than they were really worth.

That led to less investment in EDA and fewer and fewer startups. In the long run, that means fewer new tools or innovative ways of solving the problems of the industry. The large EDA companies, in general, are far too busy maintaining and doing linear development on the tools they have. With each new problem that comes up at the smaller nodes, additional tools become necessary, and those get added to the increasingly large roster of tools in their portfolios. Recent examples include verification IP and power-aware design.

They are also investing huge sums of money in the newest nodes, and the jury is out on whether they will ever see a return on that investment, given how few semiconductor companies are likely to buy tools for those nodes. These investments, like the ones made by the fabs to develop the process, are made years ahead of the first customer tapeouts. Of course, EDA companies cannot stop that development. Otherwise they would lose customers, and with them all of the dollars, because semiconductor companies also have been trained to expect all-you-can-eat deals. If they failed on the newest nodes, they would lose business across all of the nodes.

EDA revenue has remained a relatively fixed percentage of semiconductor revenues, but staying at that percentage has taken a lot of investment, including performing most of the block-level design for systems in the form of IP. The size and complexity of those IP blocks has also been increasing. EDA has also gone way outside its core space in search of new revenue streams, looking at embedded software, security, system design and others. If you were to look at core EDA revenue alone, it has probably declined as a percentage of the industry it supports.

But the knock-on effects go even deeper. There is almost nobody doing research in the area of EDA, which means that even the groundwork has not been done on some of the things that the industry is now asking for. Universities are not interested in the area, and there are much more exciting things for research students to be working on. If the startup market had been healthy, that may not have been the case because it would have provided a way for them to monetize their research.

One such example came up while writing my DFT article. It is the lack of tools for analog test and the more fundamental work that is necessary to develop the fault models. A couple of EDA companies are looking into the creation of analog fault simulation, but without the research to guide them, the fault models will not have been proven or standardized. Thus, there is likely to be no compatibility between the tools in terms of the quality of results.

We have seen companies like Verific and Invionics selling to design companies in increasing numbers, as designers are being forced to create more of their own tools. At least those design companies will be reminded how expensive it is to create and maintain such tools.

The price that semiconductor companies will pay is not having the tools they want when they want them. EDA will find a way to create those tools if enough customers ask for them, but EDA is less likely to be speculative these days. That was the role of the startups.



3 comments

Kev says:

The EDA companies charge what the market will bear, and if anything they overcharge for crappy old software that doesn’t work very well.

The problem is that last piece: “EDA will find a way to create those tools if enough customers ask for them” – the customers don’t know what they really need, and neither do the EDA execs, so they’ve been stuck in a methodological rut for decades. Most of the EDA companies are a bunch of silos looking much like the companies they were when acquired, with minimal vision and a definite lack of synergy.

The next step in EDA is a move to software-defined-hardware, which will probably come out of the need to program the new AI/CNN hardware, and will likely be mostly open-source (on the design side).

Karl Stevens says:

Hi, Kev: One EDA vendor is bragging about synthesis speed. Whoop di do!
They have no understanding of logic design, which needs to be done before synthesis. Yes, there is a need for at least a good estimate of how much of the chip's resources are used at some point in the design, but doing synthesis, placement, and routing on every iteration is unnecessary. Boolean logic can be simulated without an HDL simulator.
Get the logic right then do the physical design. (duh?)
I cannot find a tool that is useful for logic design because the tools only focus on physical design. How sad.
Especially now, with designers building systems and using IP, the lack of tools for interconnections and interfaces really hurts.
I was right in the middle of architecting, designing, and building the I/O for IBM's first microcontroller. And 30 years later it annoys me that the so-called "design tools" have skipped the most important part of design.
The problem goes back to the early days when it was a miracle that an HDL could be SIMULATED!
Twenty years before that I simulated the memory bus for an IBM mainframe using GPSS. That was about 15-20 years before the first synthesis tools. (The bus did work correctly.)
At that time, EDA meant logic diagrams printed by computer and printed circuit board wiring done by computer.
I also simulated asynchronous logic besides the memory bus that of course used asynchronous hand shaking.
So here are some things I want/need:
A simple syntax to enter Boolean expressions to control state changes and to initialize/create timed value assignments.
A modular structure so the compiler can do the interconnections and construct modules.
An event/time driven model that has single step/cycle.
Capability to display node values at breakpoints.
Ability to format HDL source for physical design under user control to get a resource estimate and for final build.
And if anybody cares I have a running prototype. So much for the view that no one knows what they want/need.
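[Editor's note: the commenter's prototype is not public, but the core idea — simulating Boolean state logic without an HDL simulator — can be sketched in a few lines. Everything below (the `step` function, the 2-bit counter example) is a hypothetical illustration, not the commenter's tool.]

```python
# Minimal sketch of cycle-based Boolean simulation without an HDL simulator.
# Each register is a named bit; its next-state value is a plain Boolean
# expression over the current state. One step() evaluates all expressions
# synchronously, like a single clock edge.

def step(state, next_funcs):
    """Advance every register one cycle, evaluating all next-state
    expressions against the *current* state (synchronous update)."""
    return {name: fn(state) for name, fn in next_funcs.items()}

# Example: a 2-bit counter with an enable bit, as Boolean expressions.
next_funcs = {
    "b0": lambda s: s["b0"] ^ s["en"],                # toggle low bit when enabled
    "b1": lambda s: s["b1"] ^ (s["b0"] and s["en"]),  # carry into high bit
    "en": lambda s: s["en"],                          # enable holds its value
}

state = {"b0": False, "b1": False, "en": True}
for cycle in range(3):
    state = step(state, next_funcs)
    print(cycle, int(state["b1"]), int(state["b0"]))
# Counts 01, 10, 11 over three cycles.
```

Single-stepping, breakpoints, and node display then fall out naturally: the whole design state is just a dictionary you can inspect between calls to `step`.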

Kev says:

I started this project – http://parallel.cc – after I couldn’t get support for asynchronous design in SystemVerilog.
