A Price To Be Paid

No change is free, but sometimes the cost cannot be quantified well enough to make a decision.


Ancient wisdom says you should be careful what you ask for, because you just might get it. This was certainly true many times during my career within EDA, and I am sure it is still happening today. Sometimes the outcome was not what was wanted, or the price was higher than expected. As an example, consider VHDL, the language that was meant to correct the problems of Verilog.

One of the problems with Verilog was that the simulation semantics were not well defined for zero-delay propagation. Simulators could evaluate events in different orders and sometimes reach different results; even recompiling and running on the same simulator might produce a different outcome. VHDL fixed this, but doing so added complexity to the simulation loop, and thus simulations ran slower all the time, not just in the rare cases that actually hit the problem.
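To make that concrete, here is a minimal sketch of the kind of zero-delay race being described, written as a small self-contained SystemVerilog module (the module and signal names are illustrative, not from the article). Two processes trigger on the same clock edge; one writes 'a' with a zero-delay blocking assignment while the other reads it, and the language does not say which runs first, so different simulators, or even different compilations, can print different results.

module race_example;
  reg clk = 0;
  reg a = 0;
  reg b = 0;

  // Free-running clock for the example.
  always #5 clk = ~clk;

  // Two processes sensitive to the same event, using blocking assignments.
  // The standard does not define their relative execution order.
  always @(posedge clk) a = ~a;   // updates 'a' in zero time
  always @(posedge clk) b = a;    // reads 'a' in the same time step

  initial begin
    repeat (4) @(posedge clk);
    #1 $display("a=%b b=%b", a, b); // the value of 'b' is simulator-dependent
    $finish;
  end
endmodule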

Users said they wanted the ability to define value sets outside the language, as a library. While this may have looked like a good extensible-language capability, it wasn't really required, and its inclusion made simulations run slower. SystemVerilog went in the opposite direction and defined all of the value sets directly in the language; one of them, the 2-state bit type, led to an order-of-magnitude speedup. We all know the result: today VHDL is a niche language used in a small number of market segments, such as FPGA-based design.
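For the value-set point, a short sketch (again with illustrative names) contrasts SystemVerilog's 4-state 'logic' with the 2-state 'bit' credited with the speedup. A 2-state type never has to represent or propagate X and Z, which is what lets a simulator pack and evaluate it so much faster.

module value_sets;
  logic [31:0] four_state;  // 4-state: 0, 1, X, Z; initializes to all X
  bit   [31:0] two_state;   // 2-state: 0, 1 only; initializes to all 0

  initial begin
    $display("logic init = %h", four_state);  // prints xxxxxxxx
    $display("bit   init = %h", two_state);   // prints 00000000

    four_state = 'z;               // a 4-state type can hold Z
    two_state  = four_state;       // X/Z values collapse to 0 in 2-state
    $display("bit after z = %h", two_state);  // prints 00000000
  end
endmodule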

The industry has always wanted faster simulation. Researchers and EDA companies have asked, "What are you willing to give up?" They tried giving up timing back in the days of gate-level simulators and produced lightning-fast unit-delay simulators. Nobody wanted them, often because there was one place in their design that broke the rules. In the early RTL days, cycle simulators and accelerators were created, but nobody wanted them either, because there might be one case in their design that broke the rules.

With the advent of design IP, the amount of design time that could be saved was touted, and only lip service was given to the impact it would have on verification. The claim was that the design would be of higher quality and require less verification, but we all know that none of that was true. The IP industry was not interested in providing a high-level functional model of the IP and taking on the task of guaranteeing that the model and the implementation matched. Even today, few IP companies are willing to do that, so verification is stuck with having to verify using an implementation model of the IP.

New design languages have been developed and presented to the industry — ones that would have directly targeted the fundamental constructs of hardware and would have had an enormous impact on the time that would be spent in verification — but none has been successful.

Even high-level synthesis, which could demonstrate superior productivity and quality of results for certain types of block, struggled initially because it did not fully address the verification methodology required to go along with it. While the design input language was now C, no C-based verification flows existed.

Verification has never been seen as anything other than a cost burden, and that is why I believe verification is so expensive today. It takes more people and more time to get to an acceptable risk level, and there is nothing on the horizon to change that. Verification technology has advanced significantly, but it is dragging along such a huge anchor that making any significant headway is nigh impossible.

The industry is so ingrained in what it does that even a revolutionary approach is unlikely to succeed. Back in the late '80s, when RTL synthesis was beginning to be accepted, it was easy to put the old and new methodologies side by side and compare the results and the productivity. The synthesized results were inferior to what the best designers could produce at the time, but synthesis provided a 10X or more productivity boost. What new language, tool or methodology could we set against what we have today that would provide that sort of comparison?

The EDA industry has tried several times. While clearing out some old records, I recently came across a 100-page report I compiled while working for one of the major EDA companies. It was a plan to develop an Electronic System Level (ESL) tool suite and methodology that would advance both the design and verification aspects of the methodology equally. It was based on existing tools within the company, some new technologies that needed to be acquired, some research programs that would be funded, and an insertion plan that showed how to demonstrate the effectiveness of the methodology.

It was always that last part that presented the problems, even with companies that were actively pushing us to head in that direction. It was their internal research groups that wanted to make the changes, but those groups were notoriously unsuccessful at persuading their own development teams to even take a look, let alone adopt it.

Change is tough, and so long as existing methodologies do not completely fall apart, they are unlikely to see significant change. It is easier and safer to look for incremental changes and patches to what is already in place.



1 comment

Jim Lewis says:

You referred to FPGA as the niche market. A number I heard some time back is that there are something like 30 FPGA starts for every ASIC start?

Do you have a different number? Because I see design as being either ASIC or FPGA. Hence, in terms of the number of designers, it would seem that perhaps FPGA is dominant.

Or are you talking in terms of dollars spent in the market? In which case ASIC tools and SystemVerilog simulators are clearly more expensive than FPGA tools and VHDL simulators?
