The Good Old Days Of EDA

Working in an industry during its infancy was an amazing experience, but it came with a different set of challenges.


Nostalgia is wonderful, but there is something special about being involved in the formative years of an industry. Few people ever get to experience it, and it was probably one of the most fortuitous events to have happened in my life. Back in the early ’80s, little in the way of design automation existed. There were a few gate- and transistor-level simulators, primarily for test, and a few ‘calculators’ to help with design tasks such as minimizing Karnaugh maps. Quite amazingly, SPICE already existed for analog simulation. Can you believe it is 50 years old?

What made this period special was that you could have several ideas per day about new tools that could be created. Many of them could be prototyped in a week or two. Some took a little longer, but what became the most popular simulator of the day – Hilo – was the result of about 10 man-years of effort by several grad students, including Phil Moorby and Simon Davidmann. It took substantially more effort to properly commercialize it and maintain it, and that is where I joined the team.  The rate of development was amazing, even with six of us sharing a VAX 750, with its 0.5 MIPS, 1MB of RAM, and a total of 160MB of disk.

As an early member of the Hilo team, I did core development, maintenance, and field support. My territory was the U.S., and I would travel around the states for a month with my source tape in hand, load it onto customers' machines, and fix their bugs. Those fixes were taken back to home base and integrated with the fixes other people had made. Porting to a new operating system – and there were many of them in those days – typically took a day, thanks to a lot of good thinking and sound software design decisions. Talk about design for portability!
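To show what that kind of portability discipline looks like, here is a minimal sketch in C. It is purely illustrative – none of this is Hilo's actual source, and the wrapper names are invented – but the idea is the same: the core code calls only a thin layer of wrappers, so a port means rewriting one small file rather than touching the simulator itself.

```c
/*
 * Illustrative sketch only -- not Hilo's actual code.
 * Every OS-dependent call is hidden behind a small set of wrappers
 * (names are hypothetical), so the core stays identical on every platform.
 */
#include <stdio.h>
#include <stdlib.h>

/* Allocate memory for the core, with one shared out-of-memory policy. */
void *sim_alloc(size_t nbytes)
{
    void *p = malloc(nbytes);
    if (p == NULL) {
        fprintf(stderr, "sim_alloc: out of memory (%lu bytes)\n",
                (unsigned long)nbytes);
        exit(EXIT_FAILURE);
    }
    return p;
}

/* Open an input file; platform quirks (file modes, paths) live here only. */
FILE *sim_open_netlist(const char *path)
{
#if defined(_WIN32)
    return fopen(path, "rb");
#else
    return fopen(path, "r");
#endif
}

int main(void)
{
    /* The "core" below never touches the OS directly. */
    char *buffer = sim_alloc(1024);
    FILE *f = sim_open_netlist("design.net");
    if (f == NULL)
        printf("no netlist found -- nothing to simulate\n");
    else
        fclose(f);
    free(buffer);
    return 0;
}
```

With the platform-specific surface kept this small, a one-day port stops sounding implausible.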

Every year at DAC, you would be amazed to see what new things had appeared since the previous year. Of course, much of it was smoke and mirrors – at best, prototypes and demos being used to find out how interested people were, and whether it was worth finishing them. A recent conversation with a colleague made me realize how much has changed. It is tough being a startup in EDA these days, and not just because of business practices. Nobody is going to develop a new place-and-route or logic synthesis tool and have commercial success with it. The number of man-hours invested in the mature solutions presents a formidable barrier to any new entrant, no matter how good it may be. Perhaps a student can come up with a new algorithm and demonstrate its value, but success depends on it getting incorporated into an existing tool.

As the industry has matured, it also has become incredibly intermingled. Place-and-route is impacted by delay and power, which in turn affect thermal, which feeds back into timing. 3D development will only make that worse. Again, this makes the notion of a bolt-on new capability more than most students or colleges can contemplate.

The only place for new ideas is where disruption is happening, and today there are two such areas. One is packaging. Packaging has seen very little automation in the past, and now, with the rapid development of new packaging methodologies, it suddenly has become fertile ground. We can see quite a battle emerging to control this highly lucrative market. Foundries, OSATs, test companies, and EDA vendors all see it as a major new market and want to carve out their piece of it.

The other area is at the system level. EDA has tried to get into this space in the past. Companies have spent a lot of money on it, but have little to show for it. There are a few tools in the SystemC space, high-level synthesis, and some architectural analysis tools. The problem has always been that there were too few people actually doing this work to make the market large enough.

But that could be changing. RISC-V has shown there is an appetite for tools that help customize processors so that higher value can be obtained from them. We have seen the need for higher-abstraction verification tools, and there are a few startups in this area that are keeping their heads above water. Chiplets could mean that some companies become integration and software companies, and they would need tools for quick architectural exploration, chiplet selection, and interposer design. Would this just be an extension of existing PCB tools, or is there room for a new class of tool? High-level performance and power analysis would be an essential part of it.
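To make the exploration idea concrete, here is a hedged sketch – hypothetical chiplet names and numbers, not a real tool or flow – of the kind of quick what-if loop such an integration company might want: enumerate candidate chiplet combinations, estimate aggregate performance and power, and keep the best option that fits a power budget.

```c
/* Illustrative sketch, not a real tool: brute-force exploration of chiplet
 * combinations against a power budget. All names and figures are invented. */
#include <stdio.h>

typedef struct {
    const char *name;
    double perf;   /* relative performance contribution */
    double power;  /* estimated watts */
} Chiplet;

int main(void)
{
    /* Hypothetical candidates for one compute slot and one I/O slot. */
    Chiplet compute[] = { {"cpu-8core", 1.0, 15.0}, {"cpu-16core", 1.8, 28.0} };
    Chiplet io[]      = { {"io-basic",  0.1,  3.0}, {"io-fast",    0.3,  7.0} };
    const double power_budget = 32.0;  /* watts */

    double best_perf = 0.0;
    const Chiplet *best_c = NULL, *best_i = NULL;

    /* Enumerate every combination; keep the fastest one under budget. */
    for (size_t c = 0; c < sizeof compute / sizeof compute[0]; c++) {
        for (size_t i = 0; i < sizeof io / sizeof io[0]; i++) {
            double perf  = compute[c].perf  + io[i].perf;
            double power = compute[c].power + io[i].power;
            if (power <= power_budget && perf > best_perf) {
                best_perf = perf;
                best_c = &compute[c];
                best_i = &io[i];
            }
        }
    }

    if (best_c != NULL)
        printf("pick %s + %s: perf %.1f within %.0f W\n",
               best_c->name, best_i->name, best_perf, power_budget);
    return 0;
}
```

A real tool would add interconnect, thermal, and cost models on top of this loop, but the basic trade-off analysis is the same.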

It is unlikely that a new major EDA company will emerge, but it could mean a much better exit path for startups in the future, and that is what is required to get the work started. Existing EDA companies could decide to throw the dice themselves with in-house development, but they have shown themselves to be much better commercializers than innovators. While they have created amazing tools, including some from the ground up, it has always been more cost-effective to create them within a startup environment. Besides, they have a lot to do just to keep up with technology advances.

Going from idea to product is a much longer slog than it used to be. Some people probably enjoy that, but I miss the rapid flow of ideas, the turning on a dime, accepting failure quickly so you can move on, and the vibrancy of the good old days.



5 comments

Bill Martin says:

Brian,
The early days were a great time to be in a vibrant and growing industry. The industry ecosystem has matured but, as you mention, there are areas that are being re-invigorated, since many of the ‘nucleus’ functionalities have been codified for decades and new entrants face a tall cement wall that is difficult to break down to gain traction. Lots of hours, lots of working around bugs, and sleepless nights as you awaited your first silicon from the fab… possibly having your hair turn grey/white or fall out. But it was thrilling to see how semiconductor/EDA quickly morphed to ASIC designs, enabling many more companies and engineers to create their own silicon designs rather than purchasing off-the-shelf products and breadboarding them together.

Cheers and have a great and safe Holiday Season!
Bill

Brian Bailey says:

Happy holidays to you as well, Bill. I think we are entering a new good old age thanks to a raft of new areas, new problems, and new solution spaces with AI/ML.

Peter Flake says:

Have you looked at the paper or presentation on Verilog in the ACM HOPL V conference?

Eric Cigan says:

Hi Brian, we continue to see strong growth at the system level; much of that is occurring in the FPGA and programmable SoC space, but we’ve seen a boom of interest in using high-level synthesis with MATLAB and Simulink. Customers are using HDL Coder and Embedded Coder to target RTL and processor workflows, with SoC Blockset for hardware/software codesign. In the last year we’ve seen strong adoption of our MATLAB-to-SystemC workflow with Cadence Stratus HLS … so the system-level design space is most definitely on the rise.

Brian Bailey says:

Yes, I have read that paper, and it is certainly worth a read if anyone wants to know about the origins and thought process behind Verilog. Did you mean anything specific in relation to the blog?
