Dealing With Market Shifts

No tool lasts forever, but sometimes it isn’t the anticipated changes that cause the problems. It’s the ones that blindside you.

Back in my EDA development days, I was taken in by the words of Clayton Christensen when he published “The Innovator’s Dilemma.” He successfully introduced the technology world to the idea of disruptive innovation. One of the key takeaways was that you should always be working to make your own successful products redundant, or someone else will do it for you.

One tool I worked on certainly fell into that situation. We had gone from obscure new technology to market leader in just a few years, and it was almost the poster child for the company. But then the rug was pulled out from under us, and the product fell just as quickly on the way down. It was not a new competitor that dethroned us. It was a shift in the market and the introduction of a new standard.

Was there anything that the team could have done differently? I don’t think so, even in retrospect, but it goes to demonstrate that advancing a tool in a technological sense is not always enough. You have to understand how designs, markets, and developer concerns change.

One issue that has loomed over the industry is the long-overdue move up in abstraction for hardware design. It first appeared necessary 20 years ago, but the bullet was dodged by the adoption of a reuse philosophy, which limited the amount of new RTL going into any design. As the industry’s assembly language, RTL was adequate. It was well proven for block development and enabled a fair degree of encapsulation. So why shift?

But IP reuse did nothing to solve the verification problem. While you only had to redesign a small fraction of a chip, you still had to verify all of it. Blocks continued to get larger, and verification kept up with that through a lot of new tools, languages, and methodologies. At some point, those blocks have to be integrated together and verified at that level, and then integrated again into larger sub-systems. Eventually you have the entire system described in RTL, and at each integration step you have to run longer and longer tests in order to do anything useful.

Emulation saved the day when simulation performance stagnated as single processors stopped getting faster. Simulation algorithms do not port well to multiple processors and still achieve only modest speedups. Early emulators were not what they are today. They were incredibly difficult to set up and use.

EDA companies saw the opportunity and poured R&D into overcoming those weaknesses, ensuring a good return on investment, and they have managed to keep emulation scaling as design sizes increase. But what, if anything, will dethrone it?

I covered many of the technical aspects in a story today, but part of me has a nagging feeling there is a big change coming to the industry that could upset the apple cart. It is not that there will be a better emulator or prototyping solution. It is that the fundamental way in which systems are built is changing.

It is no surprise that an increasing number of designs are becoming defined by their software. Software paradigms are changing to utilize new development platforms that are being built both for cloud deployment and for the AI space. That means software will be easily reconfigurable onto any number of platforms, and thus the timing dependence between them will have to be broken. Sure, there may be some low-level software, such as drivers, that is dedicated to each hardware platform, but the interface between hardware platform and software is moving up in abstraction.

The real smarts are now happening in the “compiler” path from those high-level frameworks onto arbitrary hardware platforms.

The architecture of the hardware is changing along with this. Today we are seeing amazing new chips being produced that are huge arrays of homogeneous or heterogeneous processors connected by regular communications structures, often implemented as a network on chip (NoC).

This makes me think back to formal verification and some of the steps it had to take to deal with complexity. If you have a chip that is a 30 x 30 array of processors with a regular interconnect, you never need to verify the entire thing. You can reduce the problem and still have enough to evaluate every interesting property of the system. In formal, they called this abstraction. They reduce the size of many things in the design, from counters, to the amount of replication, to the size of memories, all intended to make analysis easier and faster and to shrink the state space. Of course, care has to be taken that interesting behaviors are not precluded.
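As a toy illustration of that reduction idea (a hypothetical sketch, not tied to any particular formal tool), a property of a wide counter can often be checked exhaustively on a scaled-down version that preserves the same wrap logic:

```python
# Toy illustration of formal-style abstraction: shrink a design
# parameter (here, a counter's width) until the state space is small
# enough to enumerate exhaustively, while preserving the behavior of
# interest (the wrap-around logic).

def counter_wraps_once(width: int) -> bool:
    """Exhaustively check that a width-bit counter, incremented
    2**width times from zero, wraps back to zero exactly once."""
    limit = 2 ** width
    count = 0
    wraps = 0
    for _ in range(limit):
        count = (count + 1) % limit
        if count == 0:
            wraps += 1
    return wraps == 1

# Checking a full 32-bit counter would take ~4 billion steps; a
# 4-bit reduction exercises the same wrap logic in just 16 steps.
assert counter_wraps_once(4)
```

The trust placed in the reduced model is exactly the care mentioned above: the reduction must not preclude the behaviors you set out to verify.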

If software does become decoupled from hardware, then a lot more software verification will become virtual. Gone will be the need to perform runs on cycle-accurate hardware. Gone will be the need to evaluate the performance of the hardware relative to the software. Only the resulting performance that the software has on the hardware will matter. If that is not good enough, you will exchange the hardware platform or tweak the compile options to obtain the needed performance.

I am not suggesting the role of emulation goes away. But the need for ever-larger boxes may be ending. Software runs will be limited, and hardware abstraction (in the formal sense) will become a way to fit all necessary capability into modest box sizes. Abstraction in the modeling sense also will mean that increasing amounts of the problem can be handled using virtual prototypes coupled to emulators.

Add new manufacturing trends into the mix as well, and chiplets turn what was soft IP into physical IP. It should be possible to plug these into the emulator, just as the hardware modelers of 30 years ago enabled real chips to be used with simulation.

There is still a lot of ground for emulation to cover, and the recent forays into integrated analysis will ensure emulators keep adding incremental value for hardware verification tasks. I don’t see the implementation abstraction changing in the foreseeable future.


Daniel Pane says:

I really enjoyed reading Clayton Christensen’s book, The Innovator’s Dilemma. The only problem was that at the end of the book he applied his framework to the electric vehicle market, and got it totally wrong, uh oh.
