Abstraction Aging

Do engineers find comfort in details as they get older, and is abstraction for younger minds?


While doing interviews for my article on system simulation and abstraction, I spoke to several people who, just like myself, had started their careers pushing abstraction. At the time, we were all frustrated that the industry didn't move fast enough. The advantages of abstraction appeared to be so clear. Everyone developed slides showing that the cost to fix bugs increased the further you went through the process, or that the biggest optimizations are to be made at the highest levels of abstraction. People still create those slides today, and they are true, except they do not factor in cost.

We have really had only one success when it comes to an industry-wide change of abstraction. This was when digital design moved from the gate level to the Register Transfer Level (RTL). It was fresh young graduates who took up the cause with RTL synthesis and showed that they could create designs just as good as those of the most experienced gate-level designers, and they could do it in a fraction of the time.

Cadence invested large sums of money into higher levels of abstraction but obtained very little commercial success. University research drove the push to C-level modeling and SystemC, but adoption of that has not seen the same kind of success as RTL. It has obtained some, primarily for the creation of virtual platforms, and high-level synthesis has proven to be an invaluable tool for some types of block.

I started my career developing and promoting the notions of RTL modeling. Before that, everything had been at the gate level, and synthesis was still many years away. Most of these models were being used for board-level design. Chip design remained at the gate level because of the lack of synthesis. The migration from gate to RTL was difficult for many people. They created RTL models that were less efficient than the gate-level models, basically because they were trying to describe gates using RTL constructs. It was difficult not to laugh at times.
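To illustrate the kind of mismatch described above, here is a hypothetical sketch of the same 2-to-1 multiplexer written two ways in Verilog: first in a gate-by-gate style that carries gate-level habits into RTL, then in the idiomatic behavioral form that lets synthesis choose the gates. The module and signal names are my own invention, not from the original article.

```verilog
// Hypothetical illustration: the same 2-to-1 mux written two ways.

// Gate-level thinking forced into RTL: every gate spelled out by hand.
module mux2_gate_style (input a, b, sel, output y);
  wire sel_n, and0, and1;
  assign sel_n = ~sel;        // inverter
  assign and0  = a & sel_n;   // AND gate for the "select a" path
  assign and1  = b & sel;     // AND gate for the "select b" path
  assign y     = and0 | and1; // OR gate combining the two paths
endmodule

// Idiomatic RTL: describe the intent; synthesis picks the gates.
module mux2_rtl (input a, b, sel, output y);
  assign y = sel ? b : a;
endmodule
```

Both modules are functionally identical, but the first fixes the gate structure by hand, while the second states intent and leaves the implementation to the tool, which is the shift in mindset the migration required.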

With RTL well established, I spent many years looking at higher levels of abstraction. But no matter how good new technology looked, the industry turned away from it. In part, the problem is that the proposed abstractions above RTL are too large a leap, such that it becomes difficult to bridge the divide. That makes it difficult to relate what comes out of a tool to what went in, and that makes people uncomfortable.

An abstraction means that some information is missing, and too often what was missing turned out to be important for analysis. When the accuracy loss approaches the fidelity of the model, you can no longer make reliable decisions at the higher level or guarantee the quality of what would be produced downstream. If there is any chance that what you thought was an improvement, as recommended by a tool, turns out to have made things worse, nobody is going to use the tool.

At the same time, another thing was happening in the industry. Abstractions started breaking down as we moved to finer nodes. For example, RTL assumed that inter-clock timing could be abstracted away and that static timing analysis would find any problems that might limit the frequency of operation. It was an RTL problem, and it was solved at RTL, but only until wire timing became significant. Timing then became dependent on placement, and what had been a waterfall process became more complicated. The same is true for power analysis. Clock distribution accounts for a significant part of total power, and this can only be approximated at RTL, let alone at higher levels of abstraction.

We are seeing another breakdown in abstraction today. Wire timing used to be modeled from just the R and C on a wire. That is no longer true for some circuits and some technologies, where the L becomes something that can kill your design. Each of these new technological problems, created by progressing to newer nodes, breaks the abstractions, and it takes a lot of work to fix them.

Magdy Abadir, VP of marketing for Helic, went through a similar abstraction migration. "I used to be promoting abstraction and doing everything there, but here I am today, leading a company that is pushing in the other direction. As you get older, you tend to concentrate more on details."

Abadir also looks at some of the recent security issues within processors and has some thoughts about that. “All the problems, in the various disciplines, have been in areas that are not visible when you start from a high-level model. These abstractions do not fully define everything that happens.”

The industry continues to be pulled in both directions. New technology nodes are forcing us to consider greater levels of detail. At the same time, the growing complexity of systems means that no one can fully comprehend what is going on within them, and thus abstract analysis is essential. Concurrency is not something the human mind is good at dealing with, so we must rely on tools to help with that analysis.

The limiter has always been cost: the cost to create models, the cost to verify them, the cost to maintain them. If the value of what is gained by having them does not swamp those costs, then they are hard to justify. At some point, systems will become so complex that such models will be essential, but so far, we have managed to find other ways around the problem.


Jim Bruister says:

Very good "thought" article. I've been an advocate of the EDA industry moving up the abstraction level. This gives a good explanation of why it hasn't happened.
