Evolution Vs. Revolution

First of two parts: EDA vendors insist tool and design flow changes need to be evolutionary. What if major advances could be realized with new tooling? Would users accept that?

In the electronic design automation industry, changes to tools and flows are nearly always evolutionary. Evolutionary changes hide as much of the change from the user as possible, which makes them easier to justify from an ROI perspective, and they raise far fewer objections from users, who don’t have to spend time learning new technology or rethinking tried-and-true approaches to problems.

Revolution in chip design has happened before, observed Rob Aitken, R&D fellow at ARM. “Logic synthesis was definitely a revolutionary way of doing design. That required a massive change in the way that people did things. It also required a changing of mindset because initially the idea was you could synthesize but your results wouldn’t be quite as good as somebody sitting down there by hand. Over the course of a few years people came to the realization that it was actually better than something you could do by hand.”

Aitken believes another revolution is possible with EDA today, but there are some things that would be problematic. First is the huge investment in existing flows. “There is expertise out there and there are massive amounts of scripting and infrastructure built around existing flows, so any disruption to that would require rewriting fairly large chunks of that infrastructure. Another challenge to a revolutionary flow is signoff. Right now there is an agreement between design companies and foundries in terms of when we sign off a design, using this flow, with this methodology and these models, and we declare that now this will work. Anything that fails this is essentially a yield problem and needs to be fixed, anything that violates timing could be a design problem, and so on. There is a set of arrangements that exists, so a revolutionary design flow would require changing that. That’s not impossible, but the benefit of it would have to be there to justify making that effort.”

One of the areas where the difference between evolution and revolution is most apparent involves stacked die. A 2.5D configuration is evolutionary, while a full 3D stack with logic on logic is revolutionary. And just to muddy the difference, there are gradations in between that mix some evolution with some revolution, such as the Hybrid Memory Cube, which combines a thin layer of logic with multiple memory die.

“When you look at a 2.5D design, you can build that with an existing design flow,” said Mike Gianfagna, vice president of marketing at eSilicon. “Xilinx did that. It’s incremental. When you look at full 3D, there’s a whole raft of new issues that you never had to deal with, such as stress management, thermal management and chip-level density.”

Testing on 2.5D likewise is relatively straightforward because there are exposed contact points. In a vertical stack, contact points for the tester may be hidden, which requires an entirely new test methodology. In some cases built-in self-test can suffice, but in others testing requires a convoluted signal path that starts on one side of the package and ends on the other.

“The key is being able to access the die,” said Stephen Pateras, product marketing director for silicon test products at Mentor Graphics. “If it’s a separate die or separate wire bonding or interposers, that’s simple. If it’s one die to the next, or the TSV, the problem is similar. But the key is being able to access the die.”

And therein lies the difference, at least insofar as test is concerned, between a 2.5D and 3D stack, and the definition of evolution and revolution. While the tools and approaches may be evolutionary, the actual application of those tools requires a radically different approach.

Technology adoption
Convincing users to accept new technology can be a challenge, observed Frank Schirrmeister, group director for product marketing of the System Development Suite at Cadence, who coined ‘Schirrmeister’s Law’ along these lines. “Schirrmeister’s Law says the likelihood of a new technology being adopted by a project team is inversely proportional to the number of changes you ask the project team to make. That’s why revolutionary technologies are hard.”

He said that when Virtual Component Co-Design (VCC) was launched, it required users to change everything. “All of the software developers would have to use a new modeling style and a new model of computation; the hardware developer and the IP developer would have to express their IP in a certain way, and if you do all that, life will be perfect. Well, the funny thing is we even brought together the right people… But it turns out it’s not realistic to change all that. To ask a team to make all those revolutionary changes is very hard.”

This goes back to the issue of certain technologies being ‘good enough’ to take over from the previous technology in a more evolutionary manner, Schirrmeister said. “You have to start taking on some of the capabilities of the previous technology. For example, if your design is small enough, if it fits into one FPGA, and if all you need to do is some basic software debug, then FPGA-based prototyping might be fine.”

The way Hamendra Talesara, verification lead at design house Synapse Design, sees it, revolutionary changes are happening today to some degree. He points to technologies such as SystemC and TLM 2.0 as revolutionary. “It’s really market-driven. Since the market is small, a very small subset of our engineers has the SystemC/TLM 2.0 expertise to support the segment, but that is necessary to support and grow it. As we penetrate and convert more customers, more folks are getting trained in this area. It really is the job of the design partner to do whatever the market requires. We try to be at the leading edge, but not necessarily at the bleeding edge.”
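For readers less familiar with that modeling style, the sketch below shows what a minimal loosely timed SystemC/TLM 2.0 model looks like: an initiator standing in for a processor writes to, and reads back from, a simple memory target through blocking transport calls rather than pin-level signals. The module names, the address, and the 10ns access latency are illustrative assumptions, not details from any particular flow.

    // Minimal loosely timed SystemC/TLM 2.0 sketch: an initiator talks to a
    // memory model through blocking transport calls instead of pin wiggling.
    // Names, address and latency are illustrative assumptions.
    #include <systemc>
    #include <tlm>
    #include <tlm_utils/simple_initiator_socket.h>
    #include <tlm_utils/simple_target_socket.h>
    #include <cstring>
    #include <iostream>

    struct Memory : sc_core::sc_module {
        tlm_utils::simple_target_socket<Memory> socket;
        unsigned char storage[256];

        SC_CTOR(Memory) : socket("socket") {
            std::memset(storage, 0, sizeof(storage));
            // Register the blocking-transport callback used by loosely timed models.
            socket.register_b_transport(this, &Memory::b_transport);
        }

        void b_transport(tlm::tlm_generic_payload& trans, sc_core::sc_time& delay) {
            unsigned addr = static_cast<unsigned>(trans.get_address()) % 256;
            if (trans.is_write())
                storage[addr] = *trans.get_data_ptr();
            else
                *trans.get_data_ptr() = storage[addr];
            delay += sc_core::sc_time(10, sc_core::SC_NS);  // assumed access latency
            trans.set_response_status(tlm::TLM_OK_RESPONSE);
        }
    };

    struct Cpu : sc_core::sc_module {
        tlm_utils::simple_initiator_socket<Cpu> socket;

        SC_CTOR(Cpu) : socket("socket") { SC_THREAD(run); }

        void run() {
            tlm::tlm_generic_payload trans;
            sc_core::sc_time delay = sc_core::SC_ZERO_TIME;
            unsigned char data = 42;

            trans.set_data_ptr(&data);
            trans.set_data_length(1);
            trans.set_streaming_width(1);
            trans.set_byte_enable_ptr(nullptr);

            // Write one byte, then read it back, through ordinary function calls.
            trans.set_command(tlm::TLM_WRITE_COMMAND);
            trans.set_address(0x10);
            socket->b_transport(trans, delay);

            data = 0;
            trans.set_command(tlm::TLM_READ_COMMAND);
            trans.set_response_status(tlm::TLM_INCOMPLETE_RESPONSE);
            socket->b_transport(trans, delay);
            std::cout << "read back " << int(data) << " after " << delay << std::endl;
        }
    };

    int sc_main(int, char*[]) {
        Cpu cpu("cpu");
        Memory mem("mem");
        cpu.socket.bind(mem.socket);  // initiator-to-target binding
        sc_core::sc_start();
        return 0;
    }

Because the transactions are plain function calls rather than clocked signal activity, a model like this simulates far faster than RTL, which is what makes the style attractive for early software development and architectural exploration.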

If an entirely new flow is too much to bear, perhaps it would be more palatable to look at revolutions in parts of the design flow, namely the system-level space.

Aitken believes there is some truth to this. “If you were going to look for a place for the next revolution, I think that would actually be a good place to look, in terms of taking something higher level – high-level synthesis or whatever your favorite approach is for converting something at a higher level of abstraction to RTL and then having that go through the normal design flow. There is scope for revolution there, but to me it’s very reminiscent of where logic synthesis was in the ’80s – it was something that kind of worked, but it didn’t do everything that you wanted it to.”

“It seems to me that the challenge is that there are a number of things you can do with hand-generated RTL that, so far, automatically generated RTL is not as good at doing. Certainly this is important business to ARM. We have all sorts of people who spend their days optimizing RTL,” he added.
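To make the abstraction gap concrete, here is a rough sketch of the kind of untimed C++ a high-level synthesis tool consumes and schedules into pipelined RTL. The FIR filter itself and the pipeline pragma, written here in Vivado HLS syntax, are assumptions chosen for illustration; other HLS tools use equivalent directives.

    // Illustrative untimed C++ of the sort fed to a high-level synthesis tool,
    // which schedules it into pipelined RTL. The pragma follows Vivado HLS
    // conventions and is an assumption; other tools have equivalent directives.
    #include <cstddef>

    // 4-tap FIR filter: plain sequential C++ here; a shift register plus a
    // multiply-accumulate datapath after synthesis.
    void fir4(const int coeff[4], const int in[], int out[], std::size_t n) {
        int taps[4] = {0, 0, 0, 0};

        for (std::size_t i = 0; i < n; ++i) {
            #pragma HLS PIPELINE II=1   // ask the tool for one new sample per clock
            // Shift in the newest sample.
            taps[3] = taps[2];
            taps[2] = taps[1];
            taps[1] = taps[0];
            taps[0] = in[i];

            // Multiply-accumulate; the tool decides how many multipliers to spend.
            int acc = 0;
            for (int k = 0; k < 4; ++k)
                acc += coeff[k] * taps[k];
            out[i] = acc;
        }
    }

The scheduling decisions a hand-coder would make explicitly in RTL, such as how deeply to pipeline and how many multipliers to instantiate, are left to the tool here, which is exactly where automatically generated RTL can still trail a skilled engineer.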

At the end of the day, with many of today’s EDA algorithms running out of steam, the industry eventually will reach a tipping point and need to embrace new tooling.

Part two of this series will address ways to reduce the risk of new tool adoption.


