Second of two parts: Truly revolutionary changes in EDA are not likely for a variety of reasons…or are they? The answer depends on whom you ask.
On the surface, revolutionary changes in EDA seem unlikely, given the risk of replacing costly tools, flows and methodologies. But are they really? The answer depends on whom you ask. For Part One, click here.
Risk is a big part of the equation here. “There are always pioneers in an organization, and what you need to do is find someone who is willing to take some risk — typically on a design that’s important but not mission-critical,” said Wally Rhines, chairman and CEO of Mentor Graphics. “And the problem is those people who are willing to take risks are the most talented designers, and they’re usually tied up on the mission-critical projects. So it’s really hard to get somebody who has both the willingness and the competence to give a serious evaluation to something new and try it out — and sometimes work through the problems that exist in a new approach.”
He noted that in seeking out the right risk takers, he and his team look for pioneers within a company, as well as for problems that are challenging those groups. “The best of all cases is when a design group has a critical program that they absolutely cannot complete with the existing methodology or, unfortunately, when a big disaster happens – and we’ve had some of those. One company that did not use our physical verification software had an electrical overstress problem — they were running leads over thinner oxide without knowing it, and it led to a big, big product failure. That’s when the door opens to new capabilities.”
Oz Levia, vice president of marketing and business development at Jasper, agreed. “It inevitably starts with trailblazing customers, and it may be different ones in different generations. But if you look around and think back to the first users of random constraints, the first users of synthesis, the first users of any evolutionary methodology that eventually produced a revolution — it might not be the same company each time, but it’s always somebody that is trailblazing. It’s a strong company that sees technology as a strategic advantage, sees EDA as a strategic advantage rather than a cost center, and is taking risks in order to gain a competitive advantage.”
The payoff quickly becomes apparent when either startups or big companies latch onto innovation and produce a result. Moreover, once the economic benefit is shown, it’s very difficult to stop, he said. “That is the case with many of the successful evolutions – whether it’s timing analysis or synthesis or emulation or RTL simulation or gate-level signoff. All these things that today look obvious weren’t there, and somebody took a risk.”
However, Saleem Haider, senior director of marketing for physical design and DFM at Synopsys, pointed out that revolution also can be approached by preserving the look and feel of existing tools so the user doesn’t notice a difference. It looks evolutionary, but actually isn’t. With its latest version of IC Compiler, he said, “the internals – how we approach the solution – are very much revolutionary. I don’t know in totality where it falls, but if you look at the concept of revolutionary, the idea has resonance with people. The reason a revolutionary solution is attractive is that it will enable a quantum leap in productivity, in value, and in what you can do. There also is an implication that revolutionary solutions and revolutionary introductions somehow require everything to be very different, and that’s the part I’m questioning – whether that indeed needs to be the case every time. Certainly in our case, we managed to keep the interfaces the same while providing a quantum leap forward, which will in some way revolutionize how people do physical design, because they’ll be able to do things at a level that’s not really feasible right now.”
Still, if you want a revolution, make sure you pick the right side, said Bernard Murphy, CTO at Atrenta. He noted a couple of attempted revolutions in the vein of grand unification theories of SoC design that haven’t worked out: implementation from SystemC and implementation from IP-XACT. Those, he suggested, are cautionary examples worth keeping in mind.
Further, as the industry continues pushing along the curve with what amounts to a fairly standardized synthesis, place-and-route flow, the cost of revolutions goes up, noted Rob Aitken, R&D fellow at ARM. “It might have been easier to create a revolutionary flow in 1998 than it is now, but the interesting other angle goes back to the Gordon Moore quote, ‘No exponential is forever, but we can delay forever.’ We are succeeding in delaying it for some time, but eventually we will not be able to delay it anymore, and at that point the cost of revolutions will start coming down again.”
Here, a lot of it is simply risk versus reward. “Most people would prefer to take a smaller risk for a known reward than a massive risk for an unknown, possibly negative reward,” said Aitken. “Nobody ever got fired for advocating synthesizing their next design. That means the promising new technologies lurk there, waiting for somebody to do something with them that can’t really be solved any other way. An example of a technology in that position at this point would be 3D integration. There are a lot of things you could potentially do with stacked logic die, but so far none of them warrants the risk of trying when there is some alternate way — using an interposer or something — that allows you to accomplish something similar at slightly less risk. But if you look at the imaging chips, they are figuring out various clever things they can do with stacked die that they couldn’t do any other way, which allows them to build better imagers. They are starting to make those kinds of things happen, and doing that in the imaging business eventually may make the technology less risky in the logic business. If you go back to the beginning of this and say, ‘How am I going to sell a revolution to my boss?’ — the way to sell it is revolutionary results, but without actually doing anything revolutionary.”
To answer directly the question of how to reduce the risk of adopting a new methodology and/or new tool, Levia concluded, “I’m really not sure, because I don’t think there is any one entity that is smart enough to conceive, ‘This is a new evolutionary methodology. We’re going to invent it. We’re going to drive it through and we’re going to reduce the risk.’ The successful ones are almost always done in conjunction with trailblazing customers and driven by demand from the semiconductor side.”
A good question to ask is, “As a chip designer, where do you see the need for revolution?”
We also need to watch the difference between a revolutionary EDA solution and revolutionary chip design. ICC2 might be revolutionary in terms of how it is implemented, but so far none of the feedback I’m seeing on it implies it is a revolution in how it enables the end customer to get chips out the door. (Not trying to pick on SNPS here, just using them as you did… I could fill in the blank with most other “revolutionary” tools I’m going to hear about at DAC.) Can an EDA solution truly be revolutionary if it does not enable revolutionary chip design?