One Flow To Rule Them All

Flows have always existed within EDA, and several new flows are emerging today. Some are worried that those flows may be more closed than they would like.

The new mantra of shift left within EDA is nothing new; it first made an appearance more than a decade ago. At that time there was a very large divide between logic synthesis and place and route. As wire delays became more important, timing closure became increasingly difficult with a logic synthesis flow that did not take them into account. The tools subsequently became much more tightly coupled, and today they are almost considered part of a unified flow that we call physical synthesis.

Meanwhile, power has been increasing in importance. It now impacts everything in the flow, starting with physical effects through RTL, into the ESL space where architectural tradeoffs are being made, and out into the software world where power policies are often established. This top-to-bottom, cross-cutting concern is leading some to think that flows are going to become closed again, with little ability for smaller companies to play with point tools.

People who have been in the industry for a while may see this as a back-to-the-future moment, because the same thing happened when design frameworks were created. The idea at the time was that people wanted fully integrated tools that all worked off a single database.

“The CAD Framework Initiative had the correct model,” recalls Drew Wingard, chief technology officer at Sonics. “However, business interests pushed the EDA companies in a different direction. The request from users wasn’t to have a single flow. It was for interoperability amongst tools and in a structured framework. EDA vendors chose to implement that in such a way that their tools were massively preferred and it was difficult to integrate anything that was foreign. This meant that few people would do it. That model does not work.”

Will we ever get to the point where the entire flow has to be treated as a unified step? “A unified one-step flow that converges in reasonable time might be possible in a few niches,” says Bryan Bowyer, director of engineering for Calypto, “but most hardware design requires human creativity and innovation at many steps. A flow with multiple tools and clean hand-off of data will scale more easily and give a more predictable flow.”

Bowyer provides an example: “If a design fails to meet timing due to a long route that cannot be fixed by changing placement, what should you do to fix it earlier in the flow where there is no placement and only coarse timing?  A tool can get this right most of the time, but you still need a designer to step in for the most difficult problems.  If you make the tools too ‘smart’ and integrate them very tightly, then the designers end up spending most of their time fighting with the tools.”

Within the physical implementation flow, openness has become an absolute necessity. “The practical reality is that no EDA supplier is the expert in being able to create an entire flow,” says Michael White, director of marketing for Calibre physical verification at Mentor Graphics. “So the industry relies on defined interfaces where data can be exchanged between different steps in the flow.”

It all comes down to standards. Do standards lead or are they created once the industry solution has matured? “Standards have to solve a problem well enough that people can rally behind them, and they have to have enough implementation work behind them so that people can understand how to deploy them,” explains Wingard. “De facto standards are successful because you have an implementation first. SystemC may not have reached its initial goals, but the provision of the SystemC kernel along with it made it a lot more attractive than the other alternatives that were around at the time.”

The problems right now are more technical than business-related. Consider power: not enough people have been interested in the problem for all of the technical solutions that have been created to be adequately tried. The technology is still very young. In addition, the solutions that have been created for the applications processor market, the driving force behind power optimization today, may not be the solutions that the broader market wants.

Part of the problem comes down to abstraction. This topic was discussed in more depth in “Abstraction: Necessary But Evil”. “RTL dealt with zeros and ones, but suddenly with the concentration on power, the concept of voltage becomes important and RTL has no way to deal with that,” says Krishna Balachandran, product management director at Cadence. “So there needed to be a way to describe the different behaviors that could happen at different voltages, just like in an analog world. That was not contemplated in Verilog or VHDL. They could have gone back and added it, but nobody wanted to do that and there is so much legacy surrounding them and so many tools in place that they didn’t want to touch that. So the decision was made to go outside of them and that is why a new language was created.”

But it is not just power that is upsetting the legacy languages. “On the standards front, a lot of stuff in the hardware description languages needs to be abstracted better and old dysfunctional representations replaced with new ones that can handle things like power, variability, dynamic voltage and frequency scaling, clock domain crossing and asynchronous design for digital systems,” says Kevin Cameron, principal at Cameron EDA and a consultant at Silvaco. “Given the large amount of inertia the existing methodologies have, that may require a shift to more open-source solutions and maybe an open-standards approach over the closed IEEE SA entity approach currently employed with SystemVerilog.”

The market appears to have three flows in development or use today. The back-end flow is fairly well defined, the power flow is emerging, and a new verification flow is just being defined.

The back-end flow
The back-end flow has been defined and used for quite a while now and is at least somewhat open. “Because of the size and complexity of designs, companies want and need to use the best tool for any given application,” says White. “This does not imply that integration is easy, but we are all market driven and if our end customers demand the best-in-class tools for their entire flow, then they make everyone work together.”

OpenAccess (OA), originally the Cadence database, was donated to Si2 and is available to everyone. Synopsys also makes LEF/DEF available to the market for integration into their flow. But not all users are happy with these standards. “If OA was truly open-source, that would be a good start, but currently it’s not something non-Cadence companies like to use,” says Cameron.

Part of the reason is that, in the case of OA, Cadence takes on the responsibility of extending it with any foundry requests that come through for new technology advancements, such as multi-patterning. Making this type of change in a standards committee might be more difficult and time-consuming.

The power flow
For power, there are two ways in which integration could be done. You can have people plug in using a vendor’s API, or you can use a particular format to describe the power intent.
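
As a rough illustration of the second option, the sketch below captures the kind of information a power-intent description carries: which power domains exist, what supply states each can be in, and where isolation is required between them. It is plain Python used purely for illustration, not UPF or CPF syntax, and the domain names and voltages are invented.

```python
# Conceptual sketch of what a power-intent description captures: power domains,
# the supply states each can be in, and the isolation required between them.
# Illustrative Python only; not UPF/CPF syntax. Names and voltages are invented.
POWER_INTENT = {
    "domains": {
        "PD_CPU": {"supply": "VDD_CPU", "states": {"on": 0.8, "retention": 0.6, "off": 0.0}},
        "PD_AON": {"supply": "VDD_AON", "states": {"on": 0.8}},
    },
    "isolation": [
        # Clamp PD_CPU outputs to 0 when its supply is off so PD_AON sees defined values.
        {"from": "PD_CPU", "to": "PD_AON", "clamp_value": 0},
    ],
}

def legal_states(intent):
    """List the supply states each domain can be in; downstream tools must honor these."""
    return {name: sorted(cfg["states"]) for name, cfg in intent["domains"].items()}

print(legal_states(POWER_INTENT))
```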

“When one company looks at more pieces of the flow, and one such flow is power, they are able to do a better job,” says Balachandran. “You cannot just look at power at one stage and forget about what is happening downstream, and for the same reason you cannot make assumptions upstream that do not comprehend what will happen downstream. Given this, who has the best opportunity to make sure that you get the best power number from your spec? The answer is someone who can look at it in its totality.”

Balachandran acknowledges that they and others are treading somewhat in the dark. “We are responding to the needs of the day rather than coming up with a grand scheme and creating the ideal constitution for power. We are making it up as we go. Today it is getting so complex that we are now saying that working at that level of abstraction is not good enough and we need to be able to plan things ahead of time. To make it worse, it is not just about the system. It is the chip, package and board, and we have to study the effects of what happens in the package and board in the context of the chip so that it all works together. Thermal effects are also coming up.”

And that is where abstraction rears its ugly head. “The only way to make abstract power modeling be consistent with what comes next is to make abstract power modeling accurate,” says Wingard. “Getting it all from one vendor is not going to help. There is some information that can be shared, but we would do much better by trying to agree on some relative standard of describing things such as activity or a way to export a power model up in the abstraction stack from a prior design.”

Tools are needed for both directions — analysis, design and optimization in a downward direction and estimation in the upward direction. “Some parts of the flow, such as power estimation, may need a unified step where block-level analysis at the transistor level can be leveraged for full-chip-level analysis on that base of accuracy,” says Amit Nanda, vice president for global marketing at Silvaco. “But for an entire design flow, losing the ability to make abstractions will result in increased complexity and risk above and beyond the time penalty, making it an unlikely path.”
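
To make the bottom-up roll-up Nanda describes concrete, here is a toy model: each block is characterized once with an effective switched capacitance and a leakage number, and chip-level power is then estimated by scaling those figures with per-block activity using the familiar α·C·V²·f relationship. The block names and numbers are invented for illustration, and no vendor's methodology is implied.

```python
# Toy hierarchical power roll-up: characterize each block once, then reuse those
# numbers for a chip-level estimate. All names and figures are invented.
from dataclasses import dataclass

@dataclass
class BlockModel:
    name: str
    switched_cap_nf: float  # effective switched capacitance in nF, from block-level analysis
    leakage_mw: float       # static leakage in mW at the characterized corner

def dynamic_power_mw(block, activity, vdd, freq_mhz):
    """Classic alpha * C * V^2 * f estimate, returned in milliwatts."""
    cap_f = block.switched_cap_nf * 1e-9
    return activity * cap_f * vdd ** 2 * (freq_mhz * 1e6) * 1e3

def chip_power_mw(blocks, activity, vdd=0.8, freq_mhz=1000):
    """Sum per-block dynamic power (scaled by toggle activity) plus leakage."""
    total = 0.0
    for b in blocks:
        alpha = activity.get(b.name, 0.0)  # toggle rate from simulation or emulation
        total += dynamic_power_mw(b, alpha, vdd, freq_mhz) + b.leakage_mw
    return total

blocks = [BlockModel("cpu", 2.0, 15.0), BlockModel("gpu", 3.5, 30.0), BlockModel("io", 0.4, 2.0)]
print(round(chip_power_mw(blocks, {"cpu": 0.25, "gpu": 0.10, "io": 0.05}), 1))
```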

While the steps in place today have been driven by the needs of the market, additional requirements are now being added, and attempts are being made to build them on top of the standards that have already been defined. Multiple standards groups within the IEEE (IEEE 1801, IEEE P2415 and IEEE P2416) are tackling pieces of the problem, and a coordinator has been designated to try to ensure they stay aligned with each other.

Are we about to make the mistakes of the past with a power flow? “EDA companies have learned from their past mistakes,” says Balachandran. “There is plenty of room for innovation but any new company has to be cognizant of the fact that there are major EDA companies whose tools are being used for significant parts of the flow and they have to correlate with that.”

The verification flow
Unlike the EDA design and implementation flows, the functional verification flow is fundamentally not about transforming models into different representations through the flow. “One effort that has had some beginnings of traction is the portable stimulus work of Accellera,” says Wingard.

“The key vision is that a single abstract model can be used to generate stimulus, checks, and coverage for every stage in the verification flow,” explains Tom Anderson, vice president of marketing for Breker. “The standard is still under development and the exact specification format is under discussion, but it is clear that a graph-based scenario model of the verification space provides the necessary level of abstraction. From this one model, EDA tools can generate test cases appropriate for verifying ESL and HLS models in virtual platforms, RTL models in simulation, gate-level models in simulation, acceleration, emulation, and FPGA prototypes, and even fabricated silicon in the lab.”
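
A minimal sketch of the graph idea Anderson describes is shown below: nodes are abstract verification actions, edges are legal orderings, and a walk through the graph yields one abstract test case that different back-ends could then render for simulation, emulation or silicon. This is plain Python for illustration only; it is not the Accellera portable stimulus syntax, and the scenario names are invented.

```python
# Minimal sketch of a graph-based scenario model: nodes are abstract actions,
# edges are legal orderings, and one walk through the graph is one abstract test.
# The scenario content is invented; this is not portable stimulus syntax.
import random

SCENARIOS = {
    "power_on":      ["config_dma", "config_uart"],
    "config_dma":    ["dma_transfer"],
    "config_uart":   ["uart_loopback"],
    "dma_transfer":  ["check_results"],
    "uart_loopback": ["check_results"],
    "check_results": [],
}

def generate_test(start="power_on", seed=None):
    """Walk the graph from start to a terminal node, returning an abstract test case."""
    rng = random.Random(seed)
    node, path = start, [start]
    while SCENARIOS[node]:
        node = rng.choice(SCENARIOS[node])
        path.append(node)
    return path

# The same abstract path could be emitted as UVM sequences for RTL simulation
# or as embedded C for an FPGA prototype; here we simply print it.
print(generate_test(seed=1))
```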

The portable stimulus standard holds the promise of defining a single verification flow that will transcend tools, platforms, abstractions, and vendors to reduce engineering effort and close coverage more quickly. And people already are looking at how it can be used in additional ways. “While power modeling is not high on their list of things they are trying to attack, it is in the right vein in that it makes you think about some aspects of measuring activity that are highly related to how we verify stuff, and use cases such as performance analysis and power analysis,” says Wingard. “We need to capture that in a form which could be used in lots of different environments and different design abstractions. That is the better model.”

Balachandran is thinking along the same lines. “You have RTL switching activity, which can be generated from a simulator or emulator, and this is a by-product because they have lots of verification stimulus and they run it and get the activity files. This can be turned into a power profile that can be viewed at RTL. You can see which clocks are consuming the most power, which blocks can be shut off, and use that to make decisions based on the stimulus. At RTL you may have identified which stimulus is good for peak power, but how do you translate that into a gate-level test today? It would be great if portable stimulus provides this, because today people have to run gate-level simulations again and the stimulus has to be rewritten.”
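
The kind of flow Balachandran describes can be sketched in a few lines: take per-cycle toggle counts such as a simulator or emulator might report, build a coarse power profile, and locate the window of stimulus that produces the peak. The toggle trace, the window size, and the use of raw toggle counts as a power proxy are all illustrative assumptions.

```python
# Sketch: turn per-cycle toggle counts (as a simulator/emulator might report them)
# into a coarse power profile and locate the peak-power window in the stimulus.
# The toggle trace and window size are made up for illustration.
def power_profile(toggles_per_cycle, window=4):
    """Sliding-window sum of toggles as a proxy for dynamic power."""
    profile = []
    for i in range(len(toggles_per_cycle) - window + 1):
        profile.append(sum(toggles_per_cycle[i:i + window]))
    return profile

toggles = [120, 90, 400, 780, 760, 300, 150, 80, 60, 500, 520, 110]
profile = power_profile(toggles)
peak_start = max(range(len(profile)), key=lambda i: profile[i])
print(f"peak-power window starts at cycle {peak_start}, score {profile[peak_start]}")
```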

It sounds as if portable stimulus may be creating exactly the type of open flow that people are looking for. “Verification engineers and embedded programmers who today are spending time re-inventing the wheel at every stage of verification can be redeployed to develop revenue-generating apps and other software products,” says Anderson. That sounds like a win for everyone.


