Why Implementation Matters to System Design and Software

Abstraction needs to be applied consciously, and users need to be aware of its shortcomings.


There has been quite a bit of discussion recently about how well abstraction really works in enabling system design and verification. As I admitted in “Confessions of an ESL-Aholic” a while back, I have revised my view significantly over the years. While I originally thought of abstraction more as a panacea, it turns out that important decisions and analyses, such as those for power and performance, require more accuracy than can feasibly be abstracted.

So is abstraction dead, not useful? No, it absolutely has its place. It just needs to be applied quite consciously, and users must be aware of its shortcomings. In the context of power analysis, it was shown more than a decade ago that annotating power information into virtual platforms can be useful. Specifically, defining states for the different power regions of a chip, estimating or measuring their power consumption per state, and annotating that information “back up” into virtual platforms enables development of the power-management software drivers. This approach was even formalized back in 2011 with ESL reference flows to which EDA vendors contributed (see my article “Disruptive Ripple Effects from Implementation to Systems”). With software development starting that early, users get a view of the impact of the embedded software on power consumption using the abstraction of virtual platforms.
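
As a rough illustration of what such state-based annotation can look like, here is a minimal, self-contained C++ sketch. The class names, power states, and milliwatt figures are all hypothetical placeholders, not any particular virtual-platform API; in a real flow, the per-state numbers would come from estimation or measurement of the implementation.

```cpp
// Minimal sketch of state-based power annotation in a virtual platform.
// All names (PowerDomain, the state table, the mW figures) are illustrative,
// not any vendor's API; per-state numbers would come from estimating or
// measuring the implemented chip.
#include <cstdio>
#include <map>
#include <string>

enum class PState { Off, Retention, Active };

class PowerDomain {
public:
    PowerDomain(std::string name, std::map<PState, double> mw_per_state)
        : name_(std::move(name)), mw_(std::move(mw_per_state)) {}

    // Called by the power-management driver model when it switches states.
    void set_state(PState s, double now_ms) {
        energy_mj_ += mw_.at(state_) * (now_ms - last_ms_) / 1000.0;  // mW * s = mJ
        state_ = s;
        last_ms_ = now_ms;
    }

    double energy_mj(double now_ms) {
        set_state(state_, now_ms);  // flush the interval up to "now"
        return energy_mj_;
    }

    const std::string name_;

private:
    std::map<PState, double> mw_;
    PState state_ = PState::Off;
    double last_ms_ = 0.0, energy_mj_ = 0.0;
};

int main() {
    // Illustrative per-state power for one power region of the chip.
    PowerDomain gpu("gpu", {{PState::Off, 0.1},
                            {PState::Retention, 2.0},
                            {PState::Active, 350.0}});

    // A driver under development drives the state machine over simulated time.
    gpu.set_state(PState::Active, 10.0);     // wake at t = 10 ms
    gpu.set_state(PState::Retention, 14.0);  // idle after a 4 ms burst
    gpu.set_state(PState::Off, 50.0);

    std::printf("%s consumed %.2f mJ\n", gpu.name_.c_str(), gpu.energy_mj(60.0));
}
```

Running software against such a model lets developers see how their state-switching decisions accumulate into energy, which is exactly the relative view the abstraction supports.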

Can the same approach accurately predict what the power consumption actually will be? Nope. Not really. While relative assessment may be possible at this level, absolute power prediction requires much more accurate information from implementation.


That’s where power analysis driven by emulation, at the gate level or register-transfer level (RTL) rather than the more abstract transaction level, comes in, and the recent introduction of our Joules RTL Power Solution excites me. The flow that connects emulation-created toggle information is far from new; in fact, we won an EDA Innovation Award for it back in 2009 for something we called Dynamic Power Analysis (DPA). We have great examples from customers, as reported by TI and most recently Realtek, achieving greater than 90% accuracy compared to the actual chip power consumption. In those cases, the power information was annotated from gate-level implementation data. With the recent introduction of the Joules solution, the prediction of power consumption from RTL has become so accurate that Palladium DPA achieves great accuracy using just RTL, without having to go down to the gate level.
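
To make the arithmetic concrete: toggle-based power estimation essentially evaluates the classic dynamic-power relationship, summed over the nets of the design with per-net toggle rates taken from the emulation run. The sketch below is a simplified illustration with made-up net data, not the Joules or Palladium implementation; a real flow reads capacitances from the implementation database and activity from the emulator.

```cpp
// Sketch of how emulation-created toggle data feeds dynamic-power estimation:
// P_dyn ~ sum over nets of (1/2 * C_i * Vdd^2 * toggle_rate_i).
// Net capacitances and toggle counts are made-up illustrations.
#include <cstdio>
#include <vector>

struct NetActivity {
    double cap_farads;  // effective switched capacitance of the net
    double toggles;     // toggle count reported for the emulation window
};

double dynamic_power_watts(const std::vector<NetActivity>& nets,
                           double vdd_volts, double window_seconds) {
    double p = 0.0;
    for (const auto& n : nets) {
        double toggle_rate = n.toggles / window_seconds;  // toggles per second
        // Each transition dissipates roughly 1/2 * C * Vdd^2.
        p += 0.5 * n.cap_farads * vdd_volts * vdd_volts * toggle_rate;
    }
    return p;
}

int main() {
    std::vector<NetActivity> nets = {
        {5e-15, 2.0e6},   // 5 fF net toggling 2M times in the window
        {20e-15, 0.4e6},  // 20 fF net toggling 0.4M times
    };
    std::printf("P_dyn = %.3f mW\n",
                dynamic_power_watts(nets, 0.9, 1e-3) * 1e3);
}
```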

Methodology remains important. DPA offers different types of toggle information that can be created from emulation runs. This is where abstraction absolutely works. By using more abstract weighted toggle counts, users can reliably identify the window of interest in which peak power occurs. This can be part of a very long emulation run involving software and more. Once the window of interest is found and defined, more accurate toggle information can be created for that window and then connected to the Joules solution for accurate power prediction, annotating information back into the RTL execution. Using this flow, power predictions can be made with much greater accuracy, including the effects of software on the dynamic power in the design; a sliding-window search over the coarse activity trace is sketched below. Given the ability to handle larger designs and longer execution times, the predictions can now even extend into thermal analysis.
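
The window search itself is conceptually simple: scan the long, coarse trace of weighted toggle counts for the window with the highest activity, then hand only that window to detailed analysis. The following sketch illustrates the idea with a toy trace; the frame granularity and activity values are hypothetical.

```cpp
// Sketch of the windowing methodology: find the peak-activity window in a
// long, coarse activity trace (weighted toggle counts per emulation frame),
// then rerun detailed power analysis on just that window.
#include <cstddef>
#include <cstdio>
#include <numeric>
#include <vector>

// Returns the start index of the length-`win` window with the highest
// summed activity, using a sliding-window scan.
size_t peak_window(const std::vector<double>& weighted_toggles, size_t win) {
    double sum = std::accumulate(weighted_toggles.begin(),
                                 weighted_toggles.begin() + win, 0.0);
    double best = sum;
    size_t best_start = 0;
    for (size_t i = win; i < weighted_toggles.size(); ++i) {
        sum += weighted_toggles[i] - weighted_toggles[i - win];  // slide by one frame
        if (sum > best) { best = sum; best_start = i - win + 1; }
    }
    return best_start;
}

int main() {
    // Coarse per-frame activity from a long, software-driven emulation run.
    std::vector<double> trace = {1, 2, 2, 9, 8, 7, 2, 1, 1, 3};
    size_t start = peak_window(trace, 3);
    std::printf("peak window: frames %zu..%zu -> rerun with full toggle data\n",
                start, start + 2);
}
```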

So is abstraction required for efficient system design? Absolutely! Is it a panacea that allows everything to be done at higher levels? No. Users need to be very conscious of when and where to apply it. For power and performance, implementation matters greatly and must be considered in system design decisions!


