New tools are beginning to hit the market for power analysis and planning; more work is still needed.
By Ann Steffora Mutschler
With design complexity always on the rise and an increasing amount of embedded software in today’s designs, engineering teams need to be concerned with power consumption from the initial architectural design onward. The only way to do that is to model power consumption at the transaction level.
While power is typically estimated after RTL synthesis, the better approach is to model it earlier and add those models to a library, helping the design team quickly determine the best balance between performance and power consumption. As it stands, the lack of structured high-level modeling capabilities hinders power analysis during early design phases. Power modeling also enables development of power-related software drivers, as well as efforts to optimize the architecture and power domains.
Power models allow users to see how power consumption might vary based on whether they are running, for example, an MPEG decoder, or just reading email or taking photographs on their iPhone, said Sequence Design CTO Jerry Frenkil.
“They can see what the power consumption of different applications would be, and once they see that they can make some decisions about what they want to do,” Frenkil said. “Knowledge is power. This gives them the knowledge of what to go tackle. This is an area where there hasn’t been a lot of practical work done yet because it is so difficult to do. These models are really an [enabling technology]. They enable design teams to get an understanding of their power characteristics even earlier in the design flow. It enables them to see power characteristics under different application loads. It enables them to see the results of different design decisions before they begin RTL coding.”
Similarly, Frank Schirrmeister, director of product marketing for system-level solutions at Synopsys, views power modeling at the system level as a technology that allows designers to instrument transaction-level models with power information that depends on their different states. He pointed to the power information users have modeled in Synopsys’ virtual platforms as representative power parameters (‘kernels’), which are then used in power equations to calculate power. “We are flexible to support specific component characteristics like different states and such, which are then interactively changeable by users,” he said.
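To make the idea concrete, here is a minimal sketch of a state-dependent power equation of the general sort Schirrmeister describes, written in C++. Every name and number below is an illustrative assumption based on the classic first-order CMOS power model, not Synopsys’ actual parameters:

```cpp
// A first-order, state-dependent power equation of the kind a
// transaction-level model can be instrumented with. All names and
// numbers are illustrative assumptions, not Synopsys' actual kernels.
#include <cstdio>

struct PowerParams {
    double leakage_mw;  // state-independent leakage power, mW
    double alpha;       // state-dependent switching activity factor
    double c_eff_f;     // effective switched capacitance, farads
};

// Classic CMOS first-order model: P = P_leak + alpha * C * Vdd^2 * f
double power_mw(const PowerParams& p, double vdd_v, double freq_hz) {
    double p_dyn_w = p.alpha * p.c_eff_f * vdd_v * vdd_v * freq_hz;
    return p.leakage_mw + p_dyn_w * 1e3;  // convert W to mW
}

int main() {
    PowerParams idle   {2.0, 0.02, 1.2e-9};  // assumed idle state
    PowerParams decode {2.0, 0.35, 1.2e-9};  // assumed MPEG-decode state
    std::printf("idle:   %.1f mW\n", power_mw(idle,   1.0, 400e6));  // 11.6
    std::printf("decode: %.1f mW\n", power_mw(decode, 1.0, 400e6));  // 170.0
    return 0;
}
```

Swapping the parameter set when the component changes state is what lets a virtual platform report different power numbers under different application loads.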
Power consumption numbers typically are delivered by semiconductor companies based on budget planning, estimations and measurements, Schirrmeister noted.
Power modeling mostly ad hoc…until now
In terms of power modeling usage today, Schirrmeister said interest in and use of power modeling are increasing, but just as with transaction-level modeling, it tends to be used mostly in bigger projects with derivatives, i.e., bigger product families.
Meanwhile, in terms of how power modeling is done today, Frenkil said, “The simplest way to say it is that it is all ad hoc. I would venture to say that if you talk to half a dozen different groups that have done high-level power modeling, you’d find half a dozen different approaches. Partly it’s because it’s an immature technological area where it’s really a vanguard item right now, and partly it’s because it’s not well understood. For example, one can create a power model for any circuit or any piece of IP rather simply, but whether that power model is any good or not is another question altogether because there are multiple perspectives on goodness. One aspect of goodness is whether it is accurate. If it is accurate, to what degree and under what conditions? Those conditions can be operating conditions or they can be operating modes. One of the problems with models that people do use today is that they work pretty well for exactly what they created them to do, but if someone comes in and uses the model in an ever-so-slightly different way, the numbers are way off. And that’s because the model wasn’t created for that.”
Contrast that with what the industry does with standard cells and Liberty models, which are considered rock solid. “Any way you use them, they are going to give you good results, and that’s a big reason why they are so widely used,” he said. “Designers know they can depend on them. In our case, looking at these high-level models, we know that if we create a model for a particular usage, everyone may know it works well for that, but it is questionable how it works for other things, so it’s not going to get reused. And if it is not going to get reused, then from a management perspective it’s very hard to put much effort into creating the model in the first place. So we see that one of the requirements for these high-level models is that they have to be sufficiently robust so as to be used in a variety of ways and produce acceptable results in each of those ways.”
He added there is no standard way of creating power models or even a list of best practices.
Synopsys has a different opinion. “Naturally, we at Synopsys in the system-level team focus on instrumentation of SystemC transaction-level virtual platforms in Innovator,” said Schirrmeister. “The support is part of the Innovator infrastructure, which allows engineers to define, set and manage power parameters as part of the models integrated in an Innovator block diagram. We gather data from lower level tools like Power Compiler, providing more accurate data once the implementation has progressed beyond the system-level. We see more and more support to derive power related information from hardware based prototypes and high-level synthesis.”
Innovator, then, is an infrastructure in which the designer can instrument models. “We have the transaction-level SystemC models there already and you basically annotate just like you would annotate timing and performance information. You are just annotating power information,” he explained.
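In plain SystemC/TLM-2.0 terms, that kind of annotation can be sketched as follows. The module and member names here are hypothetical, not Innovator’s actual API; the point is only that energy gets accumulated per transaction in the same place the timing annotation already happens:

```cpp
// Sketch of annotating a TLM-2.0 model with power the same way timing is
// annotated. Module and member names are illustrative, not Innovator's API.
#include <iostream>
#include <systemc>
#include <tlm>
#include <tlm_utils/simple_initiator_socket.h>
#include <tlm_utils/simple_target_socket.h>

struct AnnotatedMemory : sc_core::sc_module {
    tlm_utils::simple_target_socket<AnnotatedMemory> socket;
    double energy_pj = 0.0;            // accumulated energy, picojoules
    const double read_pj  = 12.0;      // assumed per-access energy costs
    const double write_pj = 18.0;

    SC_CTOR(AnnotatedMemory) : socket("socket") {
        socket.register_b_transport(this, &AnnotatedMemory::b_transport);
    }

    void b_transport(tlm::tlm_generic_payload& trans, sc_core::sc_time& delay) {
        delay += sc_core::sc_time(10, sc_core::SC_NS);   // timing annotation
        energy_pj += (trans.get_command() == tlm::TLM_WRITE_COMMAND)
                         ? write_pj : read_pj;           // power annotation
        trans.set_response_status(tlm::TLM_OK_RESPONSE);
    }
};

struct Traffic : sc_core::sc_module {
    tlm_utils::simple_initiator_socket<Traffic> socket;
    SC_CTOR(Traffic) : socket("socket") { SC_THREAD(run); }
    void run() {
        tlm::tlm_generic_payload trans;
        unsigned char buf[4] = {0};
        sc_core::sc_time delay = sc_core::SC_ZERO_TIME;
        trans.set_data_ptr(buf);
        trans.set_data_length(4);
        trans.set_address(0);
        for (int i = 0; i < 100; ++i) {      // 100 dummy writes
            trans.set_command(tlm::TLM_WRITE_COMMAND);
            socket->b_transport(trans, delay);
        }
    }
};

int sc_main(int, char*[]) {
    Traffic initiator("initiator");
    AnnotatedMemory mem("mem");
    initiator.socket.bind(mem.socket);
    sc_core::sc_start();
    std::cout << "memory energy: " << mem.energy_pj << " pJ\n";  // 1800 pJ
    return 0;
}
```

A commercial platform layers state tracking, parameter management and reporting on top of this, but the underlying mechanics (attributes updated as transactions flow through the model) are the same as for timing.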
Another tool on the market is ChipVision’s PowerOpt, which takes a high-level model (essentially a C model), performs high-level synthesis on it and produces a power estimate, i.e., what the implementation will cost from a power-budget perspective given the target technology the design is being implemented in.
Then, one level down in the RTL domain, Sequence’s tools perform RTL power analysis and optimization.
Notably, Mentor Graphics’ Vista platform, announced at this year’s Design Automation Conference, could change the landscape in this area. Vista promises comprehensive architecture design and prototyping, allowing users to model, analyze and optimize power at the transaction level of abstraction.
The Vista platform allows engineers to model power at the transaction level using advanced power-estimation policies long before an implementation becomes available, or to annotate more accurate power behavior based on attributes of the target IP blocks’ process technology.
Mentor also said Vista couples a “layered” behavioral, timing and power modeling methodology with the SystemC Transaction-Level Modeling standard (TLM-2.0) supported by the Open SystemC Initiative (OSCI), offering a design platform that allows chip designers and system architects to make viable decisions on hardware/software partitioning and architecture.
With the platform’s advanced debug and analysis toolset, Mentor said, users can verify system-wide functionality, analyze and optimize systems under realistic traffic loads, and adjust system resources for optimal performance and power. Users also can explore various voltage-scaling and shutdown techniques and apply the most efficient power management strategies.
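As a back-of-the-envelope illustration of the kind of tradeoff such exploration quantifies, the sketch below compares an always-on policy against power-gating during idle time. All numbers are invented, and state-transition (wake-up) costs are ignored for brevity, though a real analysis would include them:

```cpp
// Back-of-the-envelope comparison of two power-management strategies.
// All numbers are invented for illustration; wake-up energy is ignored.
#include <cstdio>

int main() {
    const double p_active_mw = 150.0;  // assumed active power
    const double p_idle_mw   = 30.0;   // idle, clocks still running
    const double p_gated_mw  = 1.0;    // power-gated, retention only
    const double t_active_s = 0.2, t_idle_s = 0.8;  // 1 s of workload

    // mW * s = mJ of energy over the one-second window.
    double e_no_gating_mj = p_active_mw * t_active_s + p_idle_mw  * t_idle_s;
    double e_gating_mj    = p_active_mw * t_active_s + p_gated_mw * t_idle_s;

    std::printf("no gating:   %.1f mJ\n", e_no_gating_mj);  // 54.0
    std::printf("with gating: %.1f mJ\n", e_gating_mj);     // 30.8
    return 0;
}
```

A transaction-level platform performs the same accounting, but driven by simulated traffic rather than fixed duty-cycle assumptions.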
As a result, designers can ensure a cost-effective architecture with a suitable bandwidth that can carry the target application. Given the abstraction and fast simulation of the hardware representation, a model of the system can then be used as a virtual platform for early software development, analysis and validation, including the ability to profile power while executing application software, the company said.
Looking ahead, Guy Moshe, general manager of ESL/HDL design creation at Mentor, believes that in the near future IP providers will deliver their IP with power models included. “Today, it is very rare to find any power models in the market. The only things the IP providers deliver with [their IP] are some manual spreadsheets that describe, ‘If you are using this mode and this mode in your memory controller, this will be the power consumption.’ Everything is static. In the future, you will be able to do everything dynamically.”
Moshe estimates the industry is about a year from seeing this become reality. “We are not far away from that. However, I think the current economic climate actually accelerates this path because, first, they don’t have any more time to release intermediate products just to be fast. They need to be accurate; they don’t have the bandwidth to do many turnovers. They have to be first to market and successful.”