Getting The Balance Right

Weighing the trade-offs between high-level abstraction for power estimation and meaningful, lower-level measurement.


Defining the power architecture for a low-power design means striking a balance between high-level abstraction and the measurements typically made at RTL and below, but today that is easier said than done.

“The balance is that at the high level of abstraction, the design choices you make have a big effect on power, yet your ability to measure them is incomplete until you get much further down the design flow. That’s a balance that people have to strike, and it tends to be a problem,” said Pete Hardee, director of solutions marketing at Cadence.

What works best at the high level of abstraction is the ability to run real system modes and gather real activity vectors, which are becoming increasingly important. It is better to capture that information at the earlier abstraction level, when large amounts of data can be run. Software can be run on a virtual platform or an emulation box, either of which provides activity data. “It’s important to understand the modes because of all the complexity—the different power modes that a system is in relating to all the different system modes that need to be covered,” Hardee said.

The other piece of the equation is characterization: knowing, every time switching activity occurs, what that event means at the device level in terms of power. The problem is that characterization data often isn’t available until later in the design process.

“RTL is a good place where those come together,” Hardee noted. “Above RTL, we’re often guessing at that. If we can get to at least a relative ranking of the various architecture changes you have in mind, then you’re doing really well. And that’s all the above-RTL or system-level guys are trying to do at that stage.”

Fortunately, derivative designs allow you to do a little better than that. If a similar platform has been built before, there is probably good characterization data from the previous design that can be used, formally or informally.

For any level of abstraction, the most important thing is to understand the limitations of the model, said Cary Chin, director of technical marketing for low power solutions at Synopsys. “Models that are used as intended can be quite accurate, but accuracy tends to drop off quickly if the assumptions are not met. For example, a high-level model for computing dynamic power based on transition frequency might be very accurate when a block is in normal operating mode, but in some special power saving mode the assumptions might need to be specially validated or the model adjusted or extended.”
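Chin’s point about model assumptions can be illustrated with a toy sketch. The classic high-level dynamic-power model is P = α·C·V²·f (activity factor, effective capacitance, supply voltage, frequency). All of the numbers below are hypothetical; the sketch only shows how reusing normal-mode assumptions in a power-saving mode skews the estimate:

```python
# Toy illustration with made-up coefficients: a high-level dynamic power
# model P = alpha * C_eff * Vdd^2 * f, calibrated for normal operating mode.
def dynamic_power(alpha, c_eff, vdd, freq):
    """Dynamic power in watts: activity factor x effective capacitance x V^2 x f."""
    return alpha * c_eff * vdd ** 2 * freq

# Normal mode: the model's calibration assumptions hold.
p_normal = dynamic_power(alpha=0.15, c_eff=1e-9, vdd=1.0, freq=500e6)

# Power-saving mode: if the model is reused with the normal-mode supply
# voltage, it overestimates power relative to the actual scaled supply.
p_stale_assumption = dynamic_power(alpha=0.05, c_eff=1e-9, vdd=1.0, freq=100e6)
p_revalidated = dynamic_power(alpha=0.05, c_eff=1e-9, vdd=0.7, freq=100e6)
```

Because power scales with the square of the voltage, the stale-assumption estimate is roughly twice the revalidated one here, which is exactly the kind of drop-off in accuracy Chin describes.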

Exactly what are we measuring?
“When you are measuring power, you are doing two different calculations: in a certain amount of time, with a certain amount of load, how many transistors are flipping on and off. Every one of those transitions consumes power. It’s a very interesting problem to solve because nowadays it’s not just performance. [It’s about] how do you do it at a high level so you can get an architecture before you go down to the details. You don’t want to try it and see,” said Kurt Shuler, director of marketing at Arteris.

Models are the way to go from the high-level, and are typically validated against simulation at the lower level, Synopsys’ Chin said. So a block-level IP power model could be checked against a gate level analysis to verify correctness in multiple modes of operation. Similarly, gate models are validated against circuit simulation, and so on. “At each level, it’s important that the validation be as exhaustive as possible (including some measure of completeness) in order to build confidence at the higher levels of abstraction,” he said.
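The validation loop Chin describes can be sketched in a few lines. This is a hypothetical harness, not any vendor’s flow: a block-level model’s per-mode estimates are checked against gate-level reference numbers, flagging modes where the relative error exceeds a tolerance.

```python
# Hypothetical validation sketch: compare a block-level power model's
# per-mode estimates (mW) against gate-level reference analysis (mW).
def validate_model(model_mw, reference_mw, rel_tol=0.10):
    """Return {mode: relative error} for modes exceeding rel_tol."""
    failures = {}
    for mode, ref in reference_mw.items():
        rel_err = abs(model_mw[mode] - ref) / ref
        if rel_err > rel_tol:
            failures[mode] = rel_err
    return failures

model = {"active": 95.0, "idle": 12.0, "sleep": 0.9}       # model estimates
reference = {"active": 100.0, "idle": 10.0, "sleep": 1.0}  # gate-level results
bad_modes = validate_model(model, reference)  # 'idle' is off by 20%
```

Running the check over every mode of operation, as Chin suggests, is what builds (or undermines) confidence in the higher-level model.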

This data is generally available, but the model accuracy varies as the model is tuned. “Determining an accurate and compact set of parameters for any model is the ultimate goal, but that’s easier said than done. We learn by experience, applying new information to refine successive versions of the model to achieve better accuracy over time. The usual tradeoffs apply—time vs. space vs. accuracy,” Chin observed.

Captured within those models are dynamic and leakage power.

“It used to be that you needed activity to measure the dynamic power and leakage power,” Cadence’s Hardee said. “What’s changed is that now we have leakage increasing in today’s advanced nodes, and that has led to techniques specifically to control leakage like power shutoff. You’ve got to remember that the leakage calculation depends on the system modes and how long the blocks are shut off for, and that has to be factored in.”

That can be done at a number of levels—running either system software on a prototype or system software on a previous version of the chip if it is a derivative. What you are looking for are typical usage scenarios, such as how long you are in each of the identified high-level system modes, and what’s on and what’s off. From that you can create profiles, which in turn can be used to measure dynamic power and to affect leakage power.
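The profile-based accounting described above can be sketched as follows. The modes, fractions, and power numbers are invented for illustration: each system mode contributes its dynamic plus leakage power weighted by the fraction of time spent in it, and a shut-off block contributes no leakage at all, which is why shutoff duration matters.

```python
# Hypothetical sketch: estimate average power from a usage profile
# (fraction of time in each system mode) and per-mode power numbers.
def average_power(profile, mode_power_mw):
    """profile: {mode: time fraction}; mode_power_mw: {mode: (dynamic, leakage)}."""
    assert abs(sum(profile.values()) - 1.0) < 1e-9  # fractions must cover all time
    total = 0.0
    for mode, frac in profile.items():
        dynamic, leakage = mode_power_mw[mode]
        total += frac * (dynamic + leakage)
    return total

profile = {"active": 0.10, "idle": 0.30, "off": 0.60}
mode_power_mw = {
    "active": (120.0, 8.0),
    "idle": (5.0, 8.0),
    "off": (0.0, 0.0),  # power shutoff: no dynamic power, no leakage
}
avg_mw = average_power(profile, mode_power_mw)
```

Extending the time spent in the "off" mode directly cuts the leakage contribution, which is the effect Hardee says has to be factored into the leakage calculation.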

The software perspective
Considering power consumption from the software point of view, Marc Serughetti, director of product marketing for virtual prototyping at Synopsys, noted that open software platforms such as Android have unlocked smart phones to a worldwide community of open source software developers.

“While users clearly benefit, what is the impact on power and battery life?” said Serughetti. “Power efficiency is becoming a key issue for software developers, and an important quality criterion for their software. This impacts all the layers in the software stack. All layers need to be well integrated from a power management perspective, and all functional entities contained in these layers need to cooperate. The big challenge for software engineers is getting insight into how well the system is performing with respect to power.”

Here, virtual prototypes are useful because they provide a means to access such information, as long as that information is available from the virtual prototype model. To be sure, advanced low-power techniques soon will be ubiquitous not just in mobile designs but in all designs—consumer electronics, data centers, and many other areas—and once stable they are expected to be widely deployed.