Power Estimation: Early Warning System Or False Alarm?

Experts at the table, part 3: The experts discuss what it will take for software to become more power aware and the progress we can expect over the next year.


Semiconductor Engineering sat down with a large panel of experts to discuss the state of power estimation and to find out if the current levels of accuracy are sufficient to make informed decisions. Panelists included: Leah Schuth, director of technical marketing in the physical design group at ARM; Vic Kulkarni, senior vice president and general manager for the RTL power business at Ansys; John Redmond, associate technical director and low power team lead at Broadcom; Krishna Balachandran, product management director at Cadence; Anand Iyer, director of product marketing at Calypto; Jean-Marie Brunet, product marketing director for emulation at Mentor Graphics; Johannes Stahl, director of product marketing for prototyping at Synopsys; and Shane Stelmach, associate technical director and power and reliability solutions expert at TI. In part one, the panelists discussed the current state of power estimation. In part two, the discussion moved to where power estimation is being used today and where it is needed. What follows are excerpts of that conversation.


SE: Software has a huge impact on power and yet we hear that many power modes do not get used until several versions of the software later. What can we do to get software more involved?

Redmond: There is an IEEE standard (IEEE P2415) that is specifically trying to address that issue. It is trying to bridge the hardware/software gap. Its goal is to provide a language that the software team can easily use.

Kulkarni: In addition to power, we have to start providing energy values. Power over time is very important for handheld devices, and apps are killing it, so they have to be optimized for power on the software side. 2415 is one of the good starting points, but we need a UL-type stamp for the energy efficiency of application software. It allows you to run various scenarios, especially on pieces like the OS, which enables them to be made more efficient. Next it has to extend to the apps that consume the power, because app developers don't care. There should be a power meter that shows how much is being consumed by each app, and hardware hooks are necessary to make this possible.

Schuth: We talked about hardware and software people not knowing each other, but in many cases the software people are external. This means that we have to do more education. The standard is the right way to approach it.

Stahl: I don't think that standards are the driving force in the software world. De facto standards and open source drive things a lot more. Software people have a debugger and a target that they are developing the software for. If the debugger shows them what they want to see, then they use it. If the debugger showed them all the power pins and power states, and presented them nicely, then they would be happy. Typically it is undocumented and they don't see it, so they forget about it, or it comes later.

Brunet: Most developers only care about functionality. There is no concept of power and that is a problem.

Kulkarni: There are various techniques for doing power and thermal simulations, and the output of the tools can help drive power policies.

Brunet: We are going through a cycle where power is becoming important in many different ways. It is like 1994/95, when we started talking about timing-driven placement. Back then, the frontend and backend teams did not talk to each other, and it took five years before they came together.

SE: How much progress can we expect to see in the next year?

Stahl: We will have a standard, IEEE 1801, that will define how we can describe IP blocks at a high level, with their power states, and that will allow us to migrate the spreadsheets into models. It will allow the IP industry to leverage models so that a user can populate their design with all of the information available. This will elevate power analysis to a higher level, and it will be fairly accurate because it will be driven by power models.
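As a rough illustration of the kind of IP-level power-state description that IEEE 1801 (UPF) enables, a minimal sketch might look like the following. The domain, supply, and state names (PD_cpu, VDD_cpu, RUN, OFF) are hypothetical and chosen for illustration, not drawn from the discussion:

```tcl
# Illustrative IEEE 1801 (UPF) fragment: one power domain for a CPU block,
# with a primary supply that has two abstract states. All names are hypothetical.
create_power_domain PD_cpu -elements {u_cpu}

create_supply_port VDD_cpu
create_supply_net  VDD_cpu -domain PD_cpu
create_supply_net  VSS     -domain PD_cpu
set_domain_supply_net PD_cpu -primary_power_net VDD_cpu -primary_ground_net VSS

# Two supply states: nominal operation at 0.9 V, and switched off
add_port_state VDD_cpu -state {RUN 0.90} -state {OFF off}

# A power state table naming the legal combinations of supply states
create_pst cpu_pst -supplies {VDD_cpu}
add_pst_state s_run -pst cpu_pst -state {RUN}
add_pst_state s_off -pst cpu_pst -state {OFF}
```

A library of such descriptions, delivered with each IP block, is what lets a system-level tool enumerate power states across the whole design instead of relying on a spreadsheet.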

Iyer: First, designers will stop complaining about the accuracy.

Brunet: They will still have problems with power.

Iyer: Yes, they will still have problems.

Stelmach: I would like to see entire tool flows where every step of the flow considers power to be a concern. Emulation does not always consider power to be in its domain, and these things cross all domains. We may have IEEE 1801, but we still have a lot of EDA tools that have no idea how to interpret it.

Balachandran: Two things will happen. First, EDA vendors will have to stop taking a point-tool approach. Today there may be three different approaches to power estimation in use, and one of them may produce a number completely different from where you started. There has to be a more unified approach so that power numbers carry through from the beginning to the end, and it must correlate with sign-off, because if it doesn't, then it is pretty useless. The second thing is that tools must drive what-if exploration. To know where a problem is, you have to analyze a piece of code, and that means you can make suggestions as to how to improve it. This is the natural progression for tools.

Kulkarni: From RTL to physical implementation and power grid integrity down to package, we continue to dedicate 170 R&D engineers to the task. Things get connected, and it is not about one isolated step. We have to provide solutions that get tighter and tighter around handoffs and continue to close the loop.

Redmond: Unfortunately, we will still be using spreadsheets. But I think the EDA tools will be more integrated and comprehensive, and that is what we need. Today we have models that do not interact and different levels of abstraction that do not play together. We do not have a model that can take different levels of abstraction and aggregate them up. The new standards will start to make this easier; it will not yet be a solved problem, but today it is a barrier.

Schuth: I think they are all being very optimistic. Spreadsheets are a proven solution with 30% accuracy. Strides will be taken, but I don't think it will be fully adopted. There may be new flows that integrate more things together, but it takes a while to turn the crank and get them proven, distributed, and adopted, and that will probably require new models to work within them. It gets back to education: the more people who are aware, the better things become.

Iyer: We are taking the right steps and moving in the right direction. By this time next year we should be closer to the final goal.
