Experts At The Table: Low-Power Verification

Second of three parts: Time-to-market pressures; hundreds of domains; IP reuse and power; the need for a holistic approach to deal with power; thermal issues; the growing impact of power management techniques in mainstream designs.


By Ed Sperling
Low-Power/High-Performance Engineering sat down to discuss power format changes with Sushma Hoonavera-Prasad, design engineer in Broadcom’s mobile platform group; John Biggs, consultant engineer for R&D and co-founder of ARM; Erich Marschner, product marketing manager at Mentor Graphics; Qi Wang, technical marketing group director at Cadence; and Jeffrey Lee, corporate application engineer at Synopsys. What follows are excerpts of that conversation.

LPHP: What power-related issues are cropping up at the most advanced process nodes?
Marschner: As we go to denser geometries, it’s not so much a lack of design know-how. It’s more the turnaround time that affects our customers. Each new chip has to come out on a tighter schedule. It has to be faster and smaller, and with UPF and the power formats you have to be able to bring together a hodgepodge of scripts into one easily identifiable flow, so you can go from simulation through synthesis to place and route, despite the learning curve of getting in there in the first place. And once you know how to describe the power intent you want to provide, it helps you get to the next project and the next node faster. You know how you should do it. With IEEE 1801-2013 we’re facilitating your design with a common language.
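[To illustrate the kind of power intent such a common language captures, here is a minimal UPF fragment. The domain, supply, and control-signal names are hypothetical, not drawn from the discussion.]

```tcl
# Define a switchable power domain for a hypothetical CPU block
create_power_domain PD_CPU -elements {u_cpu}
create_supply_port VDD_CPU
create_supply_net  VDD_CPU -domain PD_CPU
connect_supply_net VDD_CPU -ports VDD_CPU

# Clamp the domain's outputs when it is powered down
set_isolation iso_cpu -domain PD_CPU \
    -applies_to outputs -clamp_value 0 \
    -isolation_signal iso_en -isolation_sense high

# Retain architectural state across power-down
set_retention ret_cpu -domain PD_CPU \
    -save_signal {save high} -restore_signal {restore low}
```

Because this intent lives in one file rather than in tool-specific scripts, the same description can drive simulation, synthesis, and place and route.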
Honnavara-Prasad: The first problem that people encountered was the growth in the number of power domains. What that means is when they develop a system on chip they see an explosion in the number of power states.

LPHP: How many?
Honnavara-Prasad: At least 20 or 30, but sometimes hundreds. That isn’t uncommon. But it’s not just the number of power domains. It’s the granularity and control of those power domains. More and more of them are going to be under software control. It’s not just a hardware problem. It’s also a software problem. It has to be dealt with at a high level to make sure you are taking advantage of all the software features and implementing them in hardware. It’s a big challenge for verification, because you are never confident whether you’ve verified the full chip. A common use case may be MP3 playback, which runs for several seconds. A lot of this kind of complexity we didn’t see early on. It’s important to come up with a very robust verification methodology. We don’t have much time to experiment and improve it, so we take what we have and put it together. For that we need a very good IP methodology, so that most of the IP can be re-used from one chip to the next. Think of it as a platform design rather than a new chip every time. A lot of IP reuse has to be considered. Also, you have to streamline from implementation to verification.
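[The state explosion described above can be seen in a UPF power state table: each added switchable domain multiplies the legal supply combinations that verification must cover. This sketch uses hypothetical supplies and mode names.]

```tcl
# Hypothetical power state table: even three supplies yield
# several legal combinations, each of which must be verified
create_pst soc_pst -supplies {VDD_CPU VDD_GPU VDD_AON}
add_pst_state run_all  -pst soc_pst -state {ON  ON  ON}
add_pst_state mp3_play -pst soc_pst -state {OFF ON  ON}
add_pst_state sleep    -pst soc_pst -state {OFF OFF ON}
```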

LPHP: How much of a problem is power for IP vendors?
Biggs: There are two parts to that. One is the inherent power in IP, and we need to communicate the constraints for that to our licensees who are going to configure and implement it. We also do hard macros, and in that case it’s our problem with the implementation and the fine detail of how we do the implementation and verification. The biggest thing we’re wrestling with is system-level power modeling. How do we present an accurate power model to our licensees when they’re doing system-level design so they can make reasonable tradeoffs with the hardware and the software at the same time?

LPHP: We’re starting to get multiple power domains, multiple voltages, state retention, power gating and isolation, and now we’re getting into DVFS and near-threshold computing. Power becomes a variable. What does that mean to the verification side?
Wang: For anyone who does low power, they have very good job security. People will continue to develop new low-power circuit techniques to reduce the power. But in the end, there is a verification problem. As designs become more complex with more low-power features, verification becomes a huge issue. It’s not just functional verification, either. It’s also electrical verification. Electrical failures can cause functional failures. That’s a unique challenge of low-power design. People need to use a holistic approach that includes hardware, software and formal techniques to get enough confidence in their power verification strategy.
Biggs: Another dimension is thermal, and as we go down to smaller and smaller geometries and finFETs, the power density is going up. Thermal is a consideration. No longer do you just need to model your power modes. You also need a bit of history about how long you were in the previous mode and how long can you be in this mode before you get to thermal runaway. That’s a whole new level of complexity.
Lee: Those are all issues affecting the leading edge of what we do with low-power design. But what bothers most people developing low-power designs today are not issues at the leading edge. There are just a few companies pushing that envelope. The biggest problem is simply coming up to speed with adding power management to existing chips. That involves adopting techniques that are newer than the flows people have already used. It involves some risk, because it means changing the way things are done and depending on flows people are not familiar with. That’s the largest problem most people are dealing with. It’s the adoption of power-management strategies at all, and moving toward power management in relatively mature process technologies, as opposed to pushing the envelope.

LPHP: So what happens if you do nothing?
Lee: You at least narrow your market, and you may go out of business.

LPHP: Power is also one of those issues that starts at the beginning and goes to the end. Why is the power format so important to understand up front?
Marschner: Verification really should start up front, because if you wait until the end of the design process, it’s a lot slower if you’re dealing with gate-level simulation, for example. And it’s also much more difficult to fix if you find a significant problem. One might claim that the only problems you can find and debug easily are very local problems. And you can’t see the big problems that are going to affect performance or power management or power consumption of your chip. The whole idea of UPF is to move the power aspects of the design flow earlier into the flow so you can see what’s happening and think about what’s happening earlier in the design process. That way you can make the right decisions about power management or power partitioning or power control as early as possible in the flow. Today, we’ve moved it as far as the RTL stage. In the future, we’re going to want to move it even further up into the system level. But the whole idea is to think about power right from the beginning so you can verify what you’re doing, so you can plan for the implementation and assess what the impact will be in terms of power consumption.


