Power Management Verification Requires Holistic Approach

Experts at the table, part one of three: Putting power management verification into context; power format challenges; shifting from point tools to methodologies.


Semiconductor Engineering sat down to discuss power management verification issues with Arvind Shanmugavel, senior director, applications engineering at Ansys-Apache; Guillaume Boillet, technical marketing manager at Atrenta; Adam Sherer, verification product management director at Cadence; Anand Iyer, director of product marketing at Calypto; Gabriel Chidolue, verification technologist in the design verification technology division at Mentor Graphics; and Prapanna Tiwari, senior manager, static and formal verification products, verification group at Synopsys. What follows are excerpts of that discussion.

SE: What is the best way to put the topic of power management verification into context in today’s design process?

Shanmugavel: Low power is not a static step in the flow; it’s a lens you look through. Everything you are doing today, if there were a low-power lens you could put over it, that’s what any design team has to look at. And that’s easier said than done.

Iyer: I agree with you in that sense, but I also disagree with you because even though low power needs to be looked at that way, people never look at it that way.

Boillet: That’s where we have something to do to help the designers. I agree that we have to look at the problems through different lenses, now that we see power has an effect on DFT, on CDC, and on power optimization. So it’s up to us to provide the infrastructure, with RTL instrumentation or even at ESL, so that designers can look at the CDC problem with the power element and the DFT problem with the power element.

Iyer: I would go one more step to say it goes through the flow.

Shanmugavel: We need to look at power as a holistic approach. It’s not just power estimation, understanding how to estimate and reduce power, but also the impact of power on all the downstream flows when we are designing our chips. It goes all the way from high-level power estimation at the system level: when you are putting together the complete product, you need to make sure every chip or IC is adhering to some type of power target, and as you put in those different power targets, it impacts the thermal behavior and how you verify it; it impacts electromigration; and it impacts your dynamic voltage drop, which determines the performance of your product. So it is a holistic approach. It is said that a mobile product cannot carry more than about 3 watts; beyond that, the heat dissipation becomes a problem and usability suffers. So everything is powered down right now. We are looking at power in terms of a design-for-power methodology, and not just designing with power in mind. It is a complete paradigm shift in power design.

Sherer: The challenge, though, is the pragmatic reality. We do profess a top-down approach, but we have IP with long lifespans that has to be reused. We have existing customers, we have design teams spread worldwide, and we need to start working with the content that exists and move customers from what they have in hand into these top-down flows. We can prepare on the tool side, and we have to. But we have to give engineers a path. Intuitively they understand the top-down approach; we have to show them a bridge.

Chidolue: That’s where it helps to be able to bring in a separate way of describing a power architecture, one you can then use to look at those high-level views and your IP, as you just talked about, and to start early.

SE: How do the power formats fit into this?

Iyer: That is another thing: the power format has evolved, but it has fundamental issues even today. In my experience as a designer, we used UPF, but whenever we changed our design we had to add things to the UPF just to verify it, nothing to do with whether the design was correct or not, just to make sure the verification was correct. That slowed down the process; we used in-house tools to add this into the design.
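For context, here is a minimal, hypothetical sketch of what UPF (IEEE 1801) power intent looks like for a single switchable domain. The domain, instance, net, and signal names (PD_core, u_core, VDD_core, sleep_n, iso_en) are invented for illustration, and the commands shown are only a subset; a production file typically carries much more, including the verification-only additions Iyer describes.

    # Minimal, illustrative UPF (IEEE 1801) power intent; all names are hypothetical.
    create_power_domain PD_top  -include_scope
    create_power_domain PD_core -elements {u_core}

    # Top-level supplies plus a switched rail for the core domain.
    create_supply_port VDD
    create_supply_port VSS
    create_supply_net  VDD      -domain PD_top
    create_supply_net  VSS      -domain PD_top
    create_supply_net  VSS      -domain PD_core -reuse
    create_supply_net  VDD_core -domain PD_core
    connect_supply_net VDD -ports {VDD}
    connect_supply_net VSS -ports {VSS}
    set_domain_supply_net PD_top  -primary_power_net VDD      -primary_ground_net VSS
    set_domain_supply_net PD_core -primary_power_net VDD_core -primary_ground_net VSS

    # Power switch: VDD_core is gated off VDD under control of sleep_n.
    create_power_switch sw_core -domain PD_core \
        -input_supply_port  {vin  VDD} \
        -output_supply_port {vout VDD_core} \
        -control_port       {ctrl sleep_n} \
        -on_state           {normal vin {ctrl}}

    # Clamp PD_core outputs to 0 while the domain is powered down.
    set_isolation iso_core -domain PD_core -isolation_power_net VDD \
        -clamp_value 0 -applies_to outputs
    set_isolation_control iso_core -domain PD_core \
        -isolation_signal iso_en -isolation_sense high -location parent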

Chidolue: I understand what you are talking about. Probably one of the reasons for that is that a lot of that work was done at the gate level. With gate-level power optimization, as you just described, everybody within a given company knew how to add those gate-level artifacts based on their own libraries and their own flows, but once you got this separate, new thing called UPF, I don’t think they knew how to work with that level of abstraction.

Sherer: We provide customers, as part of our standard documentation, a holistic, end-to-end view of how to work with the power format. That does not mean every technology in that flow takes the same…

Iyer: I would still challenge you: even if you provide that holistic view, the problem remains. With power you are talking about a physical problem that you are trying to translate.

Shanmugavel: I somewhat disagree with that, because it’s not just a physical problem that you have to deal with on the physical side. UPF and CPF are power formats; they represent power intent. But with CPF and UPF we can also estimate power early on, in the architecture phase, and I think that is very important. As people get power estimation at the RTL stage that comes very close to the gate level, we are enabling architects to design lower-power products with functionality in mind, as opposed to relying on gate-level reductions.

Tiwari: The central problem is that the format gives us one piece of the intent and the RTL gives us another; it’s just one more data input for us. The central challenge that low power brings into play is that now every piece of the flow has to know what is going on. I have to know, at the simulation stage, what my synthesis tool is going to do with that power format file, and I’d better be doing the exact same thing or my golden reference at RTL is bogus. Synthesis had better be doing the exact same thing as formal and P&R; everyone has to have the exact same understanding. To that end, the whole power problem is not about one tool, one format, or one spot in the flow; it is about the whole platform. You have to look at your entire design flow and ask, can I do a low-power-aware formal check that is also aware of CDC? We are moving more and more into a realm where we cannot be best-in-class point tools sitting there saying, ‘I just do synthesis,’ or ‘I just do simulation; you figure out what you want.’

SE: Are there limitations to the power formats?

Boillet: I think there are limitations in the power intent formats right now. For instance, DFT concepts are not really taken care of. I’m sure we’re going to get there, but in the meantime we need to provide a solution and ensure people can still do their designs and overcome their issues when power and DFT come together. There are things that can be done to simplify the problem. For instance, having the power hierarchy match the logical hierarchy exactly helps the implementation tools avoid issues, so RTL restructuring is something that can help until the industry comes up with a standard that covers all aspects.
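As a hypothetical illustration of that point, the fragment below sketches the difference between power domains that line up with the logical hierarchy and one that does not; all instance and domain names are invented.

    # Well-behaved case: each power domain maps onto exactly one logical subtree,
    # so implementation and verification tools see the same boundaries.
    create_power_domain PD_cpu -elements {u_cpu}
    create_power_domain PD_dsp -elements {u_dsp}

    # Harder case: a domain stitched together from leaf instances scattered across
    # the hierarchy. RTL restructuring would regroup these under a single instance
    # before the power intent is written.
    # create_power_domain PD_mixed -elements {u_cpu/u_fpu u_dsp/u_mac u_io/u_retimer}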

Shanmugavel: When you are designing for power, the moment you have RTL being designed, how do you take that power format and translate it to the next level, which is physical design? And once you go past implementation, what about verification? We need to fill in all those pieces. For example, when you are designing low-power RTL, once you have captured your overall power it can be represented in something called an RTL power model. RTL power models can be brought into the physical world, where you can actually perform power simulations, dynamic voltage drop simulations, or lifetime simulations on the physical side. Once you’ve simulated your IC, you can generate a chip power model that can be used for system-level simulation. So you start with architecture, you go into the RTL, from the RTL you deliver a power model, and from the chip you deliver a chip power model to the system. You need a complete ecosystem, all the way from architecture to your product, and not just architecture to GDS.

Iyer: I will take a contrarian approach, and one closer to the designer’s. From my previous experience, we used several of these models and they never worked. As a designer, what I want is a tool that can actually do correct-by-construction design. How can they ensure that, instead of estimating and then facing the verification challenge? Today you do your simulation, and after that you don’t even bother checking whether the functionality is correct. Along the way you probably do some kind of logic equivalence checking, but all the way to the end you don’t really worry about the functionality.

Shanmugavel: Power has different levels of abstraction. At the RTL stage it is what we call architecture-aware power estimation. Once you go into the micro-architecture stage it becomes cycle-accurate, and once you go into the gate stage (synthesis level) it becomes gate-accurate, which is the most accurate level of power estimation.

Sherer: I have to agree with Arvind, because part of the challenge is that the people writing the power format are the designers, and they are writing it once without any concept of the rest of the flow. We’ve had to take that interpretation and make the best of it in tools that were never designed to handle it. Simulation is one of them, and system level is even further away.

Chidolue: At Mentor, what we’ve actually been doing is looking at ways of capturing some of that information at the gate level and feeding it back up to your system models, because you’re right: what we’ve seen in the past is that those high-level models are not accurate enough to let you make the kinds of tradeoffs we are talking about at the system level. The only way you can do that is to get that information from somewhere.

Iyer: Regarding exploration, first of all, in terms of power, things need to be broken down into smaller pieces and explored independently of each other… Can you explore all the power gating options up front, before even writing a power format and going through the flow? That’s a critical piece of this puzzle.

Tiwari: What we are realizing is that we cannot do simulation without knowing what static checking is going to do, what formal is going to do, what CDC is going to do. Even with linting, we are realizing that on the dirtiest of the dirtiest RTL, I can do a lint pass that actually makes life a lot easier for the person who is going to write the first testbench. Do you see this kind of interplay?

Chidolue: I would agree with that. We are beginning to see similar things, at least from the functional verification side. It’s not enough to just consider simulation; you’ve got to consider CDC, and you’ve got to consider emulation to do the system level…

Iyer: You do what is best for you. If we take that approach, and don’t try to predict what the other guy is going to do, things are probably going to be OK.

Sherer: Customers that have relatively simple power domain structures, with small numbers of domains, are OK. But companies that have brought together 50 or 150 power domains, where the IP is unknown, cannot possibly work that way. They can’t. They’ve had to put lint checks in place.


