Power Management Verification Requires Holistic Approach

Experts at the table, part 2: Biggest verification challenges; tool bugs; getting to 100%.


Semiconductor Engineering sat down to discuss power management verification issues with Arvind Shanmugavel, senior director, applications engineering at Ansys-Apache; Guillaume Boillet, technical marketing manager at Atrenta; Adam Sherer, verification product management director at Cadence; Anand Iyer, director of product marketing at Calypto; Gabriel Chidolue, verification technologist in the design verification technology division at Mentor Graphics; and Prapanna Tiwari, senior manager, static and formal verification products, verification group at Synopsys. What follows are excerpts of that discussion. Part one can be found here.

SE: What are the biggest verification challenges when it comes to power?

Iyer: Going back to my design background, there are two kinds of verification issues we used to face. One is that designers make mistakes, generally from not having a proper understanding. The second, which we faced 80% of the time, was one tool trying to do something and another tool undoing it, which creates new problems. So there are two verification challenges here: general mistakes by designers, and mistakes by tools.

Sherer: We put our money behind this two years ago. We talked at DVCon about the ability to extract the low-power model from simulation, then use equivalence checking to make sure it's the same as the implementation, because both tools are 'right-ish.' The '-ish' part is that the simulator is operating at an RTL level that was never intended to simulate what we've asked it to do, while the power format implementation tools do it naturally. So we've had misinterpretations, and sometimes, yes, I will admit it was a bug in the simulator and we fixed it, but the equivalence checking helped solve this problem. We are not perfect, but if we don't think about it in this context, the companies that are building complex IP with a long design chain can't do it. You have to have the integration.

Shanmugavel: It's also about productivity. If these technologies don't understand the interplay, what I've noticed is that each design team has to compensate for that missing intelligence with human intelligence. Somebody then has to say, my synthesis is doing this, my checkers are doing this, my simulation is doing that, and I've got to make sure these reports are all synced up. Verification today is all about productivity; we are seeing customers spend so much money on the verification step. Unless the tools alleviate these problems for customers with the same resources, or even fewer, so that you can do all of your verification without worrying about whether the tools are all doing the same thing, and provide proof that they are indeed all thinking the same way, you cannot do it on faith.

Iyer: All I'm saying is that the things you are doing, we need to do well.

SE: Is that really possible? Verification is never going to be 100%, but for power, how close can we get?

Boillet: Going back to the complexity problem and the claim that we cannot have complete verification, I would disagree, because there are different categories of checks. You've got the structural checks, which can be correct by construction and can be 100%, but they don't cover the functional part. For the functional part, that's where simulation comes in, and users come up with crazy scenarios that are impossible for any simulator, or any verification tool, to cover completely. That's why we also offer formal verification where it's applicable. Little by little, with a combination of complete simulation, very high performance, and formal, we may get there, and that's the goal for the industry.

Sherer: I would describe it as comprehensive rather than complete. Comprehensive implies the steps that get us there; complete carries a different implication. From a comprehensive perspective, preparing all the technologies is the appropriate thing. I think there is a layer that doesn't exist today around the power format, which is methodology and approach: general things we perhaps could all agree on that would suggest certain tasks are best suited to certain parts of the technology flow. That then becomes inclusive not just of mainline EDA but of everyone in the community. The standards body hasn't done that, and there are unanswered questions at the mixed-signal and system levels that still have to be addressed.

Chidolue: IP is an interesting one; that is one thing that has been looked at by the current UPF standard, and a lot of work has been done in that space to try to close that gap. I agree that if you looked at the Accellera UPF standard, it was just a flat description. How do you do bottom-up? That was a challenge. You have to go to things like UPF 2.0 to get anything close to a compositional flow.
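For readers unfamiliar with what a compositional flow looks like, here is a minimal, hypothetical UPF 2.0-style sketch. The instance name u_gpu_core, the file name gpu_core.upf, and the supply names are invented for illustration; the point is simply that an IP block's own UPF can be reused at the chip level by loading it into the block's instance scope, rather than rewriting everything as one flat, top-level description.

    # Chip-level UPF (illustrative). The top domain is defined here;
    # the IP block's UPF is reused as-is by scoping it to its instance.
    create_power_domain PD_top -include_scope

    # Top-level supply network (names are placeholders)
    create_supply_port VDD_TOP
    create_supply_net  VDD_TOP
    connect_supply_net VDD_TOP -ports VDD_TOP

    # UPF 2.0 bottom-up reuse: load the block's UPF into its instance scope
    load_upf gpu_core.upf -scope u_gpu_core

With the Accellera-era flat style, the block's domains, supplies, and isolation rules would instead have to be re-specified at the top level, which is what made bottom-up assembly a challenge.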

Shanmugavel: People have been doing low-power chips for decades, but a systematic power management design and ASIC flow that is aware of those customer choices is five to seven years away, 10 at most, now that these formats have started coming up. Think of timing verification when it started: at that point the problem was synthesis, and whether all these different steps were screwing up timing or not. Power is a lot worse because even simulation and verification have to deal with it. I'm hoping that eventually this whole thing will move toward a standardized flow. Right now we're just focusing on standards and how we get all the input descriptions, but hopefully this will move toward standardization of methodology, and we'll reach a stage where everybody knows power.

Iyer: Complexity is going up. Power domains are going up. One generation of the GPU I worked on had 10 power domains; that went to 300, and now there are about 4,000 power domains in a single chip. What this means is that the whole power design and power methodology needs to be elevated, because we need to put another layer of abstraction on top of it. That's what we are trying to address: combining the functional description with the power description and then verifying the whole functionality.
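One common way that extra layer of abstraction shows up in practice is naming a domain's legal operating modes as power states, so verification can reason about a handful of states and their legal combinations rather than thousands of individual rails. The fragment below is a hypothetical UPF 2.x sketch; the domain name PD_core, the element u_core, and the voltage values are invented for illustration and are not from the panel.

    # Illustrative UPF 2.x fragment: abstract a domain's behavior into
    # named power states on its primary supply set.
    create_power_domain PD_core -elements {u_core}

    add_power_state PD_core.primary \
        -state {RUN -supply_expr {power == `{FULL_ON, 0.90}}} \
        -state {RET -supply_expr {power == `{FULL_ON, 0.60}}} \
        -state {OFF -supply_expr {power == `{OFF}} -simstate CORRUPT}

Functional tests and assertions can then be written against RUN, RET, and OFF, which is what makes it feasible to combine the functional description with the power description as the domain count grows.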
