What’s Missing In Low-Power Verification

Consistent methodologies need to be developed, tools need to be power-aware, and all of this has to be connected into design flows—quickly.


By Ed Sperling
Ask two engineers what low-power verification is and you’ll likely get the same checklist: confidence in the overall design, good coverage, a long list of corner cases, and so on. Ask them how to reach that goal, and you’ll almost certainly get different answers—or maybe no answers at all.

Power has emerged as a ubiquitous concern in design, whether it’s for a smart phone, a data center, a smoke detector, or a smart sensor developed for the Internet of Things. But for all the advances in functional verification, power-aware verification remains a moving target, based on power modes, power islands, use models, inaccurate estimates, mixed methodologies, inconsistent tool versions, and an increasing amount of black-box IP that often has gaps in characterization.

“Nearly everyone has a different approach to low-power verification,” said Erich Marschner, product marketing manager at Mentor Graphics. “There are some design teams that have approaches that predate standards. There are some that focus entirely on isolation and not multiple voltages. Others add constraints into the design tool.”

And then there is the issue of which version of a tool verification teams are using at any particular time, which may not be the latest because a project is already in progress. That means some tools are power-aware while others are not, and the methodology has to shift to accommodate that inconsistency.

What, when and where
Just knowing which tools to use, when to use them and at what point in the flow currently is more art than engineering. Most of the companies that have been working with low-power design techniques have developed solutions internally, and frequently for a specific high-value project. At best, those same techniques work for derivative chips, but they are not suited for commercial availability.

Commercial tools are becoming available as EDA companies develop them. These are frequently power-aware versions of existing tools. But integrating them into flows—particularly those populated with tools from multiple vendors—isn’t so easy.

“The number of power domains is increasing, more things go on and off, and the number of power states is going up,” said Bhanu Kapoor, president of Mimasic, a consultancy specializing in low power. “In 2001 and 2002, tools were non-existent in this field. We now have a power format standard (IEEE 1801/UPF) and some tools are getting more mature. But there’s still a lot of learning going on. There are structural solutions, formal solutions, and now more general-purpose checks, and we’re even seeing more simulator support recently.”

But the tools still don’t always do what they’re supposed to do. Kapoor said some formal tools are still being written as if there are no power domains. And while simulation/emulation is more power-aware than in the past, Atrenta CTO Bernard Murphy said understanding power in emulation remains “a bit of a black art.”

“If you estimate power, you accumulate information across every gate,” he said. “The problem is that emulators are not good at sniffing out problems. And power goes counter to that.”

Perhaps even more challenging is that power represents an elbow in an otherwise linear progression in IC design, which is a collection of incremental advances built on other incremental advances. With power, many of the advances are project-specific. While this has always been a challenge at the leading edge of design, where solutions are custom-built based upon testing, good engineering, and a fair number of educated guesses, the success of EDA has been to automate much of that learning and simplify it. As the rest of the industry catches up, those approaches become better tested, tweaked, improved and commercialized.

Power is much more complex and harder to generalize. A design with four power islands has thousands of things that can go wrong, and a design with six power islands has many thousands more on top of that. Add in use cases, different voltages, and, in advanced designs, dynamic voltage and frequency scaling (DVFS), and the number of possible outcomes and possible errors becomes very large.
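The arithmetic behind that claim can be sketched in a few lines. The sketch below is illustrative only (the island counts are taken from the article; the transition count is a simple combinatorial bound, not any vendor's metric): each added island doubles the raw on/off state space, and the candidate transitions between states grow roughly quadratically on top of that.

```python
from itertools import product

def power_states(num_islands):
    """Enumerate every raw on/off combination for a set of power islands."""
    return list(product(("on", "off"), repeat=num_islands))

def transitions(states):
    """Count ordered pairs of distinct states, i.e. candidate transitions
    a verification plan would have to classify as legal or illegal."""
    n = len(states)
    return n * (n - 1)

four = power_states(4)   # 2**4 = 16 raw states
six = power_states(6)    # 2**6 = 64 raw states

print(len(four), transitions(four))  # 16 raw states, 240 transitions
print(len(six), transitions(six))    # 64 raw states, 4032 transitions
```

Multiplying in voltage levels or DVFS operating points multiplies the state count again, which is why adding two islands adds "many thousands more" things that can go wrong.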

This is no longer just a problem at the leading edge, either. The mainstream of the semiconductor industry is now dealing with these issues for the first time as it moves from 65nm to 40nm. For many of these companies, power gating and power islands are brand new.

Team building
What’s also new is the growing recognition that effectively managing power is a team sport. It requires various levels of expertise, from the architecture all the way through to the verification. Moreover, that team has to be embedded in the rest of the design flow to be effective, because changes in one area could affect everything from placement to manufacturability.

“There are three groups in the low power teams doing separate things,” said Lawrence Loh, vice president of engineering at Jasper Design Automation. “There are those working on the power architecture, those doing the design engineering work, and those doing the power verification. Out of all of those, the biggest hole is finding a reliable way to verify the chip because of low power. Low power can break a design structurally or during transitions, and you need to articulate as many of these problems as possible. And the biggest challenge in all of this is functional verification.”

Adding power-awareness to functional verification is where the problems really begin. The question is how many power domains are in a design, and that number can be quite large depending on how much internally developed and commercially purchased IP is included.

“The challenge is that low-power is so complicated that using a stimulus-driven approach to try to break it is almost impossible,” said Loh. “Those techniques are not scalable. The golden functionality is what you have before low power. You need to understand the design even better with low power to take low power into account.”

Help is on the way…eventually
So how big is the problem? Consider that the most complicated designs these days use more than 100 power domains, and that even basic designs are using at least four. This requires far more than just analyzing on and off states. It means understanding how those states affect the functionality of other parts that may turn on and off while another part is on or off. This is pretty evident with a phone call coming in while a person is watching a video or looking at e-mail. But it also applies to a set-top box that’s receiving an update while a person is searching for movies and updating the network settings, or a car infotainment system that may be handling multiple functions at once.

“Even with four power domains, you need to understand what are all the legal and illegal modes, which can mean hundreds or thousands of functional tests,” said Adam Sherer, verification product management director at Cadence. “And there is confusion about what constitutes real verification. Most customers do states testing and modes testing. But you need to look at thousands of lines of power intent to debug, and then you need to test the design editing changes and re-link that back to the design.”
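Sherer's point about legal and illegal modes can be made concrete with a small sketch. The domain names and the mode table below are hypothetical stand-ins for a power-intent specification, not any real design, but they show how even four domains leave a dozen illegal states, each of which needs a check that the design can never reach it.

```python
from itertools import product

# Hypothetical four-domain design: CPU, GPU, modem, always-on controller.
DOMAINS = ("cpu", "gpu", "modem", "aon")

# Legal power modes, as a (hypothetical) power-intent table would list them.
LEGAL_MODES = {
    ("on",  "on",  "on",  "on"),   # full active
    ("on",  "off", "on",  "on"),   # call without graphics
    ("off", "off", "on",  "on"),   # standby with radio awake
    ("off", "off", "off", "on"),   # deep sleep, always-on only
}

def classify(state):
    """Label a raw on/off combination against the power-intent table."""
    return "legal" if state in LEGAL_MODES else "illegal"

all_states = list(product(("on", "off"), repeat=len(DOMAINS)))
illegal = [s for s in all_states if classify(s) == "illegal"]

# 16 raw states, only 4 legal: 12 illegal states to prove unreachable,
# plus tests for every legal-to-legal transition the hardware allows.
print(len(all_states), len(illegal))  # 16 12
```

Real flows capture this table in a power format such as IEEE 1801/UPF rather than ad hoc code, but the bookkeeping problem, deciding which of the raw states and transitions are legal, is the same.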

He said Cadence is focused on making simulation more power-aware so users can at least understand which errors to ignore, but that it will take time to integrate that capability throughout the rest of the design flow. “For now, you need a good low-power plan and you need to learn how to debug low power.”

Michael Sanie, senior director of verification marketing at Synopsys, had a similar perspective: “The tricky part is when you start dealing with voltage islands because during shutdown and bring-up you create a lot of Xs (unknowns) in simulation. Some are overly pessimistic, some are overly optimistic. And typically people ignore them through RTL and then fix them, which is too late.”
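Sanie's Xs can be illustrated with three-valued logic. A powered-down domain drives X (unknown) on its outputs; the sketch below (the signal names are hypothetical) shows how a simple AND gate sometimes masks an X and sometimes propagates it, which is where the optimism-versus-pessimism trade-off comes from.

```python
# Three-valued logic: 0, 1, and X (unknown), as in power-aware simulation.
X = "X"

def and3(a, b):
    """AND with X semantics: a controlling 0 masks an X; otherwise X propagates."""
    if a == 0 or b == 0:
        return 0   # 0 dominates, so any X on the other input is safely masked
    if a == X or b == X:
        return X   # the unknown propagates to the output
    return 1

# A powered-down domain drives X into downstream logic.
powered_down_out = X

print(and3(powered_down_out, 0))  # 0 : X masked, no corruption visible here
print(and3(powered_down_out, 1))  # X : unknown propagates and must be traced
```

A simulator that resolves such an X optimistically (say, treating it as 0 in a branch condition) can hide a real shutdown bug; waiting for gate-level simulation to expose it is the "too late" Sanie describes.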

Sanie noted that one solution is to do voltage-level simulation, which is where Synopsys has focused some of its recent effort along with power coverage and power debug. “The whole verification process is now getting aware of power,” he said. But it will take time before that is fully integrated into the design flow.

Education needed
All of the major tool providers are now offering webinars and training classes for low-power verification. The reason is that the entire semiconductor design industry will have to be educated on this over the next few years.

“There are very few people doing designs older than 90nm, and most are now hitting the problems of low-power design,” said Mentor’s Marschner. “In another couple of years everyone will be at 65nm and below, and they will have to work with low-power verification. That’s part of the reason we’re seeing increased interest in power-aware coverage, which is causing an increase in power-aware verification. The next step is to create a more consistent methodology as different users begin working with this.”

Fusing this all together into an integrated multivendor solution will take even more time, though. The problem is so vast, with so many permutations, that just being able to simplify a piece of the problem is an enormous challenge. And with the growing reliance on third-party IP in SoCs, that adds yet another huge unknown into the mix—particularly IP that is being re-used from one design to the next and which probably didn’t need to be power aware in the past.

“IP customers don’t want to mess with the functionality of the IP, but they do want to make changes for power and performance,” said Nikhil Sharma, vice president of engineering at Calypto. “A lot of these changes are at RTL, and they could include everything from memory banking to block-level clock gating. But not all of this IP had support for power intent. So what happens when customers suddenly realize they have to squeeze every last milliwatt out of it? They have no choice but to use power gating, which has an impact on the design, synthesis and the verification flow.”

He’s clearly not alone in that assessment.

“There are no deterministic ways to deal with this,” said Atrenta’s Murphy. “IP vendors see this as just another complex dimension of variability. The applications developers don’t want to deal with all the complexities. And the chip companies are trying to simplify this as much as possible.”

The result is that no one is dealing with the problem all at once. But work is underway across the EDA tools industry to make the entire tool flow power-aware because all of the major vendors see a big opportunity there. The big questions are when that work will be complete and what standards will be necessary to support it.

For further information, a number of vendors have created Webinars and videos on this subject, some of which have been archived, and most plan to devote future user group sessions to low-power verification:

– Mentor Graphics: https://verificationacademy.com/courses/power-aware-verification

– Cadence: http://www.cadence.com/cadence/events/Pages/event.aspx?eventid=764

– Synopsys: Check the SNUG calendar of events.

– Jasper: http://www.jasper-da.com/resource-library/technology-videos

– Calypto: http://www.calypto.com/en/downloads
