The New Frontier: Low-Power Verification And Test

Low-power verification and test strategies have been in development for years, but there’s still a long way to go.


By Ann Steffora Mutschler

By now there’s no argument that verification and test strategies must be considered at the very earliest stages of any design cycle, and when it comes to low-power designs, the advanced techniques and added design complexity make the challenges even more daunting. Low-power verification and test strategies have been in development for a number of years, and they are still evolving.

When it comes to test, there are two general aspects to dealing with low power design, explained Steve Pateras, product marketing director for Mentor Graphics’ silicon test products. “First is testing in the presence of a low-power design. When you are applying test you need to be cognizant of the fact that it is a low-power design, and you need to deal with that. The other one is testing the specific low-power features of the design. Those are two different things.”

First, designs are being created to work under certain power constraints and in certain modes, and tests must adhere to those constraints. “That’s been a very big problem, and it’s probably getting worse with things like 3D because of power and heat dissipation. Something we’ve been doing for a while is running low-power tests in such a way that they are not drawing lots of power,” he said.

Barry Pangrle, solutions architect for low power at Mentor, said the starting point for low-power verification and test has been addressed in a number of ways. “Certainly power formats, and work that’s being done with IEEE 1801, for example, make it possible for a lot of the test automation products to take that information about the power intent and power management that’s on the chip and help create structures so the tests actually work.”

He pointed to a keynote at the recent ISLPED conference given by Uming Ko, a distinguished fellow and chief technical advisor at MediaTek. Ko’s presentation dealt with the cell phone market, where manufacturers don’t want chips sitting on a tester for too long—a maximum of two seconds. “It limits how much testing you can do. It becomes a cost issue, as well. It becomes a tradeoff in trying to figure out how to make sure I’m testing all the modes that I have to.”

Along these lines, Pateras noted an uptick in advanced low-power techniques such as adaptive voltage scaling and dynamic voltage and frequency scaling (DVFS), where the voltage and/or frequency is changed for certain parts of the design based on functional activity in order to reduce power. “Either you reduce the clock frequency or you reduce the voltage, or both together. They scale together. We are seeing more and more of this, so the challenge for test is that normally you test under one condition. But now, of course, with these dynamic techniques, you need to be able to test that all of these are working, and working under all of these conditions.”

To account for this, EDA vendors are adding the ability for tools to include various frequency and voltage corners and to create and apply tests under all of these conditions.
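
As a rough illustration, and not any vendor’s actual flow, the following Python sketch shows why corner-aware test generation multiplies test volume: every pattern that used to run once must now be applied at each DVFS operating point. The pattern names, voltages, and frequencies below are hypothetical.

    from itertools import product

    # Hypothetical DVFS operating points: (voltage in volts, frequency in MHz).
    # In a real flow these would come from the design's power intent and
    # characterization data, not a hard-coded list.
    OPERATING_POINTS = [
        (1.10, 1000),  # high-performance corner
        (0.90, 600),   # nominal corner
        (0.75, 200),   # low-power corner
    ]

    TEST_PATTERNS = ["scan_stuck_at", "scan_transition", "mbist_all_banks"]

    def expand_tests_over_corners(patterns, corners):
        """Pair every test pattern with every DVFS corner it must pass at."""
        return [
            {"pattern": p, "vdd": v, "freq_mhz": f}
            for p, (v, f) in product(patterns, corners)
        ]

    if __name__ == "__main__":
        jobs = expand_tests_over_corners(TEST_PATTERNS, OPERATING_POINTS)
        print(f"{len(TEST_PATTERNS)} patterns x {len(OPERATING_POINTS)} corners "
              f"= {len(jobs)} test applications")

Three patterns against three corners already yields nine test applications, which is exactly the kind of tester-time pressure Ko described.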

With the amount of focus on low-power design, one might think the automated tools can address all of the advanced techniques, but this is not necessarily true. Pete Hardee, low-power design solution marketing director at Cadence, said it’s definitely not fully automated.

“The way that you want to specify that power architecture has to be under the designer’s control,” Hardee said. “It starts off life by specifying what those domains are. You’re using a power format to split the design into power domains. You’re defining what the power domains are, and the design elements that live within those power domains. Then you’re going to specify what is supplying that power domain, so you specify whether or not the supply to that power domain is an always-on or a switched Vdd. You start off specifying at that level. Then there is a whole bunch of refinement that comes into it, and that refinement can be supported by the tools.”
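
The kind of information such a power-format specification captures can be sketched in a few lines of Python. This is only an illustrative model, not IEEE 1801 syntax, and the domain names, supply nets, and instance paths are hypothetical.

    from dataclasses import dataclass, field

    @dataclass
    class Supply:
        """A supply net feeding a power domain."""
        name: str
        always_on: bool  # True for an always-on rail, False for a switched Vdd

    @dataclass
    class PowerDomain:
        """One power domain and the design elements that live inside it."""
        name: str
        supply: Supply
        elements: list = field(default_factory=list)  # module instance paths

    # Hypothetical chip: an always-on control domain plus a switchable CPU domain.
    vdd_ao = Supply("VDD_AO", always_on=True)
    vdd_cpu = Supply("VDD_CPU_SW", always_on=False)

    domains = [
        PowerDomain("PD_AON", vdd_ao, ["u_pmu", "u_wakeup_ctrl"]),
        PowerDomain("PD_CPU", vdd_cpu, ["u_cpu_core0", "u_cpu_core1"]),
    ]

    for d in domains:
        kind = "always-on" if d.supply.always_on else "switched"
        print(f"{d.name}: {kind} supply {d.supply.name}, "
              f"{len(d.elements)} design elements")

Everything downstream (isolation, retention, level shifting) refines this skeleton; that is the refinement Hardee says the tools can support.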

A piece that’s not completely automated—and another set of design decisions that have to be made—involves how many power switches are needed and how they are connected, he said. “It sounds trivial, but typically there are hundreds of power switches needed to switch a sizeable power domain, and they need to be chained in a particular way that gets a really good balance of the power-on and power-off ramp. That’s critical to success.”
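
A back-of-the-envelope Python sketch shows the balance Hardee is describing. The switch count, per-switch in-rush current, and stage delay below are made up, but the tradeoff is real: enabling more switches per chain stage shortens the ramp while raising the peak in-rush current.

    NUM_SWITCHES = 400           # switches needed for a sizeable power domain
    CURRENT_PER_SWITCH_MA = 2.0  # in-rush contribution per switch at turn-on
    STAGE_DELAY_NS = 5.0         # buffer delay between links in the chain

    def ramp_profile(switches_per_stage):
        """Peak in-rush (mA) and total ramp time (ns) for a daisy chain
        that enables `switches_per_stage` switches per chain stage."""
        stages = -(-NUM_SWITCHES // switches_per_stage)  # ceiling division
        peak_ma = switches_per_stage * CURRENT_PER_SWITCH_MA
        ramp_ns = stages * STAGE_DELAY_NS
        return peak_ma, ramp_ns

    for per_stage in (400, 40, 4):
        peak, ramp = ramp_profile(per_stage)
        print(f"{per_stage:3d} switches/stage -> peak {peak:6.1f} mA, "
              f"ramp {ramp:7.1f} ns")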

Further, Hardee noted that UVM (universal verification methodology) is becoming the standard that people are using for testbench automation and the whole verification methodology. “We’re seeing the low-power UVM and low-power UVC (universal verification component). We have a methodology where you can create UVCs, which are components of the testbench. The low-power UVCs drive the design through the mode transitions that you want to verify. Combine that with metric-driven verification and our power-aware simulation, and we have all sorts of low-power assertions that are automatically generated from the power intent file. All of this stuff is coming from the master specification of the power architecture and the power intent file. There’s a lot of automation we’re putting into verifying these complex low-power architectures.”
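
The mode-transition checks Hardee describes can be illustrated with a simplified Python model. The power states and legal-transition table below are hypothetical, and a real flow would express such checks as assertions generated from the power intent file rather than as a script, but the underlying idea is the same: any transition the testbench drives that the power intent does not allow should be flagged.

    # Hypothetical legal power-state transitions derived from power intent.
    LEGAL_TRANSITIONS = {
        "ON":        {"RETENTION", "OFF"},
        "RETENTION": {"ON"},
        "OFF":       {"ON"},
    }

    def check_power_sequence(states):
        """Flag illegal transitions in a driven power-state sequence,
        analogous to an auto-generated low-power assertion firing."""
        return [
            f"illegal transition {prev} -> {curr}"
            for prev, curr in zip(states, states[1:])
            if curr not in LEGAL_TRANSITIONS.get(prev, set())
        ]

    # Sequences a low-power UVC might drive the design through.
    good = ["ON", "RETENTION", "ON", "OFF", "ON"]
    bad = ["ON", "OFF", "RETENTION", "ON"]  # OFF -> RETENTION not allowed here

    print(check_power_sequence(good) or "sequence OK")
    print(check_power_sequence(bad))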

Hardee referred to a panel discussion at DesignCon earlier this year with Fred Jen, engineering director at Qualcomm. “He was emphasizing to the audience the need to keep power architectures as simple as possible. Don’t introduce domains and states just because you might need them later. Keep it as simple as you can. It’s because of all of these reasons I’ve been talking about—the exponential relationship between the number of domains and the verification task.”
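
A quick calculation shows why that advice matters. Assuming, purely for illustration, that each domain supports three power states and the domains can switch independently, the combined state space grows as a power of the domain count:

    STATES_PER_DOMAIN = 3  # e.g., ON / RETENTION / OFF (illustrative)

    for num_domains in (2, 4, 8, 16):
        combined = STATES_PER_DOMAIN ** num_domains
        print(f"{num_domains:2d} domains -> {combined:,} combined power states")

Sixteen such domains already imply more than 43 million combined power states, before counting the transitions between them.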

No small task

Making sure the hardware will work with the software that will run on it is a growing challenge. There are examples galore of failures in smartphones: applications that crash, frequent reboots, power draining before our eyes—the list goes on. Yes, advanced low-power techniques were employed in those designs, but they aren’t enough. The more complex the techniques you’re going to be using in your chip, the less automated they are to verify and test, noted Cary Chin, director of marketing for low-power solutions at Synopsys. “That’s really talking about the design side; I think on the verification and test side it’s even more of a complex ballgame. For example, on the verification side what you really need to do is very, very difficult.”

Still an open issue is how much verification is enough. “We can barely answer that question on the functional verification side because it’s not really practical to completely statically verify all of the things that are going on on the chip. Then you throw on top of that the complexities in the power architecture. The problem with fiddling with the power, on both the design side and the verification side, is that stable power was one of those fundamental assumptions we used to make in design. That’s a huge ripple in the process, and it’s still rippling through tools and design methodologies,” he continued.

Low-power verification is still in its early days. EDA companies are just starting to understand the problems that need to be tackled and what can be effectively automated.

“We tend to figure out there are ways to do it, but it takes time,” he said. “In the near term, what we’re seeing is that it’s pretty difficult to provide guarantees when power is changing dynamically, if you’re doing things like dynamic frequency scaling on your chip. Those things are very difficult to verify across changes in process and variation. All of these things interact with each other and effectively increase the number of dimensions of the verification problem. And as the dimensions multiply, the complexity grows exponentially with regard to actually being able to exhaustively verify everything, so it’s impractical.”

Chin believes engineering teams are taking as many shortcuts as they can, and that there will be some interesting examples in the next few years of areas where it just didn’t work out. As a result, some fancy new features in a chip won’t even end up being used because of the problems they create in certain modes of operation.

“The automation is coming along but it’s just not 100% there yet,” he said. “Just like DFT was in the past, low-power verification and test is a new frontier. Everything from basic modeling to thinking about coverage – all of that stuff is worth looking at again.”


