Avoiding Chip Melt

Assertions are becoming far more important in preventing thermal problems, even if they do require a lot of code. But are they accurate enough?

By Ann Steffora Mutschler
Assertions. Just the term conjures images of writing boring lines of code to feed into a simulator. But for engineering teams working at the 40nm node, the pain of making sure their verification is complete and accurate is real—and so is the potential for literally melting silicon if something goes wrong. With this in mind, ‘boring’ goes out the window and gets replaced with ‘necessary.’

Assertions have a long history in verification, noted Krishna Balachandran, director of low power verification marketing for Synopsys. “They emerged and became popular about 10 years ago because there was a need to improve the verification productivity. You had tools dealing with the back-end flow that were constantly beating on performance, which is the same with simulation. We try to improve the performance to simulate things faster. In verification, there’s a design/verification gap. Designs are growing faster than verification tools and technologies are able to keep up without putting an undue burden on the number of engineers required for verification or the number of computers required to verify a design. Assertions were a way to boost that verification productivity.”

The driving force behind assertion usage is accuracy. Increasingly the engineering teams that are building power awareness into their designs want to know how to build this into their environment. But they also want to be sure, given the NRE costs and all the rest, that it’s going to work, observed Adam Sherer, product management director for Cadence’s Incisive simulator tool and secretary of Accellera’s UVM committee. “At 40nm and below, the chips are just going to melt. The industry can’t afford to do anything else. The chips will not close without this—40nm is about the transition point where this really becomes acute.”

Assertions are used primarily to validate the behavior of a design and are also used to provide functional coverage information for a design. They can be checked dynamically by simulation, or statically by a separate property checker tool such as a formal verification tool that proves whether or not a design meets its specification. But there is also some confusion about what exactly a low-power assertion really is.
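
To make that concrete, here is a minimal SystemVerilog assertion (SVA) sketch, with hypothetical signal names (clk, rst_n, wr_en, fifo_full), pairing a behavior check with a coverage check. The same property can be exercised dynamically in simulation or handed to a formal tool to prove it can never fail:

    // Minimal sketch with hypothetical signals, not from any particular design.
    module fifo_checks (input logic clk, rst_n, wr_en, fifo_full);

      // Behavior check: a write must never occur while the FIFO is full.
      // A simulator flags any cycle where this fails; a formal tool
      // attempts to prove it holds for all possible input sequences.
      assert property (@(posedge clk) disable iff (!rst_n)
                       fifo_full |-> !wr_en)
        else $error("Write attempted while FIFO full");

      // Functional coverage: confirm the full condition was actually reached,
      // so a passing assertion is not just a sign of an unexercised design.
      cover property (@(posedge clk) disable iff (!rst_n) fifo_full);

    endmodule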

“The term ‘low-power assertions’ probably can mean different things to different people. From my perspective…an assertion is a particular kind of way of making a statement about something—in particular, a statement about behavior of a design—such that certain values will appear on certain signals at certain times,” said Erich Marschner, product manager for Questa Power Aware Verification at Mentor Graphics. “Typically you use it to define sequences of conditions over time. It is also used more generally to mean checks that are made as part of the verification process. Not all of those checks are functional; some are structural or apply at different levels of abstraction.”
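
A short sketch of what “sequences of conditions over time” can look like in SVA; the handshake signals here are hypothetical, not drawn from any particular protocol:

    // Hypothetical handshake: once a request is accepted, busy must hold
    // for one to three cycles, and done must pulse on the following cycle.
    property req_handshake;
      @(posedge clk) disable iff (!rst_n)
      (req && ready) |=> busy [*1:3] ##1 done;
    endproperty
    assert property (req_handshake);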

For example, when a design is divided into power domains, those domains must be able to interact correctly. That involves looking at the power states of the system, which domains can be in what power states at what time, and whether any isolation or level shifting is required because two interconnected domains are in different power states at the same time, he said.
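
A minimal sketch of such a domain-crossing check, assuming a hypothetical power-good signal dom_a_on for a switchable domain A whose output feeds an always-on domain through an isolation cell that clamps low:

    // If domain A is powered down, its output into the always-on domain
    // must be held at the isolation clamp value (low, in this sketch).
    property iso_clamp_when_a_off;
      @(posedge clk_aon) !dom_a_on |-> (a_to_b_out == 1'b0);
    endproperty
    assert property (iso_clamp_when_a_off)
      else $error("Domain A output not clamped while A is powered down");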

Cadence’s Sherer noted that the bulk of projects today still aren’t using advanced power techniques. “For the companies that are using any of the power aware structures—frequency modification, voltage levels, power shut-off, or any of those techniques that are very focused—all of those have very specific triggering activities. They have specific signals that set up the condition by which the power is going to change and to set a recovery from that power condition. That’s when assertions start to come in because there are two aspects that our users are very concerned about. One is, for a given power domain, is that being affected correctly? Did I set it up correctly? Do I recover correctly? What are the input and output signals? Are those being properly sequenced? And that is a key thing—that it’s properly sequenced.”
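
Sequencing checks of that kind map naturally onto SVA. A sketch, assuming a hypothetical always-on controller that drives save_done (retention save complete), iso_en (isolation enable), and pwr_en (power switch enable) for one shut-off domain:

    // Entry: isolation must not rise until the retention save has completed.
    assert property (@(posedge clk_aon) $rose(iso_en) |-> save_done);

    // Entry: the power switch must not turn off while isolation is still down.
    assert property (@(posedge clk_aon) $fell(pwr_en) |-> iso_en);

    // Recovery: isolation may be released only after power is back up.
    assert property (@(posedge clk_aon) $fell(iso_en) |-> pwr_en);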

An engineering team with just one power domain can probably manage its first power shut-off by hand, he said. “The assertion is good, but you can look at the waveform tool and you can probably figure it out. But if you have three or four power domains and they are overlapping, and some of the signal triggering is coming from software and some of it is coming from hardware (obviously they are all manifest in hardware), now you have an interesting dynamic that may go beyond casual observance in a waveform tool or human proof point.”

Power-aware simulation tools contain a collection of these checks, with a large number of the static checks done by analyzing the structure of the power intent described in CPF/UPF and comparing it to the power states defined there. The power states report what states the various domains will be in, the structure of the design dictates which domains are connected to which other domains, and other parts of the UPF specify whether isolation or level shifting should be inserted in certain places.

By comparing and analyzing all of this information, the engineering team can tell whether isolation and level shifting have been inserted in all the right places for all the possible power states defined in UPF/CPF, Marschner explained.
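
One way to carry that same information into dynamic verification is to re-express the legal power state table as an assertion. A sketch, with two hypothetical domains and an assumed set of legal state combinations:

    // Legal combinations in this sketch: both domains on, both off,
    // or CPU on with DSP off. Anything else contradicts the power intent.
    assert property (@(posedge clk_aon)
        (dom_cpu_on && dom_dsp_on) ||
        (!dom_cpu_on && !dom_dsp_on) ||
        (dom_cpu_on && !dom_dsp_on))
      else $error("Power-state combination not defined in the UPF/CPF power intent");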

“To verify an assertion you really have to have full functional information about the design. One of the interesting problems with low power is that, depending upon the level of integration, you may not have all the information necessary to do the verification—or at least the static analysis of all the possible behaviors, which is usually how assertions are used in a formal context. This is especially true if the low power activity is ultimately driven by software,” he added.

What’s Next?
To leverage the full strength of assertions in a low-power/power-aware design methodology and improve the accuracy of verification, particularly below 40nm where things get really painful, the industry must bring together pieces of today’s verification technologies rather than continuing to treat verification as a standalone process.

In-situ verification with power intent could be a way to go in the future in terms of bringing different technologies together, suggested Vic Kulkarni, general manager and senior vice president of the RTL business group at Apache Design.

In the case of Apache’s RPM technology, he explained, it was disparate groups within the front end and the back end that did not talk to each other, creating power budgeting issues. “By bringing in a technology that fuses these two worlds together, the front end was able to influence the back-end power delivery network and power integrity and so on, which essentially helped the designers trying to optimize their power grid, for example, instead of over-designing or under-designing it. Similarly, one can think of a scenario in the verification world where the people who are doing day-in, day-out verification, the classic verification companies or business units, have to start crossing the boundaries and bringing the UPF and CPF world and the designer’s intent world together to create the next generation of low-power design methodology.”

In the meantime, noted William Ruby, senior director of RTL power product engineering at Apache Design, customers are putting together power regression methodologies. “You used to have functional regressions. Even CPF/UPF power-intent-driven verification is still kind of functional in nature. But power regressions are all about power consumption. The idea here is that you want to track your power consumption early in the design cycle, before synthesis, especially as you freeze the RTL and start fixing functional bugs. In going through the functional verification process and fixing a functional bug, you may introduce a power bug. And if you don’t catch it, the power bugs we see are not going to magically fix themselves in synthesis or place and route. At the end of the day you’re going to get a nasty surprise, and that’s not something you want to see.”
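
One kind of check a power regression can carry alongside raw consumption numbers is an assertion guarding a known power-saving structure. A sketch, assuming a hypothetical block whose gated clock enable clk_en should drop after eight idle cycles:

    // If the block has seen no valid input for eight cycles, its gated
    // clock should be off. A functional fix that breaks the clock gating
    // trips this check in regression, long before synthesis or place and route.
    assert property (@(posedge clk) disable iff (!rst_n)
        (!data_valid) [*8] |-> !clk_en)
      else $error("Clock left running on an idle block: likely power bug");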

Whether the vectors are generated automatically, pseudo-randomly, or by special testbenches, engineering teams are throwing everything they’ve got at power consumption verification. And for now, that’s about all they can do.


