Verification As A Deterrent?

Complexity in design and in algorithm translation is holding up the works; no simple solution is on the horizon.


By Ed Sperling

Verification is becoming more than a bottleneck in semiconductor design. It’s actually deterring companies from adopting the latest techniques for saving power or building certain features into chips.

The problem is one of complexity, and it’s getting worse at every node. While the tools exist to do complex designs, there are the classic tradeoffs of area, power and performance coupled with business decisions about cost, break-even volume, time to market and reliability of the overall design.

“This is all cause and effect,” said Rob Aitken, an R&D fellow at ARM. “You may have formal methods to link four systems together, but that doesn’t mean it will work well with eight systems. And it gets harder with power islands. Verification of that stuff is very hard. The design is hard enough to begin with. Verification is keeping people from using those ideas.”

Adding complexity to a design also adds many more corners for verification and raises questions about just how complete the verification coverage models really are. “If you can’t answer the question of how you know if you did something right, then you can’t do it,” said Aitken.
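As a rough, purely illustrative calculation: if each of eight power islands can be fully on, off, or in a retention state, there are 3^8 = 6,561 power-state combinations to reason about, versus 81 with four islands, and that is before any functional corners are layered on top.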

Even at existing nodes, problems are erupting. Algorithm-intensive designs have always been problematic, but as more algorithms are used to save power and improve performance, they do not translate well from the language in which they were created—usually Matlab—to RTL.

“This is showing up in everything from cell phones and video to security,” said Brian Bailey, an independent verification consultant. “They all have algorithms, and most algorithms start off life in Matlab. They’re pure floating-point algorithms with no concern about how they’re going to be implemented. The creators of those algorithms are just trying to figure out what can be done and whether it will actually give them what they want.”

From there, it has to be translated. “You can’t just throw this through high-level synthesis,” said Bailey. “It doesn’t work like that. A reference algorithm may be written in floating point. You have to convert it to fixed point, and when you get to overflow, rounding and truncation you can break the algorithm. You have to constantly go back and ask the question, ‘Have I broken the algorithm?’”
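To make that hazard concrete, here is a minimal, hypothetical C++ sketch, not taken from any particular flow, that quantizes values into a Q1.15 fixed-point format. A gain that is perfectly legal in floating point overflows the format, saturates, and quietly changes the answer; rounding and truncation introduce smaller errors of the same kind.

#include <cmath>
#include <cstdint>
#include <cstdio>

// Illustrative only: quantize a value into Q1.15 (1 sign bit, 15 fractional
// bits). Values outside [-1.0, +1.0) cannot be represented and must either
// saturate or wrap -- this is where a naive conversion can break an algorithm.
static int16_t to_q15(double x) {
    long v = std::lround(x * 32768.0);    // rounding; truncation would differ again
    if (v >  32767) v =  32767;           // saturate on overflow
    if (v < -32768) v = -32768;
    return (int16_t)v;
}

static double from_q15(int16_t q) { return (double)q / 32768.0; }

int main() {
    // A hypothetical gain of 1.2 is fine in floating point but overflows
    // Q1.15: the fixed-point path silently applies ~0.99997 instead.
    double gain = 1.2, sample = 0.5;
    double ref = gain * sample;                       // 0.6 in the reference

    int16_t gq = to_q15(gain);                        // saturates to 32767
    int16_t sq = to_q15(sample);
    double fixed = from_q15((int16_t)(((int32_t)gq * sq) >> 15));

    std::printf("reference %.6f  fixed-point %.6f  error %.6f\n",
                ref, fixed, ref - fixed);
    return 0;
}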

One solution is to keep test benches in Matlab, which makes it easier to truncate inverse functions and then find standard deviations. But there is no foolproof verification method at the moment, and that poses a serious problem. Some efforts are under way, however. Flows from Cadence and Mentor use C++ to define certain portions of the algorithm, and the fixed-point conversion can run in C using those flows without a simulator. And multiple sources say Synopsys has been working on its own solution, but the company will not comment on unannounced development.
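In the same spirit, the comparison Bailey describes can be pictured as a side-by-side harness: drive the floating-point reference and the converted fixed-point model with identical stimulus, then look at the error statistics to decide whether the conversion has broken the algorithm. The sketch below is purely illustrative C++ with made-up models and arbitrary thresholds, not any vendor's tool.

#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <vector>

// Placeholder models for illustration only: a floating-point "reference"
// gain stage and a Q1.15 fixed-point version of the same operation.
// In practice these would be the real algorithm and its converted form.
static double ref_model(double x)   { return 0.6 * x; }
static double fixed_model(double x) {
    int16_t xq = (int16_t)std::lround(x * 32768.0);
    int16_t cq = (int16_t)std::lround(0.6 * 32768.0);
    // Arithmetic right shift assumed for the truncation back to Q1.15.
    return (double)(((int32_t)xq * cq) >> 15) / 32768.0;
}

int main() {
    // The same stimulus drives both models; the error statistics tell us
    // whether rounding/truncation in the fixed-point path stayed in bounds.
    std::vector<double> stimulus;
    for (int i = 0; i < 4096; ++i)
        stimulus.push_back(0.9 * std::sin(0.01 * i));

    double sum = 0.0, sum_sq = 0.0, max_err = 0.0;
    for (double x : stimulus) {
        double err = ref_model(x) - fixed_model(x);
        sum += err;
        sum_sq += err * err;
        max_err = std::max(max_err, std::fabs(err));
    }
    double n = (double)stimulus.size();
    double mean = sum / n;
    double stddev = std::sqrt(std::max(0.0, sum_sq / n - mean * mean));

    std::printf("max error %.3e, std dev %.3e\n", max_err, stddev);
    // Thresholds here are arbitrary; each project sets its own acceptance criteria.
    return (max_err < 1e-3 && stddev < 1e-3) ? 0 : 1;
}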

The only comment from Synopsys was that algorithmic design is indeed a growing problem that needs to be solved. “You can get the algorithm done quickly, but from there you need to recode everything to get a full validation of the algorithm,” said Chris Eddington, director of marketing for Synopsys’ high-level synthesis and system-level products. “Going from Matlab to RTL may take three to six months. The real problem is re-verification of all of this. Just to validate the prototype of the algorithms can take three to four months.”

Until this gets resolved, however, and until verification makes more strides toward matching the architectural design on the front end, many developers say privately that design activity may be limited on all but the highest-volume designs where it pays to work out these kinds of issues by brute force—lots of engineers, lots of testing, and lots of trial and error.


