Tools get better but verification gets more complex; tapeouts still delayed.
By Ed Sperling
Verification has always been the most time-consuming part of chip design. Even at 120nm and above, where power wasn’t much of an issue, verification accounted for an estimated 70 percent of the non-recurring engineering expense of a chip.
Since then, the tools to automate design have become more effective, but the complexity of designs has grown by leaps and bounds beyond them. As a result, verification still accounts for at least 70 percent of design effort, and according to a study by the Synopsys User Group (see chart below), the problem is getting worse: some 69 percent of tapeouts are late.
There are two extremely thorny issues engineers have to deal with in verification these days. One is verifying multiple power islands. The average cell phone chip has between six and nine power islands, and those islands often use different voltages and run in different modes: on, off or standby.
The trouble occurs when those islands have to be verified, because the behavior being verified depends on which mode each island is in. If all of the islands are on at once, the results can be dramatically different than if some are off: the chip can burn through its power budget, either generating too much heat or drastically reducing battery life.
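To see why mode-dependent verification gets out of hand, consider a back-of-the-envelope sketch in Python. The island names, per-mode power numbers and budget below are invented for illustration, not drawn from any real chip; the point is the combinatorics. With six islands and three modes each there are 3^6 = 729 distinct power states, and with nine islands 3^9 = 19,683, each of which is in principle a separate verification target.

```python
from itertools import product

# Hypothetical per-island power draw in milliwatts for each mode.
# Islands, modes and numbers are illustrative only.
ISLANDS = {
    "cpu":     {"on": 300, "standby": 30, "off": 0},
    "modem":   {"on": 250, "standby": 25, "off": 0},
    "gpu":     {"on": 400, "standby": 40, "off": 0},
    "audio":   {"on": 50,  "standby": 5,  "off": 0},
    "camera":  {"on": 150, "standby": 15, "off": 0},
    "display": {"on": 200, "standby": 20, "off": 0},
}
POWER_BUDGET_MW = 1000  # hypothetical whole-chip budget

def over_budget_states():
    """Enumerate every combination of island modes and yield the
    ones that exceed the chip's power budget."""
    names = list(ISLANDS)
    mode_lists = [list(ISLANDS[n]) for n in names]
    for combo in product(*mode_lists):
        total = sum(ISLANDS[n][m] for n, m in zip(names, combo))
        if total > POWER_BUDGET_MW:
            yield dict(zip(names, combo)), total

if __name__ == "__main__":
    bad = list(over_budget_states())
    # Six islands x three modes = 3**6 = 729 distinct power states.
    print(f"{len(bad)} of {3**6} power states exceed the budget")
```

A brute-force enumeration like this is trivial for a power budget, but a verification team has to consider functional behavior, not just total wattage, across that same state space, which is where the effort explodes.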
“Verification of power domains is extremely difficult,” said John Koeter, vice president of marketing for the solutions group at Synopsys. “There are significant power differences depending upon the power management options.”
The second issue is verifying low-power IP, which has become particularly attractive in complex designs. Gary Delp, distinguished engineer at LSI, said that difficulty should ease in the near future with the introduction of IEEE 1801, the standard for the design and verification of low-power integrated circuits.
“Standards allow us to view things in abstractions, which means a decrease in complexity with more predictable results,” he said. “Power, timing and area have all become more complex. What we’re doing is drawing a trust boundary around proven IP with interfaces. A key piece is to tie the 1801 standard into the simulation environment and pull IP-XACT support into tools.”
Bridging IP-XACT with the low-power verification world requires a leap of faith in design, however. Both of those standards have a connection to the Transaction-Level Modeling 2.0 standard, which is another black-box type of technology. Some of the work is already done for systems designers, but trying to figure out exactly what’s happening can be unnerving to an engineer who, until now, has been able to understand the effects of every action and interaction.
The approach with 1801 is to establish a safe, well-tested zone, and then to work within that zone.
“What you’re doing is designing to an interface and incorporating that interface,” said Delp. “You’re also adding consistent semantics for things like corruption, isolation and level shifting, and separation of power intent, configuration and implementation. All of that is being layered and structured for implementation checking.”
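As a rough illustration of what those consistent semantics mean in simulation, here is a toy Python model. The domain names, voltages and clamp value are invented, and a real flow expresses these rules as IEEE 1801/UPF commands consumed by the tools rather than as application code; the sketch only shows the behavior the standard pins down: a signal leaving a powered-down domain is corrupted to an unknown value unless an isolation cell clamps it, and a value crossing between voltage domains must be re-driven at the destination rail.

```python
from dataclasses import dataclass

X = "X"  # unknown value, standing in for simulation corruption

@dataclass
class Domain:
    name: str
    voltage: float
    powered: bool

def domain_output(domain, value, clamp=None):
    """Corruption/isolation semantics: a signal leaving a powered-down
    domain is corrupted to X unless an isolation cell clamps it."""
    if domain.powered:
        return value
    return clamp if clamp is not None else X

def level_shift(value, src, dst):
    """Level-shifting semantics: a logic '1' generated at the source
    domain's rail must be re-driven at the destination's rail
    (src is kept in the signature to make the crossing explicit)."""
    if value == X:
        return X
    return dst.voltage if value else 0.0

# Usage: an always-on domain reading from a switched-off CPU island.
cpu = Domain("PD_CPU", voltage=0.9, powered=False)   # hypothetical names
aon = Domain("PD_AON", voltage=1.2, powered=True)

raw = domain_output(cpu, value=1)                # -> "X" (corrupted)
isolated = domain_output(cpu, value=1, clamp=0)  # -> 0   (safely clamped)
driven = level_shift(isolated, cpu, aon)         # -> 0.0 at the 1.2V rail
print(raw, isolated, driven)
```

The value of the standard is that every tool in the flow applies these same rules, so the behavior an engineer sees in simulation matches what implementation checking enforces.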
It doesn’t help that two methodologies are dueling for dominance in the verification space: the Verification Methodology Manual (VMM), backed by Synopsys and ARM, and the Open Verification Methodology (OVM), backed by Mentor Graphics and Cadence Design Systems. Both camps are working on support for low-power designs, but industry insiders say neither has solved the low-power verification problem.
“There are two big differences between VMM low power and OVM low power,” said John Decker, solutions architect at Cadence. “With OVM, low-power is always on. There is a problem if they get out of sync.”
On top of that are the battles between Accellera and its followers, with the Unified Power Format, and Si2 and its adherents, with the Common Power Format. Until one standard emerges as the de facto winner, or until the differences can be bridged or wrapped, the split will add complexity in areas such as IP compatibility and low-power verification.
The bottom line: Things are getting harder, and while standards and tools are getting better and more comprehensive, it’s debatable whether they’re improving at the same rate as the rise in complexity.