Just keeping track of the different models and updating them all is causing problems; automation doesn’t exist inside most chip companies.
By Ed Sperling
Talk with any large systems vendor about power modeling and, with very few exceptions, they’re still using a mix of spreadsheets and lower-level models—no matter how far along they are in ESL adoption and in modeling other parts of an IC.
Power has crept up on even the biggest companies, which have never really figured out how to integrate it into their design flows. For one thing, the tools are still evolving. But so is the understanding of how to deal with it effectively.
Smaller companies, meanwhile, are just getting a taste of how challenging this can be as 65nm and 40nm become mainstream process nodes. Density, shrinkage, and competitive requirements have made power a critical issue. And while many are used to dealing with power gating and multiple power domains, the combination of multiple voltage islands, multiple states between fully on and fully off, and different strategies for maximizing energy efficiency adds a mind-boggling array of choices and complexity to designs.
It’s well known among companies in the mobile IC market—those that have the greatest history of dealing with power issues—that power has to be dealt with at the architectural level. What is less well known is that it requires adjustments throughout the design cycle, and the tools even the most advanced companies are using are a direct reflection of that.
“The only reliable level for measuring power has been the gate level,” said Barry Pangrle, a solutions architect for low-power design at Mentor Graphics. “Above that it’s a relative measure. But to take advantage of the 80% impact on power that you can have at the architectural level you want to take advantage of everything you can. For that most customers are still using spreadsheets.”
This approach has worked well enough so far. Modeling can be done on spreadsheets as well as with automated tools. The problem is that a spreadsheet can’t be updated easily, and there’s no way of checking that its numbers remain realistic as the design progresses. “You really want a sanity check throughout the flow,” Pangrle noted. “You estimated the block and you need to make sure it’s right.”
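The spreadsheets Pangrle describes typically encode a per-block roll-up against a chip-level budget. Here is a minimal Python sketch of that kind of calculation, using the standard P = αCV²f switching estimate plus a leakage term. Every block name and number below is a hypothetical placeholder, not data from any real design.

```python
# Minimal sketch of a spreadsheet-style architectural power roll-up.
# All block names and numbers are hypothetical placeholders.

def dynamic_power(alpha, c_eff, vdd, freq):
    """Classic switching-power estimate: P = alpha * C * V^2 * f."""
    return alpha * c_eff * vdd**2 * freq

# Each entry mirrors a spreadsheet row: activity factor, effective
# switched capacitance (F), supply voltage (V), clock (Hz), leakage (W).
blocks = [
    {"name": "cpu",    "alpha": 0.15, "c_eff": 1.2e-9, "vdd": 1.0, "freq": 1.0e9, "leak": 0.040},
    {"name": "gpu",    "alpha": 0.20, "c_eff": 2.5e-9, "vdd": 1.0, "freq": 6.0e8, "leak": 0.060},
    {"name": "memctl", "alpha": 0.10, "c_eff": 0.8e-9, "vdd": 1.1, "freq": 8.0e8, "leak": 0.015},
]

budget_w = 1.0  # chip-level power budget in watts (hypothetical)

total = 0.0
for b in blocks:
    p = dynamic_power(b["alpha"], b["c_eff"], b["vdd"], b["freq"]) + b["leak"]
    total += p
    print(f"{b['name']:8s} {p * 1e3:8.1f} mW")

print(f"total    {total * 1e3:8.1f} mW  (budget {budget_w * 1e3:.0f} mW)")
assert total <= budget_w, "block estimates exceed the chip budget"
```

The weakness the article points to shows up immediately: nothing in such a sheet ties these numbers back to the evolving design, so the “sanity check throughout the flow” has to be done by hand.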
What if…
The lack of automation causes other problems, as well. Because most flows are automated to some extent, being able to update various parts throughout the design process is critical. Virtual models, for example, allow changes in software to be reflected in hardware. But updating models manually with a spreadsheet is cumbersome, made worse by the fact that the amount of data that needs to be added and updated on a regular basis is ballooning. Some libraries are now measured in terabytes.
“At 28nm and 20nm, you’ve got to start dealing with electromigration and other effects caused by heat,” said Aveek Sarkar, vice president of product engineering and support at Apache Design. “You need to create models to capture all of these effects, but these models also have to be consistent and they have to replicate what’s really going on at the electrical or mechanical level. You need to understand the parasitics using linear and non-linear models, and then abstract from there.”
Getting those models right is no simple task. And what happens when an IP block is replaced with another IP block, or a signal is rerouted from one memory to another?
“You need chip models that create power models,” said Sarkar. “That’s one of the top integration focus areas according to feedback we’ve received from system design houses.”
Power everywhere
But is one power model really enough? Power is a global issue, and it affects everything from the software that’s written to a virtual platform to the IP blocks that are being integrated into a design. There are two diverging issues. The first is that the classic divide-and-conquer strategy is essential for designing and verifying complex chips. The second is that power budgets are set for the whole chip, and they can be affected by everything from individual blocks to the way those blocks are integrated and used.
“Power modeling is key,” said Philippe Magarshack, group vice president for technology R&D at STMicroelectronics. “Otherwise we will never be able to tackle designs going forward.”
He noted that ST has been using dynamic voltage scaling for several process nodes, along with dynamic voltage and frequency scaling. Power islands are well understood, as well. But automating power modeling remains a challenge.
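The appeal of the DVFS techniques ST relies on is straightforward arithmetic: dynamic power scales with V²f, so lowering voltage and frequency together buys a better-than-linear reduction. A small sketch, with made-up operating points rather than figures from any real part:

```python
# Sketch of why DVFS pays off: dynamic power scales with V^2 * f.
# Operating points are illustrative, not from any real silicon.

def p_dyn(vdd, freq, alpha=0.15, c_eff=1.0e-9):
    return alpha * c_eff * vdd**2 * freq

nominal = p_dyn(vdd=1.1, freq=1.0e9)
scaled  = p_dyn(vdd=0.9, freq=6.0e8)   # lower voltage at a lower clock

print(f"nominal: {nominal * 1e3:.1f} mW, scaled: {scaled * 1e3:.1f} mW "
      f"({scaled / nominal:.0%} of nominal)")
```

In this toy example, running 40% slower at 0.9V costs 60% of the nominal dynamic power budget, which is exactly the kind of trade a power model has to capture.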
“There are no standards for this,” noted Ghislain Kaiser, CEO of Docea Power. “This is a problem because we need a common way to capture this data and have the same kind of modeling. The most important thing is to get power models into the design flow.”
And because power generates heat, primarily through leakage, thermal models need to be developed in sync with those power models—something that will become critical as stacking of die becomes more mainstream over the next few years.
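Keeping the two model types in sync matters because leakage and temperature feed back on each other: leakage dissipates heat, and heat increases leakage. A hedged sketch of that coupling as a simple fixed-point iteration, with the exponential leakage model and all constants invented for illustration:

```python
# Sketch of leakage-temperature feedback solved by fixed-point iteration.
# The leakage model and all coefficients are illustrative only.
import math

def leakage_w(temp_c, leak_25c=0.10, k=0.04):
    """Leakage grows roughly exponentially with temperature."""
    return leak_25c * math.exp(k * (temp_c - 25.0))

def die_temp_c(power_w, ambient_c=45.0, theta_ja=20.0):
    """Steady-state die temperature via a thermal resistance (C/W)."""
    return ambient_c + theta_ja * power_w

p_dynamic = 0.5  # watts of switching power, assumed fixed here
temp = 45.0
for _ in range(20):
    total = p_dynamic + leakage_w(temp)
    new_temp = die_temp_c(total)
    if abs(new_temp - temp) < 0.01:
        break
    temp = new_temp

print(f"converged: {temp:.1f} C, leakage {leakage_w(temp) * 1e3:.0f} mW")
```

Stacked die make this loop tighter, because a hot layer raises the leakage of its neighbors as well as its own.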
But internally developed spreadsheets have reached their limit for adding new data; in practice, there are no rows and columns left. And existing TLM 2.0 models are too far removed from power and thermal behavior to be useful.
“An accurate power model should have no more than a 5% error,” said Kaiser. “That way it can be used to speed up the debug of power management software.”
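Kaiser’s 5% target is easy to state as a regression check: compare the model’s predicted power against gate-level or silicon measurements and bound the relative error. A minimal sketch, with both traces fabricated for illustration:

```python
# Sketch of validating a power model against reference measurements.
# Both traces below are fabricated for illustration.

modeled  = [0.42, 0.55, 0.61, 0.38, 0.50]  # watts, from the power model
measured = [0.44, 0.53, 0.64, 0.37, 0.52]  # watts, gate-level or silicon

worst = max(abs(m - r) / r for m, r in zip(modeled, measured))
print(f"worst-case relative error: {worst:.1%}")
assert worst <= 0.05, "model exceeds the 5% error budget"
```

A model that stays inside that bound is trustworthy enough to debug power-management software against, without waiting for gate-level simulation on every change.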
Power continuity
Another reason for automating power goes well beyond just the technical capabilities of the tools. It has to do with the way designs are created. Designs have become so complex that even the best and brightest engineers can no longer comprehend the whole design.
“What this means is that you may have an issue in power and not even know about it,” said Qi Wang, technical marketing group director for low power and mixed signal at Cadence. “Verification will become a very big challenge in the future. We’re used to doing functional verification. But power verification to measure power consumption needs to be considered, as well.”
In addition, he said that each step of a design, from placement through clock tree synthesis and routing, needs to be optimized for power. That, in turn, needs to be reflected back into the other models that have been developed, because the changes can affect all parts of the design.