Mostly Accurate Computing

Computers capable of approximate computing are being developed, but do the same concepts apply to power management?


“Approximate computing” is an emerging concept that allows computers to perform calculations for tasks that don’t require perfect accuracy, with the goal of improving efficiency and reducing energy consumption. But do those concepts apply when it comes to managing power? And is there a philosophical approach to thinking about power management?

To a large extent, that depends on how “accuracy” is defined. When it comes to power estimation, the semiconductor industry relies on characterization of the elements that EDA tools use.

“For instance, the power consumed by each and every cell is given to the tools, which has usually been done by the standard cell vendor,” said Guillaume Boillet, technical marketing manager at Atrenta. “In terms of accuracy, there is a way to have full accuracy based on this assumption that the characterization is good. This means that for a specific design we have a way to get gate-level information, to get very accurate simulation information, very accurate parasitic information, so in the end in terms of accuracy we are able to generate the reference. But as you would imagine this takes a long time. It’s quite painful for the verification team to come up with the simulation vectors, etc. At RTL we are going to have information only for a few nets in the design, and all the intelligence of the tool basically resides in the interpolation of this activity all across the design.”
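The approach Boillet describes, taking accurate activity for only a few nets and interpolating across the rest, can be illustrated with the standard dynamic power relation P = α·C·V²·f, where α is the switching activity, C the switched capacitance from the cell characterization, V the supply voltage, and f the clock frequency. The sketch below is purely illustrative: the cell capacitances, net names, and the crude mean-based interpolation are all hypothetical stand-ins for what a real RTL power tool does with far more intelligence.

```python
# Minimal sketch of RTL-stage dynamic power estimation (hypothetical data).
# Per-cell dynamic power: P = alpha * C * V^2 * f. Activity (alpha) is known
# only for a few instrumented nets; uninstrumented nets get an interpolated
# value -- here, crudely, the mean of the known activities.

def estimate_dynamic_power(cells, known_activity, vdd=0.9, freq=1.0e9):
    """cells: dict of instance name -> switched capacitance (farads).
    known_activity: dict of instance name -> toggle probability,
    available only for the instrumented subset of nets."""
    interpolated = sum(known_activity.values()) / len(known_activity)
    total = 0.0
    for name, cap in cells.items():
        alpha = known_activity.get(name, interpolated)  # fill in the gaps
        total += alpha * cap * vdd**2 * freq
    return total  # watts

# Hypothetical design: three cells, activity observed on only two nets.
cells = {"u1": 2e-15, "u2": 1e-15, "u3": 3e-15}
activity = {"u1": 0.2, "u3": 0.1}
print(estimate_dynamic_power(cells, activity))  # ~6.9e-07 W
```

The accuracy of such an estimate hinges entirely on how representative the instrumented nets are, which is why the gate-level reference flow Boillet describes, with full vectors and parasitics, remains the accuracy benchmark despite its cost.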

In addition to the question of accuracy, there is a question of just how far the engineering team is willing to go to leverage various techniques to achieve the target power goal.

Given that power is a marketing differentiator for many companies today, they are willing to put in as much energy and effort as required to achieve a differentiated offering — even if that means lowering consumption by another 100 microamps, observed Koorosh Nazifi, engineering group director for low power and mixed signal initiatives at Cadence.

“They do this to either meet the target or beat their target so that when they bring their product to market it has a higher value associated with it,” Nazifi said. “As a result, given the importance of power when it comes to differentiation from a marketing and business perspective, they are willing to go as far as is necessary. They are willing to pressure their IP vendors into as much as possible to extract additional power for that particular circuit, and it is not limited to dynamic power. In fact, more so it is leakage power that is of importance.”

In other words, power has transformed from just a characterization to a design metric that designers need to follow. Power must be part of the design process from a very early stage. If there is a philosophical element to this it is how early power can be estimated, noted Arvind Shanmugavel, director of application engineering at ANSYS-Apache.

“The first level of power estimation is what we call high-level synthesis-type of power estimation, where we get good-enough power models to make very large tradeoffs at the very high level. Then we come to the architectural stage – this is really where designers are implementing the architecture and where you have your biggest bang for the buck in terms of power reduction and power optimization. The architectural implementation stage is where you can manage power the best. The moment you go into gate-level power it becomes more of an estimation rather than managing power. And then there is the physical level where you have a power requirement for signoff rather than managing power.”

Technically speaking, explained Mary Ann White, product marketing director for the Galaxy Implementation Platform at Synopsys, “from a standard cell perspective, the way things typically work is that you get the SPICE models from the foundries and those are constantly being updated based on what’s going on with their processes. These SPICE models come and then you’ve got to take the GDS data – regardless if it’s from a chip or from a standard cell – and you basically do some characterization of your standard cell libraries and that’s what results in the power models.”

Recalling her days at Virage Logic, she pointed out the importance of test chips that are run on shuttles at the different foundries. When the silicon comes back, correlation is done to compare the SPICE models used in the characterization against what the actual silicon did.

This is done to understand how well the models correlate with silicon so they can be tweaked to match what the silicon actually does, ensuring that accurate power model information flows through when a design is created. Typical correlation ranges between 10% and 20%, and can go as high as 30%, White said.
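The correlation figures White cites are essentially the percentage deviation between what the characterized models predicted and what the measured test-chip silicon delivered. A trivial sketch, with entirely hypothetical numbers:

```python
# Hypothetical sketch of model-to-silicon correlation: compare the power
# predicted from the characterized models against the measured test chip
# and report the percentage deviation, as in the 10%-30% ranges cited above.

def correlation_error(model_mw, silicon_mw):
    """Percent deviation of the model prediction from measured silicon."""
    return abs(model_mw - silicon_mw) / silicon_mw * 100.0

# Hypothetical: models predicted 120 mW, silicon measured 100 mW -> 20%,
# within the typical range, so the models would be tweaked accordingly.
print(correlation_error(120.0, 100.0))  # 20.0
```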

The downside of inaccurate power estimation
Back on the business side of things, when a marketing department makes a commitment for a given product to their customers, that commitment is based on some target performance and target power consumption, but their ability to measure whether or not they’re going to break that target typically is determined very late in the design flow, Nazifi said.

“They go in and they make a certain commitment: ‘We think we can do this based on historical information and the fact that we’ve done this work before.’ They come up with some approximation, they make that commitment and then they go through the implementation and find that they are short because some things were not factored in or whatever the case may be,” he said.

At that stage there are only a few options, one of which is to determine whether the customer is willing to accept the particular chip or IP at the new power consumption. In most cases, if time-to-market is not as much of a factor, chances are good that the team will be pressured to go back and work on it a little harder.

As such, Nazifi concluded, “When it comes to ‘how close is close enough’ when managing the power, actually it has to be extremely close. The closeness is determined by the target that you have defined so it’s got to be right on the money otherwise you’re going to be squeezed or you miss the window of opportunity.”
