Reducing Circuitry To Reduce Power

Engineering teams have developed sophisticated power management schemes, but sometimes the best approach is just to simplify the circuitry.


By Ann Steffora Mutschler

Power is at the top of the list of concerns for design teams today. Consequently, engineers are constantly looking at new techniques and architectural approaches to lower and manage the power and energy consumption of their devices.

This has resulted in some incredible engineering feats, turning parts of a device on and off as needed, applying different voltages to maximize battery life, and experimenting with new concepts that maximize processing in the shortest window possible before powering down. There are heterogeneous multicore implementations, dynamic voltage and frequency scaling, and a multitude of sleep and wake states to keep the majority of silicon dark most of the time. But even when this technology does get used, it still isn’t enough.
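The techniques listed above all amount to a control policy: watch the load, then pick a power state. A minimal sketch of such a governor is shown below. This is purely illustrative; the operating points and the functions `pick_operating_point` and `manage_power` are hypothetical, not any vendor's implementation.

```python
# Hypothetical operating points: (supply voltage in V, clock in MHz).
# Dynamic power scales roughly with V^2 * f, so lower points save power.
OPERATING_POINTS = [(0.6, 200), (0.8, 600), (1.0, 1200)]

def pick_operating_point(utilization):
    """Toy DVFS policy: match the voltage/frequency pair to the load."""
    if utilization < 0.2:
        return OPERATING_POINTS[0]   # light load: lowest point
    if utilization < 0.7:
        return OPERATING_POINTS[1]   # moderate load
    return OPERATING_POINTS[2]       # heavy load: full speed

def manage_power(utilization, idle):
    """Return the action a governor might take for one control interval."""
    if idle:
        return ("sleep", None)       # power-gate the block entirely
    return ("run", pick_operating_point(utilization))
```

Real governors add hysteresis and account for the energy cost of switching states, but the basic shape, sleep when idle, scale when busy, is the same.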

That has raised questions about just what needs to be done to take energy efficiency to the next level, and it has generated a lot of discussion throughout the design industry.

“If you really want to get power down you have to look at architectures,” said Navraj Nandra, senior director of product marketing for analog and mixed signal IP in the solutions group at Synopsys. “You have to look at the architectures of your black boxes and you have to look at how you architect your SoC and floorplanning. That does have an impact on power consumption. But the aspect that’s very difficult and is not covered in EDA flows and EDA tools is they don’t substitute the brain of the designer. No matter how good your flow or your integrator is, at the end of the day it’s really a design engineer that has to come up with a new architecture to really fundamentally get the power consumption down.”

That means cutting out parts of the design as well as adding in new technologies, according to Peter Suaris, senior director of engineering at Atrenta. “If you take any design there’s a lot of redundancy, there’s a lot of stuff that we do that we don’t have to do. I think of a register and a clock—that’s unnecessary. There’s a lot of work that can be done actually to only compute what you need. If you look at today’s designs, 80% is stuff you don’t have to do.”
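Suaris' "only compute what you need" has a direct software analogue: gate a computation on whether its inputs actually changed, the same idea as clock-gating a register. The sketch below is an assumption-laden illustration (the `GatedBlock` class is invented for this example), not a description of any tool's implementation.

```python
class GatedBlock:
    """Software analogue of clock gating: recompute an output only when
    its inputs change; otherwise hold the last value and do no work."""

    def __init__(self, compute):
        self.compute = compute      # the expensive function being gated
        self.last_inputs = None
        self.last_output = None
        self.evaluations = 0        # counts how much work was really done

    def __call__(self, *inputs):
        if inputs != self.last_inputs:
            self.last_output = self.compute(*inputs)
            self.last_inputs = inputs
            self.evaluations += 1
        return self.last_output

block = GatedBlock(lambda a, b: a + b)
block(1, 2)   # computes: 3
block(1, 2)   # inputs unchanged: holds the value, no recompute
block(2, 2)   # inputs changed: computes 4
```

In hardware the equivalent is an enable on the register's clock, so the flop only toggles, and only burns dynamic power, when its input is new.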

There are good reasons why some of this technology is added into designs in the first place. Not all of it makes good business sense, though. Qi Wang, technical marketing group director, solutions marketing, for the low power and mixed signal group at Cadence said, “You may have a very good technological reason, but it may not be a very good economic or business reason. People always try to create a new problem, try to find a new market.”

Less is more

IP developers, whose blocks get reused many times over across technology nodes, have discovered that more isn't always better.

“Our latest 28nm designs have less stuff in them than what they did at 130nm because we’ve seen this in production over 10 years and we’ve realized that level shifters, registers, clocks—all of this was superfluous,” said Synopsys’ Nandra. “At the time when we did the 130nm design, the engineer thought we’d absolutely need this because it’s some kind of redundancy or fail-safe mechanism, but we realized over time that you can actually simplify the circuitry that works at the IP level. It doesn’t work at the SoC level because you want to add stuff.”

The argument against this approach is that if it isn’t broken, why fix it? “We have that discussion with customers who say, ‘This one’s been running in volume production in every product you can possibly imagine in 130nm and you’re trying to sell me this in 40nm or 28nm. It’s smaller and it’s lower in power. We want this [old] one but we want it re-targeted for 28.’ But in new markets where there is no legacy, they’ll go for [the more power-efficient one],” he said.

Still, the way to reduce power is to remove stuff. “You keep removing stuff until the circuit stops working and then you add something to make it work. That’s your lowest power,” Nandra said.

Another simplification strategy, according to Cadence's Wang, is to replace analog with digital. "That's a huge power saving, because analog was never built from power-saving transistors."

This happens regularly because engineering teams want to extend the battery life of their mobile devices. While foundries have addressed this with process innovation, battery technology has improved only incrementally. The only remaining variable is the SoC design itself.

“You can affect power consumption through some technologies and methodologies, but at the end of the day, to fundamentally get power down of not only the black boxes but of the SoC as a whole entity, you have to think about how to design in the lower supply voltages,” Nandra said. “You have to figure out how to design low-power transmitters because these things are the ones that consume the most power on the radio, on the I/Os of your high-speed interface chip. You’ve got to figure out how to get into the different power modes through all these common interfaces, and there are very interesting influences on the actual circuits when you get into these different modes. PLLs and data converters stop working—and these are fundamental building blocks.”
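Nandra's emphasis on supply voltage follows from the standard first-order relation for CMOS dynamic power (a textbook formula, not something specific to any quoted design):

```latex
% First-order CMOS dynamic (switching) power:
%   alpha = activity factor, C = switched capacitance,
%   V_dd = supply voltage, f = clock frequency
P_{\text{dyn}} = \alpha \, C \, V_{dd}^{2} \, f
```

Because the supply voltage enters quadratically, dropping V_dd from 1.0 V to 0.8 V cuts dynamic power by roughly 36% even before the frequency is reduced, which is why lowering supply voltages pays off faster than almost any other knob.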
