5 Ways To Cut Power

Ideas that have been in development for the past decade are being implemented into advanced IC designs—and there are more on the horizon.

By Ed Sperling
Low energy consumption with minimal leakage has emerged as the most competitive element in an IC design, regardless of whether it involves a plug, a battery, or whether it’s powered by a gasoline engine.

While components on an SoC aren’t always power-aware, they’ll have to be in the future as consumers put energy efficiency first. Rising fuel costs, concern over global warming and the nightly reminder that smart phones have to be plugged in are all pushing in the same direction. Car companies are shifting their strategy from efficient hybrids to even more efficient plug-in hybrids and electric vehicles, and California has gone so far as to mandate that one-third of all electricity sold in the state must come from renewable sources by the end of 2020.

This shift in public awareness hasn’t been lost on the chip industry, which has been rolling out some very complex advances well ahead of schedule. Here are some of the most important:

Clouds
The push toward a cloud-based infrastructure is a way of centralizing computing—basically a return to the time-sharing model once perfected by the mainframe and then re-distributed with the advent of the commodity PC server. The data processing world is re-aggregating, but this time with a difference. It’s not just that the computing is being centralized. It’s that the centralization is taking place in proximity to cheap power sources such as hydroelectric dams, nuclear plants (for now) and wind farms.

“Cloud leads to big efficiency gains,” said Chris Rowen, chief technology officer at Tensilica. “Now you can put the computing farm where the energy is available. It’s an arbitrage opportunity. It’s not hard to ship bits when you compare that to the difficulty in transporting electricity.”

There’s a clear business case to be made on this front. An estimated 6.5% of electricity is lost in transmission, according to the U.S. Energy Information Administration. That may not seem like a lot until you consider those are high-voltage transmission lines. Bits are cheap, in comparison—even trillions of them—which is why there is now talk of centralizing even portions of base stations. The parts that do intensive computation with a high degree of redundancy are prime candidates for relocation to a data center.
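
As a rough illustration of the arbitrage Rowen describes, the short Python sketch below applies the 6.5% transmission-loss estimate to a hypothetical facility. The 20 MW load is an assumed figure for illustration only, not a number from the article.

# Back-of-envelope sketch: avoided transmission losses when a computing
# farm sits next to its power source. The load figure is assumed.

TRANSMISSION_LOSS = 0.065      # ~6.5% of electricity lost in transmission (EIA estimate cited above)
DATACENTER_LOAD_MW = 20.0      # assumed average facility load, in megawatts
HOURS_PER_YEAR = 8760

# Energy that never has to be pushed down high-voltage lines if the
# computing farm is co-located with the generation.
saved_mwh = DATACENTER_LOAD_MW * HOURS_PER_YEAR * TRANSMISSION_LOSS
print(f"Avoided transmission losses: {saved_mwh:,.0f} MWh per year")
# -> roughly 11,000 MWh per year with these assumptions; shipping the bits
#    to and from the facility is cheap by comparison.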

“There’s a lot of computation needed to reduce noise and create a clean signal,” said Rowen. “But there’s also some computing that has to be done locally because there are tough latency requirements.”

Adaptive Body Biasing
Adaptive body biasing has been under serious discussion for the past five years as a way of reducing current leakage by controlling a device’s body voltage, which in turn raises the threshold voltage. The big advantage is that leakage can be cut without switching sections of the chip to the off state. The downside is that this has been difficult to design and manufacture.

“This was not seen as a mainstream approach, but now it’s showing up almost everywhere,” said Aveek Sarkar, vice president of product engineering and support at Apache Design Solutions. “This was seen as a challenging technique to implement, but now TI and Samsung are using it. If you change the body bias voltage, you impact the threshold voltage. You can increase or decrease leakage, as needed, and boost performance.”

Consultant Bhanu Kapoor, president of Mimasic, noted that for some high-performance applications alternatives such as power gating may be impractical because it simply takes too long to switch sections of a chip on and off. In those cases, body biasing is the only choice.
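
To make the mechanism concrete, here is a minimal Python sketch of the body-effect equation and how a higher threshold voltage suppresses subthreshold leakage. The device parameters are assumed textbook values, not figures from TI, Samsung or any particular process.

import math

def threshold_voltage(vsb, vth0=0.35, gamma=0.4, phi_f=0.45):
    # Body-effect equation: a reverse body bias (vsb > 0) raises Vth.
    # vth0, gamma and phi_f are assumed textbook parameters.
    return vth0 + gamma * (math.sqrt(2 * phi_f + vsb) - math.sqrt(2 * phi_f))

def relative_leakage(vth, n=1.5, vt=0.026):
    # Subthreshold leakage scales roughly as exp(-Vth / (n * kT/q)).
    return math.exp(-vth / (n * vt))

nominal = relative_leakage(threshold_voltage(0.0))
reverse = relative_leakage(threshold_voltage(0.4))  # 0.4 V reverse body bias
print(f"Leakage at 0.4 V reverse bias: {reverse / nominal:.0%} of nominal")
# -> about 14% of nominal with these assumptions. A forward bias works the
#    other way, lowering Vth to trade leakage for speed when performance matters.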

Atomic-Level Changes
Another technique that has been particularly difficult to master is atomic-level control of channel doping in manufacturing. And while most experts don’t expect the process and manufacturing side to offer any huge gains, this one may be the exception.

Scott Thompson, chief technology officer at startup SuVolta, said that by improving the doping technique, both dynamic and static current leakage can be reduced with regular bulk CMOS.

“The problem is that the wall around the channel is leaky and it’s hard to control the shape,” said Thompson. “Strain engineering helps to control the atomic-level analysis. But there has been no other breakthrough other than changing the transistor, and we don’t see a need for that for all architectures.”

At its unveiling last week, SuVolta had lined up support from Fujitsu, Cypress, ARM and Broadcom. The company claims the technology is an alternative to FinFETs, which are more difficult to manufacture.

3D Transistors And Packaging
Nevertheless, the major foundries have committed to building FinFETs at advanced nodes. Intel’s announcement of a Tri-Gate three-dimensional transistor at 22nm has been a major topic in the semiconductor industry. Now that Intel has publicly committed to the technology, the questions are whether it really can be manufactured with sufficient yield, and whether it can be built effectively using the disaggregated foundry model in the near future.

These kinds of questions will remain unanswered for at least the next couple of years. TSMC is planning to use FinFETs at 14nm, and GlobalFoundries has been working on the same technology. Regardless, the big advantage of FinFET technology is a sharp reduction in leakage while providing a significant performance boost.

Creating stacks of die also has a huge effect on power, in part because the distances between logic and memory can be shortened significantly. A system-in-package version of stacked die, using interposer technology, is expected to begin widespread production over the next 12 to 18 months, bolstered by the new Wide I/O standard that increases the size of the pipes between logic and memory.
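
A rough way to see why shorter connections save power: the energy to move a bit is dominated by the wire capacitance the driver must charge, roughly C·V² per transition. The Python sketch below compares an off-chip memory bus against a short through-silicon-via/interposer link; the capacitance and voltage values are assumptions for illustration, not Wide I/O specification numbers.

def energy_per_bit_pj(capacitance_f, voltage_v):
    # Energy to charge the wire capacitance once, converted to picojoules.
    return capacitance_f * voltage_v ** 2 * 1e12

# Assumed values for illustration only.
OFFCHIP_C, OFFCHIP_V = 10e-12, 1.5     # board trace plus package pins
WIDE_IO_C, WIDE_IO_V = 0.5e-12, 1.2    # short TSV/interposer connection

print(f"Off-chip bus : {energy_per_bit_pj(OFFCHIP_C, OFFCHIP_V):.2f} pJ/bit")
print(f"Stacked link : {energy_per_bit_pj(WIDE_IO_C, WIDE_IO_V):.2f} pJ/bit")
# -> with these assumptions the short link costs well under a tenth of the
#    energy per bit, which is where most of the memory-interface savings come from.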

New Materials
Fully depleted SOI and silicon on sapphire, along with new ways of putting them together in stacks connected by low-cost interposers that can be made of glass, have turned into major research efforts as companies seek to knock costs out of the bill of materials for new chips.

While FD SOI has been well tested for years by the Common Platform participants, the others have been used only on a very limited basis. One approach now being considered is actually designing chips to run hotter rather than trying to keep the power down. While there are limits to this approach—no one wants to pick up a hot phone—there are times when performance is more important than heat.

Taken as a whole, all of these changes can deliver a significant reduction in power, particularly when coupled with efficient software code, more customized user controls, and end devices that actually use the power-saving technology being built into these chips.


