Near-Threshold Computing

Operating near the voltage at which transistors switch on and off can save a lot of power, but it isn't simple.


By Bhanu Kapoor
Two main factors contributed to power becoming a big problem ("the power wall") starting around the 65nm process node. First, the fast-growing leakage component became as significant as the dynamic power. Second, supply-voltage scaling stalled at around 1.1 volts.

Process technology advances such as HKMG and 3D tri-gate transistors have enabled continued scaling and dealt with leakage power more effectively, but dynamic power continues to suffer because the supply voltage has stayed essentially constant across process generations, from 65nm down to the most recent 22nm technology being used by Intel. To make things worse, dynamic power has a quadratic dependence on the supply voltage. Some relief comes from techniques such as DVFS, which let you operate at 20% to 30% lower voltage when the workload allows it.
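To make the quadratic dependence concrete, here is a minimal back-of-envelope sketch. The 1.1-volt nominal supply comes from the discussion above; the 30% DVFS reduction, activity factor, and capacitance are illustrative assumptions, not figures from any specific chip.

```python
# Back-of-envelope sketch of dynamic power: P_dyn ~ alpha * C * Vdd^2 * f.
# The activity factor and capacitance are arbitrary placeholders; only the
# ratio between the two operating points matters here.

def dynamic_power(vdd, freq, activity=0.1, capacitance=1.0):
    """Relative dynamic power in arbitrary units."""
    return activity * capacitance * vdd ** 2 * freq

nominal = dynamic_power(vdd=1.1, freq=1.0)
# Assumed DVFS point: ~30% lower voltage with frequency scaled down to match.
dvfs = dynamic_power(vdd=1.1 * 0.7, freq=0.7)

print(f"DVFS power relative to nominal: {dvfs / nominal:.2f}")  # ~0.34, roughly a 3x reduction
```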

At this year’s ISSCC in February, Intel presented several papers on chips operating at near-threshold voltage. The threshold voltage is the level at which transistors switch on and off. If we can switch bits reliably near this voltage, operation can be far more efficient than switching them with much larger voltage swings. In fact, power savings of up to an order of magnitude are possible.
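As a rough illustration of where those savings come from, the sketch below compares switching energy per operation at the nominal 1.1-volt supply and at an assumed near-threshold supply of about 0.5 volts; the quadratic term alone already accounts for a large share of the claimed savings.

```python
# Illustrative energy-per-operation comparison. The 0.5 V near-threshold
# figure is an assumption for the arithmetic, not a number from the papers.

def switching_energy(vdd, capacitance=1.0):
    """Switching energy per operation scales with C * Vdd^2 (arbitrary units)."""
    return capacitance * vdd ** 2

nominal = switching_energy(1.1)
near_threshold = switching_energy(0.5)

print(f"Energy ratio (nominal / near-threshold): {nominal / near_threshold:.1f}x")  # ~4.8x from the quadratic term alone
```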

But there are significant challenges in realizing near-threshold operation. The first is a substantial loss of performance, by a factor similar to or worse than the power gains. The industry has already taken the parallelism route to improve performance, and near-threshold operation would call for even more aggressive parallelism (a rough sketch of the trade-off follows below). Innovations in process technology, circuit design techniques, or a combination of the two may improve performance enough to make this an attractive choice, with power-sensitive markets adopting it first.
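Here is a rough sketch of that trade-off, using assumed round numbers for the frequency penalty and the energy savings rather than measured figures:

```python
# Why near-threshold operation pushes toward more parallelism (assumed numbers).

freq_penalty = 8.0    # assumed: a near-threshold core runs ~8x slower
energy_savings = 5.0  # assumed: ~5x less energy per operation

# Cores needed to match the throughput of one nominal-voltage core:
cores_needed = freq_penalty

# Power of the parallel near-threshold design at that matched throughput,
# relative to the nominal single core (power = energy/op * ops/sec):
relative_power = cores_needed * (1.0 / freq_penalty) * (1.0 / energy_savings)

print(f"Cores to recover throughput: {cores_needed:.0f}")      # 8
print(f"Power at the same throughput: {relative_power:.2f}x")  # ~0.20x
```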

There is also the issue of higher variability when operating near threshold, which comes down to getting better control over the process technology. Techniques such as adaptive body biasing can help manage leakage issues related to that variability.

Operating near threshold also results in a significant increase in functional failures, especially for memory elements. Addressing these failures requires more complex redundancy schemes and novel circuit techniques.

Despite these challenges, an attractive point about near-threshold computing is that it allows chip power and performance to scale across a much wider operating range. This flexibility will go a long way toward optimizing overall system power, not just in the power-sensitive handheld market but also in leading-edge computing.

—Bhanu Kapoor is the president of Mimasic, a low-power consultancy.


