Fire In The Hole?

Power is becoming even more difficult to manage as we move to the next process nodes.


By Bhanu Kapoor
At the 2001 ISSCC, Pat Gelsinger, then Senior VP at Intel, observed the following in connection with the growing issue of chip power density: “Ten years from now microprocessors will run at 10GHz to 30GHz and be capable of processing 1 trillion operations per second—about the same number of calculations that the world’s fastest supercomputer can perform now. Unfortunately, if nothing changes these chips will produce as much heat, for their proportional size, as a nuclear reactor. . . .”

Supply and threshold voltage scaling, along with power management techniques such as dynamic voltage and frequency scaling (DVFS) and power gating, have helped manage the power density issue, as have process technology advances such as high-k metal gate technology. Clock frequency scaling has been a casualty in the process, as is evident from the chart [1] shown below.
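The leverage DVFS offers follows from the first-order dynamic power relation P ≈ C·V²·f: because achievable frequency scales roughly with supply voltage, lowering both together yields a near-cubic power reduction. The sketch below is illustrative only, using normalized units rather than data from any particular process:

```python
# Why DVFS helps: dynamic power is roughly C * V^2 * f, and since achievable
# frequency scales approximately with supply voltage, scaling both together
# gives a near-cubic reduction. Normalized, illustrative numbers.

def dynamic_power(c, v, f):
    """First-order switching power estimate: P = C * V^2 * f."""
    return c * v * v * f

full_speed = dynamic_power(c=1.0, v=1.0, f=1.0)
# Halve frequency and scale voltage down proportionally (first-order model):
half_speed = dynamic_power(c=1.0, v=0.5, f=0.5)

print(half_speed / full_speed)  # 0.125: 2x slower, 8x less dynamic power
```

This is why running two cores at half frequency can beat one core at full frequency within the same power budget, at least for workloads that parallelize well.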

Fig. 1: Microprocessor clock frequency over time. Source: ISSCC 2010 Trends Report

Going forward, the power picture looks even more problematic. The dependence of leakage on process variation is already causing major difficulties at the 28nm process node, and that is a separate issue altogether. Supply voltage scaling is not expected to keep pace with the scaling of feature sizes: according to the ITRS roadmap, as we advance from 30nm to 20nm, the supply voltage is expected to drop only from 1 volt to 0.87 volts, and the transition to 17nm brings an even smaller decrease. At the same time, the threshold voltage is expected to remain nearly flat at around 0.29 volts.
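One way to see the squeeze these numbers imply is through the gate overdrive, Vdd − Vth, which to a first order sets a transistor's drive current and hence its speed. A quick check with the roadmap voltages quoted above (the proportional drive-current model is a simplifying assumption):

```python
# Gate overdrive (Vdd - Vth) at the two nodes, using the roadmap voltages
# quoted above. In a simple first-order view, drive current -- and thus
# achievable speed -- scales roughly with overdrive.
vdd_30nm, vdd_20nm = 1.0, 0.87
vth = 0.29  # nearly flat across both nodes

overdrive_30nm = vdd_30nm - vth  # 0.71 V
overdrive_20nm = vdd_20nm - vth  # 0.58 V

print(round(overdrive_20nm / overdrive_30nm, 2))  # 0.82: ~18% less headroom
```

So even though the supply drops only 13%, the headroom that actually determines transistor speed shrinks considerably more, which is part of why frequency cannot simply be pushed up at the new node.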

We are expected to see a doubling of transistor count over this period as the number of cores doubles. This will lead to a significant increase in active leakage power, and dynamic power will rise as well. As a result, assuming frequency stays flat, power density will increase at a higher rate than it has in the recent past. And if we are forced to lower frequency to stay within the power budget, then even the performance benefits of parallelism for highly parallel applications may come into question.
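To make this argument concrete, here is a back-of-the-envelope comparison of power density at 30nm versus 20nm, assuming flat frequency and roughly constant die area. The voltages come from the roadmap figures above; the per-transistor capacitance scaling factor and the subthreshold slope are illustrative assumptions, not ITRS data:

```python
from math import exp

# 30nm -> 20nm, normalized units, flat frequency, ~constant die area.
vdd_30, vdd_20 = 1.0, 0.87      # roadmap supply voltages quoted above
vth = 0.29                      # nearly flat threshold voltage
n_kt_q = 0.04                   # assumed subthreshold slope factor (volts)

# Dynamic power density: transistor count doubles, per-transistor
# capacitance shrinks (~0.7x assumed), Vdd drops, frequency held flat.
dyn_ratio = 2 * 0.7 * (vdd_20 / vdd_30) ** 2

# Leakage power density: per-transistor subthreshold leakage is roughly
# Vdd * exp(-Vth / (n*kT/q)); with Vth flat, doubling the transistor
# count nearly doubles leakage on the same area.
leak_per_transistor = lambda vdd: vdd * exp(-vth / n_kt_q)
leak_ratio = 2 * leak_per_transistor(vdd_20) / leak_per_transistor(vdd_30)

print(round(dyn_ratio, 2), round(leak_ratio, 2))  # ~1.06 and ~1.74
```

Even in this rough model, dynamic power density creeps up slightly while leakage power density grows by roughly three-quarters, which is why active leakage, rather than switching power, dominates the concern at these nodes.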

–Bhanu Kapoor is the founder and president of Mimasic, a low-power consultancy.

