On, Off and Mostly Off

Design shifts to turning off ever-larger portions of chips; voltage spikes remain a hazard at startup.


By Ed Sperling

System-on-chip architecture has always been about getting the most performance out of a device, and the basic premise has been that once a device is turned on, it stays on.

That approach has been challenged over the past few years by a fundamental shift toward keeping more of the design in the ‘off’ position. Aside from reversing decades of engineering practice and assumptions, that shift accomplishes a couple of very significant things.

First of all, with static leakage a persistent issue in all devices at 90nm and below, the simplest thing to do from the standpoint of the device’s power budget is to turn parts of a chip completely off. That has become the norm in most designs, which is why the number of power domains is growing. Some use different voltages, some are turned off completely when not in use, and still others are reduced to various levels of standby, depending upon how quickly they need to return to a full “on” position. All of this saves battery life in handheld devices, and it saves power in large racks of servers in data centers.
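As a rough illustration (not tied to any particular SoC or to a power-specification standard such as UPF), the bookkeeping behind those choices can be sketched in a few lines of Python: each domain carries a supply voltage, a power state, and a wake-up latency that determines how deep a standby level it can tolerate. The domain names and numbers below are hypothetical.

```python
from dataclasses import dataclass
from enum import Enum

class PowerState(Enum):
    ON = "on"                 # fully powered, full leakage
    RETENTION = "retention"   # state held at reduced voltage, fast wake-up
    OFF = "off"               # power-gated, near-zero leakage, slowest wake-up

@dataclass
class PowerDomain:
    name: str
    voltage_v: float          # nominal supply voltage when active
    state: PowerState
    wakeup_us: float          # time needed to return to full 'on'

# Hypothetical smartphone-style domains, chosen only for illustration.
domains = [
    PowerDomain("modem",  1.1, PowerState.ON,        0.0),    # must always be able to receive a call
    PowerDomain("cpu",    0.9, PowerState.RETENTION, 50.0),   # resumes quickly from standby
    PowerDomain("camera", 1.2, PowerState.OFF,       5000.0), # off almost all the time
]

for d in domains:
    print(f"{d.name:7s} {d.voltage_v:.1f} V  {d.state.value:10s} wake-up {d.wakeup_us} us")
```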

From an architectural standpoint, the key concern has been prioritization of function and what matters most to the consumer. In a smartphone, for example, the phone must be able to receive a call at all times, data needs to be uploaded regularly but not at the expense of a phone call, and the camera can be switched off almost all of the time. In a television or computer, by contrast, almost all functions remain powered at all times, though increasingly in reduced-power states. The long delay in booting a computer from scratch or waiting for a television to warm up was considered unacceptable by consumers, so a standby mode was added, essentially giving priority to their time while reducing energy consumption.

In the future, however, more of the device will move to the off position, regardless of whether it’s a home appliance, a computer in the home or in the corporate enterprise, or a handheld device with limited battery life. Work is underway to develop intelligent devices that reside inside plugs so that once devices are fully charged they no longer draw current.

Ferroelectric memory (FeRAM) is another option. It is constructed much like DRAM, but it uses a ferroelectric layer rather than a dielectric one, and it offers lower power draw, higher speed and more write-erase cycles. So far, cost has been a deterrent, but with power now at a premium in designs, experts believe there is some hope that FeRAM could grow as part of an overall low-power design.

The more immediate solution, however, is multiple power islands, according to Bhanu Kapoor, founder of Mimasic, a low-power design services company. The problem comes when you turn those islands on and off.

“It’s not hard to imagine a situation where you go from ‘standby’ to ‘on’ and then to a large portion of the chip being ‘on,’” said Kapoor. “That can lead to voltage spikes on the device, and it gets worse as you move to many-core computing where you have a large number of processing cores.”

He noted that Nvidia is developing a 512-core graphics chip that is highly parallel, with cores divided into groups of 24—a many-core approach as differentiated from a multicore approach. That could create as many as 30 power islands, however, and he said each of those islands has to be sequenced to avoid huge power spikes. From a design standpoint, that is no simple task.
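A simplified way to see why sequencing matters: each island draws a burst of inrush current as it powers up, and switching many of them on in the same instant stacks those bursts on the shared supply, which is what produces the voltage spike Kapoor describes. The sketch below, using made-up island names and currents rather than anything from Nvidia’s design, greedily groups islands into power-up steps so the combined inrush never exceeds a budget; each step would then be enabled one settling interval after the previous one.

```python
# Sketch: stagger power-island wake-ups so the combined inrush current stays
# under a supply budget. All numbers are illustrative, not from any real chip.

def sequence_islands(inrush_ma, budget_ma):
    """Greedily pack islands into power-up steps whose total inrush
    current fits within budget_ma."""
    steps, current_step, current_load = [], [], 0.0
    for island, load in sorted(inrush_ma.items(), key=lambda kv: -kv[1]):
        if load > budget_ma:
            raise ValueError(f"{island} alone exceeds the inrush budget")
        if current_load + load > budget_ma:
            steps.append(current_step)          # close the step and start a new one
            current_step, current_load = [], 0.0
        current_step.append(island)
        current_load += load
    if current_step:
        steps.append(current_step)
    return steps

# Hypothetical inrush currents (mA) for a handful of islands.
islands = {"gpu_cluster0": 400, "gpu_cluster1": 400, "cpu": 250,
           "modem": 150, "isp": 120, "dsp": 100}

for step, group in enumerate(sequence_islands(islands, budget_ma=600)):
    print(f"step {step}: power up {group}")
```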

 


