2016 And Beyond

Why the golden age of low power has just begun.

Greek mythology and Roman history are replete with soothsayers, some of whom got it right and others wrong. Cassandra was cursed so that her predictions would not be believed, even though she foresaw the Trojan horse. A soothsayer warned Julius Caesar of his demise on the Ides of March; Caesar himself was skeptical, but he was indeed murdered before the Ides had passed. Not all predictions have to be dire, though, and good predictions have as much chance of coming true as bad ones. So here are my predictions for low power in 2016:

Mobile semiconductor design will further push the envelope of low-power design. Of course, you knew that, but what is shifting is the increasing emphasis on thermal analysis and signoff. Having multiple processor cores on an SoC is no longer a guarantee of throughput improvement because of the nasty relationship between leakage power and temperature. As the devices run faster, they heat up, and when this heat is not dissipated efficiently, the temperature rises rapidly, which in turn increases the leakage power. It doesn't stop there, because the higher leakage power raises the temperature further. This vicious cycle, if not contained, can lead to thermal runaway, forcing system designers to throttle the CPU speed or risk damage to the entire system.
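To make that feedback loop concrete, here is a minimal sketch in Python of the leakage-temperature interaction described above. The thermal resistance, leakage-doubling interval and power numbers are illustrative assumptions, not data from any real chip; the point is only that beyond some load the fixed-point iteration stops converging, which is the thermal-runaway regime.

```python
# Minimal sketch of the leakage/temperature feedback loop.
# All parameters are illustrative assumptions, not measurements.

def leakage_power_w(temp_c, leak_at_25c_w=0.5, doubling_deg_c=10.0):
    """Leakage power, assumed to roughly double every `doubling_deg_c` degrees C."""
    return leak_at_25c_w * 2 ** ((temp_c - 25.0) / doubling_deg_c)

def settle_temperature_c(dynamic_w, ambient_c=25.0, theta_ja_c_per_w=2.0,
                         max_temp_c=150.0, steps=200):
    """Iterate T = ambient + theta_ja * (dynamic + leakage(T)) to a fixed point."""
    temp_c = ambient_c
    for _ in range(steps):
        new_temp_c = ambient_c + theta_ja_c_per_w * (dynamic_w + leakage_power_w(temp_c))
        if new_temp_c > max_temp_c:
            return float("inf")              # never settles: thermal runaway
        if abs(new_temp_c - temp_c) < 1e-3:
            return new_temp_c                # stable operating point
        temp_c = new_temp_c
    return temp_c

for dyn_w in (2.0, 8.0, 20.0):
    temp_c = settle_temperature_c(dyn_w)
    verdict = "thermal runaway" if temp_c == float("inf") else f"settles near {temp_c:.0f} C"
    print(f"dynamic power {dyn_w:4.1f} W -> {verdict}")
```

In a real flow this role is played by thermal analysis and signoff tools working from the actual floorplan and switching activity, but the shape of the problem is the same: leakage feeds temperature, and temperature feeds leakage.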
Near- and sub-threshold designs are starting to be commercialized. That is, semiconductor designers are cleverly designing CMOS circuits to operate not in the saturation region of the I-V curve, but around its knee, at or below the threshold voltage. IoT applications such as sensor arrays and MCUs, which count every micro-ampere of current to extend the battery life of systems like wearable electronics, are driving this trend. Leakage power is a major concern because CMOS devices are never really off in this region of operation.
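For the quantitative intuition behind "never really off," a textbook first-order model of the subthreshold drain current (standard symbols, not taken from this article) is:

$$
I_D \;\approx\; I_0 \, e^{\frac{V_{GS}-V_{th}}{n V_T}} \left( 1 - e^{-\frac{V_{DS}}{V_T}} \right),
\qquad V_T = \frac{kT}{q} \approx 26\ \mathrm{mV\ at\ 300\ K}
$$

Below threshold the current falls off only exponentially, roughly a decade per 60-100 mV of gate voltage at room temperature, so a nominally off transistor still conducts, and that residual current grows quickly with temperature. That is why designs operating in this region count every micro-ampere.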
If you thought only leakage power mattered, think again. Innovations in semiconductor device technology at the deep nanometer nodes have made a big impact in curbing leakage power. Dynamic power, on the other hand, is very design-dependent. If your design has a lot of switching activity, and the switching components consume a lot of energy to change values, dynamic power quickly becomes the dominant component. Data-path-oriented designs such as high-end graphics and DSP applications tend to have a lot of computation. With communications and computing merging, semiconductor companies are integrating CPUs, GPUs, DSPs and other data-path blocks in the same design, making dynamic power the new bugbear.
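As a back-of-the-envelope illustration of how design-dependent this is, the sketch below uses the standard switching-power relation P_dyn ≈ α·C·V²·f. The capacitance, supply, frequency and leakage values are assumptions chosen for illustration, not figures from this article.

```python
# Dynamic vs. leakage power for a hypothetical data-path block.
# Capacitance, supply, frequency and leakage values are illustrative assumptions.

def dynamic_power_w(activity, switched_cap_f, vdd_v, freq_hz):
    """Average switching power: alpha * C * Vdd^2 * f."""
    return activity * switched_cap_f * vdd_v ** 2 * freq_hz

LEAKAGE_W = 0.3          # assumed leakage floor for the block, independent of activity
SWITCHED_CAP_F = 50e-9   # 50 nF of total switched capacitance (assumption)
VDD_V = 0.8
FREQ_HZ = 1e9

for activity in (0.05, 0.15, 0.30):   # idle-ish, typical, DSP-style heavy switching
    dyn_w = dynamic_power_w(activity, SWITCHED_CAP_F, VDD_V, FREQ_HZ)
    print(f"activity {activity:.2f}: dynamic {dyn_w:4.1f} W vs leakage {LEAKAGE_W:.1f} W")
```

Tripling the switching activity triples the dynamic component while the leakage floor stays put, which is exactly why data-path-heavy CPU/GPU/DSP integrations see dynamic power take over.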
Emerging IoT applications and the rapid expansion of the role of electronics in the automotive market are accelerating the pace of low-power innovation. Mixed-signal designs are becoming the de facto standard for low-power designs, in both their digital and analog parts. The associated trend is for mixed-signal designs to have increasing digital content and complexity. These designs are moving to 65nm and 45nm process nodes, where leakage power becomes more dominant due to lower operating voltages.
When you are focused on optimizing every microwatt of power, accurate power estimation becomes the name of the game. Doing it early is equally important, but the lack of physical information at RTL has traditionally forced a tradeoff between accuracy and the need to know early. With the advent of new EDA products that comprehend power, performance (timing), chip-topology information and thermal effects in the context of actual firmware/software system-level application scenarios, power estimation can achieve the trifecta of accuracy, fast turnaround and early warning.
Power estimation is only the beginning; design verification and optimization in the context of power is going mainstream. Even plugged-into-the-wall devices, which traditionally amped up the speed and didn't worry about power, are turning power-conscious. Designers are increasing the complexity of their power architectures, and the more complex low-power designs have tens of power domains. Functional verification is an already intractable problem, exacerbated by the presence of power domains, power gating and multiple voltages. The number of combinations of logic and power states is exploding. Emulation is gaining popularity for power verification as well as estimation.
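A quick way to see why the state space explodes: if each power domain can independently be on, power-gated or in retention, the number of power-state combinations grows exponentially with the domain count, and each combination multiplies the functional states you already had to cover. The domain and state counts below are assumptions for illustration, not figures from the article.

```python
# Why power-aware verification blows up: power-state combinations grow
# exponentially with the number of domains. Counts below are illustrative.

from itertools import islice, product

DOMAIN_STATES = ("ON", "OFF", "RETENTION")   # assumed per-domain states

for num_domains in (4, 10, 20):
    combos = len(DOMAIN_STATES) ** num_domains
    print(f"{num_domains:2d} power domains -> {combos:,} power-state combinations")

# A handful of the concrete combinations for a 3-domain design:
for combo in islice(product(DOMAIN_STATES, repeat=3), 5):
    print(combo)
```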
Both leakage and dynamic power are being targeted for reduction; it is an all-out war. Synthesis and place-and-route algorithms no longer have the luxury of optimizing area or timing first and leaving power as an afterthought, because achieving power targets can take multiple iterations with the danger of non-convergence. In 2015 we saw many significant innovations from EDA vendors keen to set this right by considering power alongside area and timing at every step of the way.
IP has seen low-power innovations, too. Every year there are new entrants to the list of embedded CPUs, GPUs and MCUs targeted specifically at low power. In 2016, we should continue to see this trend, with performance and power segmenting the product lineups of the leading vendors. New intra-chip bus fabrics optimized for power, lower-power DDR interfaces, and memories (SRAMs, ROMs and non-volatile variants) are likely to make their appearance to quench the thirst for power conservation.

The 16th-century French seer Nostradamus made many accurate prophecies; he even correctly predicted that he would die the following day. We'll see what happens in 2016, but there's one thing I can predict with absolute certainty: if you are involved with semiconductors and systems in any way, you will constantly think of ways to estimate, reduce and verify your design in the context of power.


