Low-Power Design Is More Than Just Minimizing Power

Meeting a power budget and calling it a day is no longer good enough.


Engineers are accustomed to making tradeoffs when designing products: faster but more power-hungry, or slower and lower-power; expensive and durable, or cheap and disposable; and so on. The list of tradeoffs, and the choices they force, can seem daunting. This blog discusses how designing electronic systems for power has expanded beyond merely examining power vs. performance.

Power was once viewed as a high-level metric, to be reduced as much as possible while still meeting performance specs. The idea was simply to scale power down to within the design’s allotted ‘budget’ using best practices and move on. Power consumption was a design parameter that scaled with transistor drive strength: the more power you spent, the faster your circuits could run, and vice versa. Obtaining maximum performance (e.g., CPU speed) within the power budget was the primary concern.
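As a rough illustration of that power-performance scaling (not taken from this post), the first-order CMOS relation P ≈ α·C·V²·f plus leakage shows how pushing supply voltage and clock frequency buys speed at a steep power cost. The sketch below uses entirely made-up component values:

```python
# First-order CMOS power model: P_total = alpha * C_eff * Vdd^2 * f + I_leak * Vdd.
# All component values are illustrative placeholders, not from any real design.

def dynamic_power(alpha, c_eff, vdd, freq):
    """Switching power of a block with activity factor alpha and effective capacitance c_eff."""
    return alpha * c_eff * vdd ** 2 * freq

def total_power(alpha, c_eff, vdd, freq, i_leak):
    """Dynamic switching power plus static leakage."""
    return dynamic_power(alpha, c_eff, vdd, freq) + i_leak * vdd

# The same block at two hypothetical operating points: raising voltage and
# frequency for speed costs roughly 2.5x the power.
fast = total_power(alpha=0.15, c_eff=2e-9, vdd=0.9, freq=2.0e9, i_leak=0.05)
slow = total_power(alpha=0.15, c_eff=2e-9, vdd=0.7, freq=1.2e9, i_leak=0.03)
print(f"fast operating point: {fast:.2f} W, slow operating point: {slow:.2f} W")
```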

Today, with leakage power and shrinking noise margins the norm in advanced process technologies, attention has shifted toward reliably achieving the required performance while maximizing battery life, which means managing how a specified power number is spent rather than just hitting it. In other words, meeting a power budget and calling it a day is no longer good enough. Designers want to achieve the required performance while reducing power by exploring different logic and clock gating strategies during implementation, and by delivering power efficiently, across different customized power grid design styles and power gating strategies, without inducing excessive power noise. Two designs with the same total power consumption can differ vastly in power efficiency, in how power is distributed across the power delivery network (PDN), and in component-level activity. These factors directly affect battery life, reliability in terms of susceptibility to electromigration (EM) and electrostatic discharge (ESD), and performance, with shortfalls showing up as hardware or software malfunction.
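To make the clock-gating point concrete, here is a minimal back-of-the-envelope sketch, with assumed flop counts, per-flop clock capacitance, and enable duty cycle, of how two designs with the same logic can end up with very different clock-tree power depending on how aggressively registers are gated:

```python
# Rough clock-tree power comparison for an ungated vs. a clock-gated design.
# Flop count, per-flop clock capacitance, and enable duty cycle are assumed values.

def clock_tree_power(n_flops, c_clk_per_flop, vdd, freq, gated_fraction, enable_duty):
    """Clock power when gated_fraction of the flops sit behind clock gates
    whose enables are active only enable_duty of the time."""
    always_on = (1.0 - gated_fraction) * n_flops * c_clk_per_flop * vdd ** 2 * freq
    gated = gated_fraction * enable_duty * n_flops * c_clk_per_flop * vdd ** 2 * freq
    return always_on + gated

ungated = clock_tree_power(200_000, 3e-15, 0.8, 1.0e9, gated_fraction=0.0, enable_duty=1.0)
gated = clock_tree_power(200_000, 3e-15, 0.8, 1.0e9, gated_fraction=0.8, enable_duty=0.25)
print(f"ungated clock power: {ungated * 1e3:.0f} mW, gated: {gated * 1e3:.0f} mW")
```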

Lower power hotspots, better reliability
In the past, products often were powered from an external source (the power grid) with little regard for carbon footprint or energy cost. They were designed in process technologies that allowed high noise margins, beefy ESD protection devices and shielding structures (“correct by construction”), and large transistors, wire widths, and die/board area (fewer reliability and self-heating issues). However, that doesn’t mean the problems that are exacerbated in today’s low-power devices didn’t exist back then. They just manifested in different ways. For example, overclocking a CPU was a popular way to squeeze more performance out of a desktop, but it came at the potential cost of overheating and a shorter CPU lifetime. And because end applications such as desktops and TVs were not concerned with battery life, unlike modern hand-held and IoT devices, the underlying factors were further masked.

Because we no longer have the luxury of lenient power budgets, ample noise margins, and generous layout real estate, designers trying to pack more functionality and performance into less metaphorical “leg room” need to overcome these challenges through accurate simulation of real-world scenarios. At the chip level, finding voltage drop, EM, and timing path hotspots through full-chip dynamic simulations that incorporate detailed package effects is essential. FinFET-based process technologies in particular also require a thorough thermal-aware assessment of the chip, package, and system.
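As a simplified illustration of what such analyses quantify (real sign-off relies on full-chip dynamic simulation with package models; the resistances, currents, and wire dimensions below are invented), a static IR-drop and current-density estimate for a single power-grid branch might look like this:

```python
# Back-of-the-envelope static IR drop and EM current density for one PDN branch.
# Every number here is an assumption chosen for illustration only.

def ir_drop(i_block, r_bump, r_grid):
    """Static voltage drop from a flip-chip bump through the grid to the block."""
    return i_block * (r_bump + r_grid)

def current_density_a_per_cm2(i_branch, width_um, thickness_um):
    """Current density in a grid wire, to compare against a foundry EM limit."""
    area_cm2 = (width_um * 1e-4) * (thickness_um * 1e-4)
    return i_branch / area_cm2

vdd = 0.75
drop = ir_drop(i_block=0.4, r_bump=0.02, r_grid=0.15)   # amps, ohms
j = current_density_a_per_cm2(i_branch=0.002, width_um=0.4, thickness_um=0.18)
print(f"IR drop: {drop * 1e3:.0f} mV ({100 * drop / vdd:.1f}% of Vdd)")
print(f"branch current density: {j:.2e} A/cm^2")
```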

Fig. 1: Key Challenges in Mobile/HPC Platforms

Better analysis, better power efficiency
Performance is no longer the sole consideration at the forefront of design; the tide has shifted toward power efficiency. Fortunately, given the spotlight power analysis has received over the past decade, software tools have evolved not only to let us dissect and analyze exactly how a chip or system consumes power, for better optimization and reduction, but also to quantify the implications power consumption has for other aspects of the design such as reliability/aging, functionality, and failure. Designers might ask: How effective is the clock gating scheme at reducing power? What implications do early block placement decisions have for how well power gets distributed from the flip-chip bumps? Is asymmetrical heating of the power grid degrading performance in some region of the multi-core processor? These are the types of questions that address the more relevant concerns in designing for power today.

Today, each design has its own set of priorities, leading to a corresponding set of tradeoffs. While mitigating the effects of aging on performance specs might be critical for medical or automotive applications, simply ensuring minimal functionality over the lifetime of two AA batteries might be the main concern for a TV remote control. With power efficiency a key requirement, whether a design targets high performance or long battery life, designing for power is clearly about more than just minimizing power.
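For the TV-remote example, the battery-life side of that tradeoff is easy to sketch. The cell capacity, active current, duty cycle, and sleep current below are hypothetical numbers chosen only to illustrate the arithmetic:

```python
# Average-current battery-life estimate for a heavily duty-cycled device such
# as a TV remote. All inputs are assumed, not measured.

def battery_life_years(capacity_mah, active_ma, active_duty, sleep_ua):
    """Lifetime in years given the average current drawn from the cells."""
    avg_ma = active_ma * active_duty + (sleep_ua / 1000.0) * (1.0 - active_duty)
    hours = capacity_mah / avg_ma
    return hours / (24 * 365)

# Two AA cells in series share the same ~2500 mAh capacity.
life = battery_life_years(capacity_mah=2500, active_ma=15.0, active_duty=0.001, sleep_ua=2.0)
print(f"estimated battery life: {life:.1f} years")
```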

Fig. 2: Solutions for Mobile/HPC Applications


