Integrating Energy Efficiency Considerations Into Your Design From The Beginning

Optimizing the performance per watt of HPC SoCs starts when defining the architecture.


Data center networking alone is responsible for consuming about 1% of the global electricity supply. As AI is integrated into sector after sector, the demands that neural networks and large language models place on hardware and software infrastructure are expected to grow significantly.

The burgeoning energy consumption of hyperscale data centers has become an urgent issue. The challenge lies in developing power-efficient systems-on-chip (SoCs) for high-performance computing (HPC) applications without compromising their performance targets.

This article will emphasize the importance of adopting a shift-left approach, focusing on integrating energy efficiency considerations into your design from the beginning. Continue reading to discover various tools and techniques aimed at achieving low-power designs.

Optimizing performance per watt: The balance challenge

With the mainstreaming of cloud computing and AI models, real-time processing and analysis of data has become integral to more applications than we ever thought possible. Even vehicles, with their advanced driver assistance systems (ADAS), rely on these capabilities to ensure that safety-critical functions like blind-spot detection and self-braking work as and when intended. There’s no getting around the fact that larger volumes of data, not to mention larger AI models, require greater compute power.

Traditionally, HPC applications have been all about performance. But with concerns mounting over events like grid blackouts, attention is shifting to performance per watt, and interest in improving energy efficiency is growing across the board. In some cases, performance is limited by power or energy consumption: a system can't run at its target speed because it would draw too much power. If the design's energy consumption can be reduced, there's an opportunity to run the system faster while staying under its power limit.
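To make the power-limited scenario concrete, here is a minimal Python sketch. It uses the first-order dynamic power model P ≈ C_eff · V² · f; all names and constants (the 5 W cap, 0.8 V supply, capacitance values) are illustrative assumptions, not figures from any real design.

```python
# Sketch: if an optimization lowers effective switched capacitance,
# the same power cap permits a higher clock frequency.

def max_freq_under_cap(power_cap_w, c_eff_f, v_volts):
    """Highest frequency (Hz) whose dynamic power stays under the cap."""
    return power_cap_w / (c_eff_f * v_volts ** 2)

baseline_c = 2.0e-9   # 2 nF effective switched capacitance (assumed)
optimized_c = 1.6e-9  # 20% reduction after an energy optimization
cap, vdd = 5.0, 0.8   # 5 W power cap, 0.8 V supply (assumed)

f_base = max_freq_under_cap(cap, baseline_c, vdd)
f_opt = max_freq_under_cap(cap, optimized_c, vdd)
print(f"baseline: {f_base/1e9:.2f} GHz, optimized: {f_opt/1e9:.2f} GHz")
```

Here a 20% reduction in switched capacitance buys a proportional 25% frequency headroom under the same cap, which is the "run faster to approach the limit" opportunity described above.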

The big issue then becomes clear: you can’t really address energy efficiency at the tail end of the design process because, by then, the architecture has been defined and many design decisions have already been made. Each of these decisions along the way has an impact on power. In the physical implementation flow, it’s possible to squeeze some power consumption out of the design, but this doesn’t solve the bigger problem.

What’s needed is a whole shift-left mentality, where the design team starts by defining what the power-efficient architecture should be like. What type of IP is needed: digital signal processing (DSP) cores or hardware accelerators for specific functions? How fast should the system run? Can parts of the design be shut down when appropriate? Can clock frequency be slowed to throttle power? How should the memory subsystem be architected? What kind of process technology should the chip be designed on? These are just a few examples of key questions related to a design’s power consumption.
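As a back-of-the-envelope illustration of the "shut down when appropriate" question, this Python sketch estimates the average power of a block that is power-gated when idle. The wattages and duty cycle are made-up values for illustration only.

```python
def avg_power(p_active_w, p_sleep_w, active_fraction):
    """Average power for a block that is power-gated when idle."""
    return active_fraction * p_active_w + (1 - active_fraction) * p_sleep_w

# Assumed numbers: 2 W active, 50 mW retention, active 30% of the time.
always_on = avg_power(2.0, 2.0, 1.0)
gated = avg_power(2.0, 0.05, 0.3)
print(f"always on: {always_on:.2f} W, gated: {gated:.3f} W")
```

Even this crude model shows why the architecture-level question matters: the achievable savings depend on the workload's duty cycle, which is fixed long before physical implementation.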

Many designers are now evaluating energy consumption in the context of an actual application’s workload, which is a wise approach. Analyzing the power profile can provide clues to reduce power, whether from different micro-architectural modifications, software/hardware optimization, or other methods. Fortunately, there are many tools available for this. Consider the example of AI startup SiMa.ai, which has developed a purpose-built, software-first platform that scales machine learning (ML) at the embedded edge. During SNUG Silicon Valley 2023, SiMa.ai highlighted how it used emulation-driven power profiling to optimize its design’s hardware architecture, software, and compiler to achieve a 2.5x performance/watt improvement.
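A workload power profile is essentially a time series of power samples. The following Python sketch, using entirely made-up per-millisecond samples, shows the kind of first-pass analysis that points at where to look for savings:

```python
# Hypothetical per-millisecond power samples from an emulation run (watts).
samples = [1.2, 1.3, 4.8, 5.1, 4.9, 1.1, 1.2, 5.0, 4.7, 1.3]

avg = sum(samples) / len(samples)
peak = max(samples)
# Flag phases well above the average: candidates for micro-architectural
# or software/hardware co-optimization.
hot_phases = [(i, p) for i, p in enumerate(samples) if p > 1.5 * avg]

print(f"avg {avg:.2f} W, peak {peak:.2f} W, "
      f"hot phases at ms {[i for i, _ in hot_phases]}")
```

Real emulation-driven profiling correlates such hot phases back to the software and RTL activity that caused them; this toy version only identifies when they occur.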

In AI/ML designs and those with heavy data processing, glitch power—wasted power due to unnecessary transitions or redundant activity—can be as much as 25% of a design’s total power consumption. RTL-to-gate glitch power analysis and optimization solutions can help identify the sources of glitch power and enable designers to understand how much glitch power these sources are contributing. While AI applications are creating great demand for power, AI-driven electronic design automation (EDA) solutions can help optimize for power as well as performance and area. Down the road, perhaps AI can be applied to create more power-efficient RTL code, or to help define/refine the design’s architecture.
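Glitch power comes from transitions beyond the one functional transition per cycle, as signals ripple through combinational logic before settling. A toy Python sketch (with a hypothetical single-net waveform) makes the accounting concrete:

```python
def transitions(values):
    """Number of logic transitions in a sequence of values."""
    return sum(a != b for a, b in zip(values, values[1:]))

# Per-cycle traces of one net as logic ripples toward its settled value.
cycle_traces = [
    [0, 1, 0, 1],  # settles at 1 after two spurious toggles
    [1, 1, 1, 1],  # clean cycle, no activity
    [1, 0, 1, 0],  # settles at 0 after two spurious toggles
]

actual = sum(transitions(t) for t in cycle_traces)
functional = transitions([t[-1] for t in cycle_traces])  # settled values only
glitch_fraction = (actual - functional) / actual
print(f"glitch transitions: {actual - functional} of {actual} "
      f"({glitch_fraction:.0%} of switching activity)")
```

The glitch fraction in this contrived trace is deliberately extreme; the point is that glitch analysis compares total switching activity against the functionally necessary activity, net by net.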

The blueprint for energy-efficient chip architecture

Designers have traditionally scaled to more advanced process technologies to improve power efficiency. But now that Moore's law is waning, attention is turning to new materials. Photonic ICs, which utilize the properties of light, have demonstrated the ability to increase bandwidth and speed while reducing power and latency. For AI chatbots and other HPC applications, photonic ICs can potentially provide a path forward. Exploration into alternative semiconductor materials such as gallium nitride and silicon carbide could yield some options as well.

Collectively, every seemingly small decision can have a profound impact on a chip’s overall power consumption. There’s plenty of room for R&D, from exploration of the materials used and design techniques to enhancements in the design and verification tools themselves. Considering energy efficiency at the very beginning of the design process is a good way to start. To help you on the path to more energy-efficient SoCs, Synopsys provides an end-to-end solution for low-power design, encompassing design, verification, and IP.

As electricity demand continues skyrocketing across the globe, we’ll need to rely on the engineering ingenuity that has brought us ChatGPT, self-driving cars, and industrial robotics to come up with more ways to decrease the power consumption of chips. Energy-efficient SoCs are becoming increasingly critical in our modern world.
