Taking Energy Into Account

In the IC design flow, energy is just as important to consider as power.


Considering power throughout the SoC design flow is common practice. The same cannot be said for energy, although that is beginning to change as chips increasingly incorporate heterogeneous processing elements.

Combined with this, AI/ML/DL technologies increasingly allow engineering teams to explore and optimize design data for more targeted and efficient systems. But this approach also requires something of a mindset change, and it will take time for teams to embrace energy in addition to power.

“Whenever you measure something, and then optimize it, that’s going to change what your end product is,” said Rob Knoth, a product management director in Cadence’s Digital & Signoff Group. “We saw this at the beginning of the CPU megahertz/gigahertz race. People were looking at timing-driven place-and-route, and rather than just optimizing wirelength for placement they started to actually have SDCs and static timing, and a whole world and an industry blossomed because of that. As a result, we gained much faster and more functional microprocessors. Then things started to shift and mobile started to take off, and all of a sudden faster wasn’t always better. The idea of power came in, we started to look at how to measure power, how to optimize for power, etc., and we’re essentially in the middle of that now where power is becoming much more pervasive.”

That’s about to change. “Energy is the next big wave, where all of us as a community are going to go, because energy is actually much more tightly coupled to battery life than power is,” Knoth said. “Power is an interesting, instantaneous metric, but energy is actually getting something done. It’s work. It’s how many times you refresh that webpage. How many times are you refactoring that algorithm? How many times are you processing a CT scan? It’s actual work that matters. That is happening in the device, and that work can be done in many different ways over different timescales, and different instantaneous energies. Suddenly, this is a very interesting variable. ‘If I do that same job, but I do it half as fast and burn one-third the power, I’m winning.’ That’s an obvious thing to say, but it’s incredibly difficult when you start thinking about how to effectively measure and optimize for that.”
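
The tradeoff Knoth describes comes straight from the relationship energy = power × time. A minimal sketch of that arithmetic, using purely illustrative numbers rather than figures from any real design:

```python
# Energy = average power x time. Numbers are illustrative only.
def energy_joules(avg_power_w: float, runtime_s: float) -> float:
    """Energy consumed by a task running at a given average power."""
    return avg_power_w * runtime_s

# Baseline: the job finishes in 1 s at an average of 3 W.
baseline = energy_joules(avg_power_w=3.0, runtime_s=1.0)      # 3.0 J

# Alternative: half as fast (2 s) at one-third the power (1 W).
alternative = energy_joules(avg_power_w=1.0, runtime_s=2.0)   # 2.0 J

print(f"baseline: {baseline:.1f} J, alternative: {alternative:.1f} J")
# The slower, lower-power operating point does the same work for a third less energy.
```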

Energy in SoCs
For years, the trend has been to refer to silicon designs as systems-on-a-chip (SoCs), particularly those that target custom applications for end-product differentiation. However, this term neglects the fact that SoCs are often integrated into an embedded system, complete with embedded firmware, said Chris Giles, product marketing manager at Mentor, a Siemens Business. “The SoC cannot be developed and optimized successfully alone and still ensure a successful end-user experience.”

From a big-picture standpoint, the challenge in bridging the SoC-to-end-user gap lies in the interaction between the SoC and the embedded firmware that manages much of the operation.

“System architecture will move many functions of the system into firmware running on the hardware, while saving hardware-native operations for the most challenging power/performance/area constraints,” Giles said. “Without a knowledge of the workflow that the firmware will impose on the SoC, designers are left to optimize power consumption based on assumptions and worst-case analyses.”

Tools exist to take activity data, based on simulations, and provide power estimates for that activity to help guide design teams in fine-tuning their designs from RTL to mechanicals to optimize power consumption, he said. Running enough of these sample workloads allows the team to begin optimizing the system’s energy use in line with the end user’s intended workloads.
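
One way to picture that workflow is to roll per-window power estimates, derived from simulated activity, up into per-workload energy. The sketch below is a hypothetical illustration; the data structures and numbers are placeholders, not the output format of any particular power-estimation tool:

```python
# Hypothetical sketch: per-workload energy from activity-based power estimates.
# Each workload is a list of (window_duration_s, estimated_avg_power_w) pairs,
# as a power-estimation tool might report per simulated activity window.
from typing import Dict, List, Tuple

Workload = List[Tuple[float, float]]  # (duration_s, avg_power_w) per window

def workload_energy(windows: Workload) -> float:
    """Sum energy (J) over all activity windows of one workload."""
    return sum(duration * power for duration, power in windows)

# Illustrative workloads with made-up numbers.
workloads: Dict[str, Workload] = {
    "os_boot":      [(0.050, 1.8), (0.120, 0.9)],
    "video_stream": [(0.200, 1.2), (0.200, 1.1)],
}

for name, windows in workloads.items():
    print(f"{name}: {workload_energy(windows) * 1e3:.1f} mJ")
```

Comparing such totals across the intended workloads is what lets a team tune the design for energy rather than for a single worst-case power number.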

Giles cautioned, however, that this optimization is only as good as the quality of the inputs to the process. “Firmware and hardware are often developed separately, meaning that chip architecture and RTL are often cast without real firmware, and chip verification teams must create, at best, representative stimuli to replicate workloads. In other words, without real firmware-based activity based on real workloads, the team still may end up missing the mark by focusing on synthetic scenarios. For this reason, the system most tuned for optimal energy consumption for the end-user will have been architected and optimized with hardware and firmware developed in parallel, focused on real customer-driven workloads.”

Others agree. “Conventional design methodologies tend to lack coverage for real application scenarios such as streaming high-definition multimedia or OS boot-up,” said Annapoorna Krishnaswamy, product marketing manager in the Semiconductor Business Unit at ANSYS. “These scenarios span tens to hundreds of milliseconds in duration and are impractical to process. Critical power conditions due to real application activity are thus typically uncovered very late in the design flow, or even when the chip is in the field, putting the design and schedule at risk.”

Krishnaswamy noted that recent breakthroughs enable early RTL power profiling that runs several orders of magnitude faster than traditional interval-based power analysis methodologies. By profiling power at the RTL stage for real application-level stimuli, designers can identify power- and thermal-critical windows early in the design flow. RTL power profiling can process very long activity files — hundreds of gigabytes in size, constituting hundreds of milliseconds of activity — in a few hours, as opposed to days or even weeks with standard approaches. That makes analysis of such large activity sets possible.
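
The core idea of profiling over a long activity trace can be illustrated with a simple sliding-window scan for the worst sustained power. This is a simplified stand-in for the concept, not a description of any vendor’s algorithm:

```python
# Simplified sketch of window-based power profiling over a long activity trace.
# `power_trace` holds average power per fixed interval, as a power profiler
# might derive from an RTL activity file.
from typing import List, Tuple

def worst_window(power_trace: List[float], window_len: int) -> Tuple[int, float]:
    """Return (start_index, avg_power) of the highest-average sliding window."""
    window_sum = sum(power_trace[:window_len])
    best_start, best_sum = 0, window_sum
    for i in range(window_len, len(power_trace)):
        window_sum += power_trace[i] - power_trace[i - window_len]
        if window_sum > best_sum:
            best_start, best_sum = i - window_len + 1, window_sum
    return best_start, best_sum / window_len

# Illustrative trace with a burst of activity around intervals 4..7.
trace = [0.4, 0.5, 0.4, 0.6, 1.3, 1.4, 1.5, 1.2, 0.5, 0.4]
start, avg = worst_window(trace, window_len=4)
print(f"power-critical window starts at interval {start}, avg {avg:.2f} W")
```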

“This data then can be used for key design decisions, such as the physical implementation of the SoC, software stack optimization, and cooling requirements for the entire system for various workloads,” she said. “Early thermal profiling also can maximize coverage for system-level design by enabling the simulation of various chip thermal models that capture different operating scenarios in a variety of sequences, thus reducing the need for design margins and costly design iterations.”

All that said, one approach to dealing with energy in the design flow is modeling, Cadence’s Knoth said. “When you start thinking about the work that a system is doing, there’s a functional nature that static and vectorless methods can’t capture. So there’s a synergy between standard digital implementation and sign-off tools with functional verification, because you have to look at workloads. This will more naturally lend itself to system-level optimization, where you now have to make a model for these smaller pieces, because they’re all working together to get this job done. If you analyze it at too granular of a level, then energy doesn’t really make a whole lot of sense if you’re just looking at one inverter. But if you’re looking at it in terms of a higher level macro — it’s not even for an adder or a multiplier, it has to be something more like a filter, or CPU or GPU or network processor — all of a sudden at that level you’re going to get a better notion for if you run the system clock at this rate, if you have this much parallelism, if you use this much pipeline, then you’re going to start to be able to understand and effectively manipulate the system with respect to energy.”

Here, modeling plays a big role because the engineering team has to decide at what level to model, and how to translate the physical properties of the circuit into the behavior the model exhibits. That is a challenge.
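
As a thought experiment, a macro-level energy model of the kind Knoth alludes to might expose knobs such as clock frequency and parallelism and report energy for a fixed amount of work. The sketch below is hypothetical, with invented constants, and ignores effects such as voltage scaling that a real model would capture:

```python
# Hypothetical macro-level energy model for a fixed job (illustrative constants only).
def job_energy(work_ops: float, freq_hz: float, lanes: int,
               e_dyn_per_op_j: float = 2e-12, p_static_w: float = 0.05) -> float:
    """Dynamic energy scales with the number of ops; static energy with runtime."""
    runtime_s = work_ops / (freq_hz * lanes)   # time to finish the job
    dynamic_j = work_ops * e_dyn_per_op_j      # energy spent doing the ops
    static_j = p_static_w * runtime_s          # leakage paid for as long as the block runs
    return dynamic_j + static_j

# Same job, two operating points: fast and serial vs. slower but more parallel.
fast_serial = job_energy(work_ops=1e9, freq_hz=2e9, lanes=1)
slow_parallel = job_energy(work_ops=1e9, freq_hz=1e9, lanes=4)
print(f"fast/serial: {fast_serial * 1e3:.1f} mJ, slow/parallel: {slow_parallel * 1e3:.1f} mJ")
```

Even this crude model makes it possible to compare operating points for the same job, which is exactly the kind of question that makes no sense for a single inverter.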

In addition to modeling, another approach is a tight coupling between simulation and emulation and the analysis and the optimization world, Knoth explained. “They’re very different camps right now. They’re different engineers. They’re different departments in many EDA companies, they’re completely separate business units. Getting these camps to really start talking together, and start building systems that will more nimbly work together is key.”

The first step is working with customers to better understand the role of, for example, simulation and emulation with respect to RTL power analysis or power efficiency analysis, in order to enable the handling of energy and to get larger, more meaningful functional stimulus into analysis and optimization algorithms, he said.

Energy is also important to consider in analog design, said Benjamin Prautsch, group manager of advanced mixed-signal automation at Fraunhofer EAS. “Techniques to consider in analog design include sub-threshold or near-threshold operating points, switched-circuit design techniques, low-VT devices and low supply voltages, FD-SOI with static or dynamic backgate biasing, intelligent power-off approaches at the component and system level, and model-based exploration and optimization at the system level.”
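
One reason several of these techniques, such as low supply voltages and near-threshold operation, save energy is that first-order dynamic switching energy scales with C·Vdd². A toy illustration of that scaling, with made-up capacitance and supply values:

```python
# Toy illustration of first-order CMOS dynamic switching energy per cycle:
# E_dyn ~ alpha * C * Vdd^2, where alpha is the activity factor.
def dynamic_energy_per_cycle(c_switched_f: float, vdd_v: float, alpha: float = 0.1) -> float:
    """Dynamic energy (J) per cycle for switched capacitance C at supply Vdd."""
    return alpha * c_switched_f * vdd_v ** 2

c_total = 1e-9  # 1 nF of switchable capacitance (made-up value)
for vdd in (0.9, 0.6, 0.45):  # nominal, reduced, and near-threshold supplies (illustrative)
    e = dynamic_energy_per_cycle(c_total, vdd)
    print(f"Vdd = {vdd:.2f} V -> {e * 1e12:.1f} pJ per cycle")
```

The quadratic dependence on supply voltage is why supply scaling remains one of the most effective energy levers, even before leakage-oriented techniques such as backgate biasing are considered.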

Looking ahead
Engineering groups within different systems developers and semiconductor companies are at different stages on this path.

In some markets, optimizing for energy matters less, but in others it is essential.

“Take mobile devices versus plugged-in CPUs,” said Knoth. “The plugged-in CPUs didn’t really care at first. Eventually they did, but mobile devices cared immediately. For IoT and mobile, energy is obviously going to be the very first thing that they’re going to look at because battery life is so critical. Embedded medical is another good example where battery life is absolutely critical. As such, there are markets, areas, and customers that are really trying to squeeze the joules out of their product as much as possible, and they definitely have lists and demands. There are certain companies that are a bit more sophisticated with how they deal with this, but it’s not unique to only those markets. There are other customer spaces where there are some very sophisticated users who are just recognizing that if you’ve got two products being compared against each other, and one of them is more energy-efficient, it’s going to drive a purchasing decision. So it’s not restricted just to IoT or mobile; this is something that can differentiate a product in any segment. Definitely there are customers out there that have some very good CAD flows. This is that next frontier where they see improvement.”

Conclusion
The focus for many design teams is still on power, but awareness of energy is growing. This is particularly evident at the architecture level, and not necessarily at the level of the people implementing the architecture or in the tools used to design those devices.

“Now that it’s pushing down into the level of people who interact more with EDA, who interact more with IP, this is where there’s a broader awareness of it,” Knoth said. “It’s definitely growing.”


