
Batteries Take Center Stage

No matter how efficient a system design, it only works when there is enough battery power.


For any mobile electronic device, the biggest limiting factors are the size, age, type, and utilization of the batteries.

Battery technology is improving on multiple fronts. The batteries themselves are becoming more efficient. They are storing more energy per unit of volume, and work is underway to provide faster charging, to increase the percentage of that energy that can be used, and to raise the density of the batteries themselves. But that still may not be enough, given the increasing number of features being added to devices, as well as the growing number of devices that require batteries as their main power source, many of which have always-on features constantly draining those batteries.

“Because of the pervasiveness of mobile computing and mobile electronics, the criticality of it has just been raised tremendously,” said Rob Knoth, product management group director, Digital & Signoff Group at Cadence. “If you think about some of the early electronic system designs, the worst thing that would happen when your battery ran out was that you had to change the AAs in your Walkman. Today, it’s so much bigger than that. Think about your car not being able to go down the road, or your navigation system going dark so you don’t know where you are or how to get back. Or think about electronics placed in remote monitoring stations, in areas so remote you’d need an expedition just to change the battery. So there are more applications, and they are much more important today.”

All of this is closely intertwined. So while batteries are improving, the chips are becoming more efficient, too, with much more complex partitioning and prioritization of various functions to limit how much energy they require and waste.

“In terms of technical challenges for next-generation systems, most of the battery-powered design issues have been the same for a long time — consumers want smaller, more intelligent solutions with longer battery life,” said Fionn Sheerin, principal engineer for the Analog Power and Interface Division at Microchip. “To meet those requirements, end products need to include power path management to switch between charging and discharging the batteries. They need to get the most efficiency possible from the power in the battery. They need to keep track of the available power in the battery. And they need to regulate the output of that battery to an acceptable supply for whatever is connected to the battery output.”
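
The power path management Sheerin describes can be sketched as a simple selection policy. This is a minimal illustration in Python rather than real firmware; the voltage thresholds, defaults, and state names below are assumptions for the example, not values from any Microchip part.

```python
from enum import Enum, auto

class PowerPath(Enum):
    CHARGING = auto()      # external source present, battery not yet full
    DISCHARGING = auto()   # no usable input, run the load from the battery
    REGULATING = auto()    # input powers the load directly, battery stays full

def select_power_path(vin: float, vbatt: float,
                      vin_min: float = 4.5, vbatt_full: float = 4.2) -> PowerPath:
    """Pick a power path from the input and battery voltages.

    The thresholds are illustrative defaults, not from any datasheet.
    """
    if vin < vin_min:
        return PowerPath.DISCHARGING
    if vbatt < vbatt_full:
        return PowerPath.CHARGING
    return PowerPath.REGULATING

select_power_path(5.0, 3.7)   # PowerPath.CHARGING: input present, battery low
select_power_path(0.0, 3.7)   # PowerPath.DISCHARGING: no input, run from battery
```

A real power-path controller would add hysteresis around each threshold and handle fault conditions, but the core decision is this three-way split.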

Battery technology isn’t standing still, either. “In recent years, there has been additional interest in exotic charge profiles for novel battery chemistries, cell balancing for increasingly large battery banks, and possibly recycling of used battery cells (with diminished capacities) into new applications,” Sheerin said. “In addition, there is also still a portable consumer application space looking for the smallest possible pocket-size solutions. These needs translate into a variety of silicon requirements — increasing pressure to do more computation with less power, to monitor power supply telemetry (voltage and current input or output) much more closely, to create configurable battery charging systems, and to do it all in less space, with less weight, and more reliability.”

That’s a tall order, considering the amount of energy batteries can store has increased by only an estimated 5% per year.

“Today for designers, every picowatt matters, every picojoule matters tremendously,” Knoth said. And earlier design action is key to getting a bigger payout in energy efficiency or savings. “Whether it’s from a software perspective or a hardware perspective, getting information earlier and having it be more accurate means bigger gains overall. If you wait until late in the design cycle to get that data, which has been the traditional approach, your ability to make a big change, a big impact, is very limited. But the more that we can predict, and the more that we can give you insight into what that end energy efficiency is going to look like, the more you can make an architectural-level change, which is where you get the 25% gain as opposed to the 5% gain.”

The economics of battery technology are changing, as well. According to the U.S. Department of Energy, lithium-ion battery pack prices for electric vehicles declined 87% between 2008 and 2021. It now costs an estimated $157 per kilowatt hour, versus $1,237/kWh in 2008.

Fig. 1: Declining EV battery pack prices. Source: U.S. Dept. of Energy

This cost decrease is largely a function of steadily rising demand over time, and the rush to satisfy that demand with more predictable supplies, increased manufacturing capacity, and mature processes — all of which are essential for achieving economies of scale.

Demand for better batteries is closely tied to the amount of computing that has shifted to the edge of the network. It takes too much bandwidth and energy to send everything back and forth to the cloud, and it takes too long to get results. The solution is to move processing closer to the source, which in turn increases the amount of processing that needs to be powered with batteries.

More energy efficiency
Any device connected to a battery needs to maximize battery life. Because there is only so much extra energy that can be drawn out of a battery, the next step is to maximize the efficiency of the electronics themselves. That challenge is the same for a mobile phone, an IoT device, or a notebook computer, and it begins with quantifying how much energy a particular feature or function uses.

“There is definitely a need today for the design to measure the energy consumption of your chip in real time,” said Pierre-Xavier Thomas, technical strategic marketing group director for Cadence’s IP Group. “You need to understand what are the different power consumption states so that you can extend the user experience.”
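
As a rough illustration of that kind of accounting, the sketch below integrates assumed per-state power draws over a usage trace. The state names and milliwatt figures are invented for the example; a real chip would report these through on-die telemetry.

```python
# Illustrative per-state energy accounting. The state names and power
# draws are hypothetical, not measured from any real chip.
STATE_POWER_MW = {"active": 450.0, "idle": 120.0, "sleep": 2.5}

def energy_mj(trace):
    """trace: list of (state, seconds) pairs; returns total energy in mJ."""
    return sum(STATE_POWER_MW[state] * seconds for state, seconds in trace)

# Ten minutes mostly asleep, with short active and idle bursts:
energy_mj([("active", 2.0), ("idle", 10.0), ("sleep", 600.0)])  # 3600.0 mJ
```

Even this toy model makes the point behind the quote: knowing which power states a workload occupies, and for how long, is what lets software trade features against remaining battery life.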

That can help determine which applications should or should not be run at any point in time, based on how much charge is left in the battery. This kind of granularity is critical for extending battery cycles, and the more that design teams understand the impact of various design decisions on energy efficiency and power consumption, the longer the battery will last.

Qazi Faheem Ahmed, principal product manager at Siemens EDA, noted this should begin at the RTL level. “That’s where the most bang for the buck is. That includes more energy-efficient micro-architectures, playing around with different kinds of frequencies, having multiple clock domains, and running some of the logic at lower or higher frequencies for the same performance point.”
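
The frequency and voltage trade-off Ahmed mentions can be illustrated with the usual first-order model, in which dynamic energy per cycle scales with the square of the supply voltage. The effective capacitance and cycle count below are made-up numbers, chosen only to show the ratio.

```python
# Sketch of why a lower voltage/frequency operating point saves energy for
# the same work: dynamic energy scales roughly as C * V^2 per cycle.
def dynamic_energy_j(c_eff_nf: float, vdd_v: float, cycles: int) -> float:
    """First-order dynamic energy: effective capacitance (nF) * Vdd^2 * cycles."""
    return c_eff_nf * 1e-9 * vdd_v ** 2 * cycles

fast = dynamic_energy_j(1.0, 1.0, 1_000_000)  # run at 1.0 V
slow = dynamic_energy_j(1.0, 0.8, 1_000_000)  # same cycle count at 0.8 V
# slow / fast == 0.64: roughly 36% less dynamic energy for the same work,
# provided the lower frequency still meets the performance point.
```

This is why moving blocks onto slower clock domains when they can tolerate it pays off so disproportionately: the voltage reduction that a lower frequency permits enters the energy equation quadratically.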

But all of that is just the starting point. If a product warrants the investment, energy consumption can be reduced significantly throughout the design process.

“Let’s say we want to be 2% or 5% more energy efficient,” said Ahmed. “This might improve power efficiency at the IP level or at the block level. But to really achieve energy efficiency at the design level or the system level, how do you do that? One way is to compute the energy and see whether the power and energy match what was intended. And potentially, at larger scales with complete SoCs, you can do similar things as at the IP level.”

Innovation all around
Much of this is being driven by new applications, such as AR/VR glasses and automotive electronics, where battery life is critical and where inefficiency often results in excess heat.

“Power modules provide a power conversion with an integrated magnetic component, which can perform a point-of-load power conversion for a specific load, allowing more efficient higher voltage power transmission within an application,” Sheerin said. “Integrated boost converters are a common addition to many IoT applications running from lithium-ion coin cells, as they can take the battery output voltage (which will vary from 3.2 to 2.6V depending on the battery charge level) and force it to a standard, constant voltage to reliably run the application (possibly 3.3V or 5V). In addition, power monitoring devices and high accuracy ADCs can provide precise battery fuel gauging. For higher-power systems, silicon carbide (SiC) power transistors and diodes can help maintain power efficiency when charging or discharging large battery banks.”
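
The “precise battery fuel gauging” Sheerin mentions is commonly implemented by coulomb counting, i.e., integrating current over time. The sketch below is a toy model, not a driver for any real fuel-gauge IC; the cell capacity and sampled currents are invented values.

```python
class CoulombCounter:
    """Minimal coulomb-counting fuel gauge (a sketch, not a device driver)."""

    def __init__(self, capacity_mah: float, soc_pct: float = 100.0):
        self.capacity_mah = capacity_mah
        self.remaining_mah = capacity_mah * soc_pct / 100.0

    def sample(self, current_ma: float, dt_s: float) -> None:
        # Positive current = discharge. Convert mA * s to mAh and clamp.
        self.remaining_mah -= current_ma * dt_s / 3600.0
        self.remaining_mah = min(max(self.remaining_mah, 0.0), self.capacity_mah)

    @property
    def soc_pct(self) -> float:
        """State of charge as a percentage of capacity."""
        return 100.0 * self.remaining_mah / self.capacity_mah

gauge = CoulombCounter(capacity_mah=225.0)   # hypothetical small cell
gauge.sample(current_ma=10.0, dt_s=3600.0)   # draw 10 mA for one hour
# state of charge drops by 10/225, to roughly 95.6%
```

Production gauges layer corrections on top of this — temperature, cell aging, and voltage-based recalibration — because a raw integrator accumulates measurement error over time.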

Moving up the stack, co-designing software and hardware can significantly improve battery life, as well as performance. Apple’s M1 processor, which is based on Arm processor cores, is a case in point. Notebook computers containing the new chip, which was developed specifically for macOS, more than doubled battery life compared with previous models.

“This link from software into hardware is one of the biggest opportunities we see because we live in a world where software is going to be where a lot of the innovations happen in terms of new capabilities, innovation and so forth,” said Piyush Sancheti, senior director of marketing, Synopsys Design Group. “Power is one of the most interesting domains from a chip design or SoC design perspective, because it touches all aspects of your design flow, and it has the ability to extend out into software.”

Godwin Maben, Synopsys fellow and low power architect, agreed, pointing to a balance between shutting circuits down and rapidly powering them up. Both can impact battery life, something that is critical in devices such as smart phones, where the battery size is fixed.

“I need to turn off every aspect of my chip to conserve power,” said Maben. “But the problem is if everything is shut down and somebody calls you on the phone, it needs to wake up fast. The wakeup time is critical. If it takes longer to wake up to my call, I’ll miss it. So designers have to manage shutdown time and wakeup time. The key thing is power integrity, which looks at how good my power delivery network (PDN) is. There are also new technologies they are using. For example, in the latest phones, people do power state verification, which means classifying a system’s power state. Historically, a state could be on or off, which is what has been done for ages. Now, they want to do partially on, partially off. They want to introduce four-state logic. What this means is that, depending upon the wakeup time requirement, I can go to partially on, because then the phone can wake up faster. All the tools are now moving toward, ‘How do I support state verification?’ because of the way we are trying to reduce the idle time.”
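
The four-state policy Maben describes amounts to picking the deepest power state whose wakeup latency still meets the deadline. Here is a toy version of that selection; the state names, power draws, and latencies are invented for illustration.

```python
# Sketch of a four-state power policy trading idle power against wakeup time.
# All numbers are illustrative, not measured from any device.
POWER_STATES = [
    # (name, idle_power_mw, wakeup_ms)
    ("on",          300.0,   0.0),
    ("partial_on",   60.0,   2.0),
    ("partial_off",  10.0,  20.0),
    ("off",           0.5, 200.0),
]

def deepest_state(max_wakeup_ms: float) -> str:
    """Pick the lowest-power state whose wakeup latency meets the deadline."""
    candidates = [s for s in POWER_STATES if s[2] <= max_wakeup_ms]
    return min(candidates, key=lambda s: s[1])[0]

deepest_state(5.0)    # 'partial_on': cannot afford 20 ms to wake
deepest_state(500.0)  # 'off': the latency budget allows a full shutdown
```

The verification problem Maben alludes to is confirming that every legal transition among these states behaves correctly, which grows harder as the number of states and their latency constraints multiply.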

There are multiple drivers of growth in this area, including improved battery technology and more efficient use of energy to extend battery life. All of this is essential as more devices are untethered from plugs and new battery-only devices roll out.

“Automotive applications, and particularly electric vehicles, are the largest growth driver of battery applications today,” said Microchip’s Sheerin. “IoT, renewable energy, and datacenter applications also are contributing to the proliferation of battery systems.”

What’s changed is how much is expected out of these devices that are disconnected for hours, or in some cases years. Batteries are now an integral part of any design decisions, and they are forcing big changes in the designs for chips and electronic systems that depend on them.
