LTE Heightens Power-Consumption Concerns

The interface promises fast access to streaming media, but design complexity grows with the need to solve power issues.

By Ellen Konieczny
The air interface dubbed Long Term Evolution (LTE) hails the coming of fourth-generation (4G) cellular communications, which will benefit from both increased capacity and speed. Among the lofty goals of 4G technology is the promise of users being able to widely access streaming media, such as mobile television and video, in real time. Before such capabilities can be made available, however, designers on both the infrastructure and handset sides must overcome a number of steep hurdles related to power consumption.

Compared to third-generation (3G) communications, 4G networks will demand more power. Base stations must therefore provide higher power while limiting what they consume (see Fig. 1). For their part, handsets will struggle against shorter battery life. Already, smartphones tend to consume more power than traditional mobile phones. The added features and capabilities of the next generation will only add to this problem.

Fig. 1: The 9341 remote radio head (RRH) promises to reduce power consumption by as much as 50 percent. Mounting these RRHs close to the antenna virtually eliminates feeder losses, which lowers the RF power required.

“The main challenge posed by LTE for 4G terminals is in designing an extremely powerful and cost-effective communications IC that meets the stringent power constraints of battery-operated devices,” said Eyal Bergman, director of product marketing at CEVA. “This means an efficient 4G architecture has to be built using extremely efficient power-management techniques while addressing a dramatic increase in processing requirements.”

Bergman points out that LTE algorithms are highly parallel and computationally demanding. With a single-instruction, multiple-data (SIMD) architecture, for example, the same logic can process wide data elements, delivering higher throughput at lower power. One type of SIMD architecture that is becoming increasingly popular for LTE is the vector processor, which enables deep SIMD parallelism for advanced communications. Used in conjunction with SIMD, the very-long-instruction-word (VLIW) architecture cleanly isolates parallel operations and improves power-versus-performance efficiency, since the different VLIW execution slots consume power only when they operate. Lastly, modern communications processors may incorporate a power-management unit that dynamically controls both voltage and clocks to minimize power per use case as well as during standby.
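As a rough illustration of the SIMD idea Bergman describes, the sketch below uses Python, with NumPy's wide array operations standing in for hardware SIMD lanes. It contrasts a scalar multiply-accumulate loop with a single wide operation that applies the same logic across all data elements at once; names and sizes are illustrative, not any vendor's API.

```python
import numpy as np

# Scalar style: one complex multiply-accumulate per loop iteration
def mac_scalar(x, h):
    acc = 0j
    for a, b in zip(x, h):
        acc += a * b
    return acc

# SIMD style: one wide operation covers every data element,
# reusing the same logic across lanes (NumPy's vectorized dot)
def mac_simd(x, h):
    return np.dot(x, h)

x = np.arange(8) + 1j * np.arange(8)   # toy sample block
h = np.ones(8) - 1j * np.ones(8)       # toy filter taps
assert np.isclose(mac_scalar(x, h), mac_simd(x, h))
```

Both paths compute the same result; the wide version simply amortizes control overhead across the whole block, which is the source of the power saving.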

LTE handsets will also rein in power consumption through the standard's choice of modulation schemes. As noted by Olivier Gueret, LTE Product Marketing Manager at Alcatel-Lucent, "OFDMA is a high-order modulation technique. It maximizes the number of transmitted bits per hertz, therefore allowing improved spectral efficiency. While the downlink uses plain OFDMA, the uplink uses a variant known as Single-Carrier Frequency Division Multiple Access (SC-FDMA). This technique has a low peak-to-average power ratio (PAPR), which translates into better power efficiency and thus, lower power consumption. This guarantees battery saving on LTE terminals."
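Gueret's PAPR point can be checked numerically. The sketch below (Python/NumPy, with illustrative block sizes) compares the PAPR of a plain OFDMA symbol against a DFT-precoded, SC-FDMA-style symbol occupying the same localized subcarriers. Averaged over many random blocks, the DFT-precoded waveform shows the lower PAPR that motivates its use on the uplink.

```python
import numpy as np

rng = np.random.default_rng(42)
N_FFT, M = 512, 128        # IFFT size and occupied subcarriers (illustrative)
BLOCKS = 200

def papr_db(x):
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

ofdm_papr, sc_papr = [], []
for _ in range(BLOCKS):
    bits = rng.integers(0, 2, (2, M))
    qpsk = ((2 * bits[0] - 1) + 1j * (2 * bits[1] - 1)) / np.sqrt(2)

    grid = np.zeros(N_FFT, complex)
    grid[:M] = qpsk                            # plain OFDMA mapping
    ofdm_papr.append(papr_db(np.fft.ifft(grid)))

    grid[:M] = np.fft.fft(qpsk) / np.sqrt(M)   # DFT precoding (SC-FDMA style)
    sc_papr.append(papr_db(np.fft.ifft(grid)))

print(f"mean PAPR, OFDMA:   {np.mean(ofdm_papr):.1f} dB")
print(f"mean PAPR, SC-FDMA: {np.mean(sc_papr):.1f} dB")
```

A lower PAPR lets the handset's power amplifier run closer to saturation, where it is most efficient, which is exactly the battery saving Gueret describes.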

On the network infrastructure side, the advanced signal-processing features of LTE base stations also help to improve the quality of experience without additional transmitted power on the terminal side. For example, the Inter Cell Interference Coordination (ICIC) feature reduces interference at cell edge through smart-spectrum fractional reuse. LTE infrastructure has captured the most attention for its deployment of advanced spatial multiplexing with highly demanding multi-antenna schemes, such as multiple-input multiple-output (MIMO) 4×2 and 4×4. Such schemes promise to deliver major improvements in cell center and edge data rates. Yet their effect on the base station’s power consumption is still uncertain.
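The fractional-reuse idea behind ICIC can be sketched as a toy allocation rule (purely illustrative, not the 3GPP signaling): cell-center users reuse the full band, while edge users in neighboring cells are steered to disjoint sub-bands so they stop interfering with one another.

```python
# Toy fractional-frequency-reuse plan (illustrative names and reuse-3 edge plan):
# center users get the whole band; edge users get one orthogonal sub-band per cell.
SUBBANDS = ["A", "B", "C"]

def allocate(cell_id, at_edge):
    if at_edge:
        return [SUBBANDS[cell_id % 3]]   # disjoint edge sub-band per neighbor
    return SUBBANDS                      # full band at cell center

# Neighboring cells 0 and 1 never collide at the edge
assert allocate(0, True) != allocate(1, True)
assert allocate(0, False) == SUBBANDS
```

Because edge users in adjacent cells occupy different sub-bands, the SINR at the cell edge improves without raising anyone's transmit power.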

Although they concentrated on 2×2 MIMO, a group of researchers at the University of Bristol attempted to show that an efficient exploitation of multiple antenna techniques and multiuser diversity in the time, frequency, and space domains can significantly ease the power requirements of a base station. Their 2009 paper, titled "Power Efficient MIMO Techniques for 3GPP LTE and Beyond," examines the capabilities of various multiple antenna transmission and precoding techniques in combination with multiuser diversity. The researchers' goal was to reduce the total power consumption needed for wireless system operation while improving energy efficiency. They examined a number of MIMO precoding techniques, which can be potentially applied to LTE, in terms of their combined spectral- and power-saving efficiency. The researchers found that all MIMO schemes benefit from multiuser diversity and show improved power efficiency as the number of users increases.
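The intuition behind the Bristol result, that multiuser diversity eases power requirements, can be reproduced in miniature. If the scheduler serves whichever user currently has the best Rayleigh-faded channel, the transmit power needed to hit a fixed SNR target falls as the user pool grows. The sketch below is a simplified statistical model, not the paper's simulation.

```python
import numpy as np

rng = np.random.default_rng(1)
TRIALS = 5000
TARGET_SNR = 1.0   # linear per-block SNR target (illustrative)

def median_tx_power(num_users):
    # Rayleigh fading: per-user channel power gains are exponential(1)
    gains = rng.exponential(1.0, (TRIALS, num_users))
    best = gains.max(axis=1)             # schedule the strongest user
    return np.median(TARGET_SNR / best)  # power needed to hit the target

powers = {k: median_tx_power(k) for k in (1, 4, 16)}
print(powers)  # required power drops as the user pool grows
```

With more users to choose from, the scheduled channel is statistically stronger, so the same SNR target costs less transmit power; this is the multiuser-diversity gain the paper exploits alongside MIMO precoding.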

DSPs Serve Handsets And Infrastructure
As both the handsets and infrastructure equipment need to support uplink and downlink chains of the same standard, a lot of commonality exists between them. The similarities are typically in the basic algorithms and tasks. Of course, the network infrastructure must handle an added layer of complexity due to the need to support large numbers of users concurrently. Despite this difference, CEVA’s Bergman asserts that advanced digital signal processors (DSPs), if designed right, can be scaled from the terminal applications to also support the infrastructure side with the following main modifications:

  • The use of multiple processors (multicore), versus one to a few processors on the terminal side, to serve a large number of users in parallel, and
  • Support for transformations such as the FFT/IFFT/DFT/IDFT, either in software on the processors themselves or by integrating additional hardware logic.
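Whether the transforms Bergman mentions run in DSP software or in dedicated logic, the underlying operation is the same DFT/IDFT pair. A minimal software sketch (Python/NumPy; block size is illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
subcarriers = rng.standard_normal(64) + 1j * rng.standard_normal(64)

time_domain = np.fft.ifft(subcarriers)  # IDFT: subcarrier data -> time signal
recovered = np.fft.fft(time_domain)     # DFT: time signal -> subcarrier data

assert np.allclose(recovered, subcarriers)
```

On the infrastructure side, the same transform must be repeated for every active user's allocation, which is what pushes designers toward multicore DSPs or hardware FFT accelerators.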

Already, a number of high-performance, ultra-low-power DSPs have been designed and optimized for advanced wireless-communications processing in mobile handsets, as well as wireless-infrastructure applications. In fact, recent DSP releases claim to be up to four times more power-efficient than general-purpose DSPs. To minimize power consumption, they use features like power scaling to automatically scale speed and voltage for various units within the processor.
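Power scaling pays off because CMOS dynamic power grows with clock frequency and with the square of supply voltage, per the textbook model P_dyn ≈ C_eff · V² · f. A back-of-the-envelope sketch with assumed, illustrative numbers:

```python
# Textbook CMOS dynamic-power model: P_dyn ~ C_eff * V^2 * f
def dynamic_power(c_eff_farads, volts, freq_hz):
    return c_eff_farads * volts**2 * freq_hz

full   = dynamic_power(1e-9, 1.0, 1.0e9)   # full clock at nominal voltage
scaled = dynamic_power(1e-9, 0.8, 0.5e9)   # half clock at 0.8 V (assumed values)

print(scaled / full)  # -> ~0.32, i.e. roughly a 68% dynamic-power saving
```

Because voltage enters quadratically, dropping voltage alongside frequency saves far more than frequency scaling alone, which is why per-unit voltage-and-clock control is worth the design complexity.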

For both LTE handsets and infrastructure, low power consumption is being built into roadmaps for LTE's future. Down the road, for example, Gueret notes that LTE-Advanced uplink coordinated multipoint (CoMP) transmission combines the signals received by several base stations to improve uplink quality. In addition, LTE base stations should soon be able to dynamically adapt their emitted power to the traffic. This feature is part of the self-organizing-network (SON) initiative at the 3rd Generation Partnership Project (3GPP).

Next year is slated to bring the next LTE protocol. Dubbed LTE Advanced, it is an order of magnitude more computationally intensive than the current version, says Mike Thompson, director of product marketing for processor solutions at Virage Logic. Thompson emphasizes that LTE is already very compute-intensive in its current state, making it difficult to build cost-efficient implementations that comply with the protocol while keeping power consumption low. Obviously, every technology carries its own set of limitations. Yet not all of them are making such grand promises as LTE. If this small sampling of innovations represents the work going on around the globe, however, solutions will be developed in time to rein in power consumption from the base station to the handset.
