Major U.S. carriers have pilot projects going, but logistics, cost and build-out pose issues for everyone else.
Carriers and chipmakers are celebrating the rollout of the first standards-compliant commercial 5G services.
“We are officially in the era of 5G,” said John Smee, vice president of engineering at Qualcomm, at the recent 5G Summit at IEEE’s International Microwave Symposium (IMS) in Boston. Movement is happening on the commercial end. Major U.S. carriers Verizon, AT&T and Sprint have set up 5G mobile network services in select cities with a limited number of subscribers.
Reviewers have confirmed that, in most cases, the service is fast and fairly reliable within sharp geographical limits. South Korea, the U.K., Switzerland and Spain have commercial 5G networks up now, with China due to roll out 5G in October. Russia, Japan and many other countries have test beds and plans for 2020 commercial launches.
Yet even chipmakers and OEMs that are enthusiastic about the current state of 5G or the potential market for a product-heavy 4G replacement warn that logistical, technical and legal obstacles still need to be cleared before 5G technology can advance enough to fulfill the ambitious role standards bodies and technology providers have set for it.
“Those field trials are interesting, and I’m involved in one,” said Frank Schirrmeister, senior product management group director at Cadence Design Systems. “It costs maybe $10 more per month than the regular service. But I ask myself, ‘Is it worth $120 per year to let my daughter download a 150MB Netflix episode in less than a minute rather than 13, or do I just stream it anyway?'”
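As a rough back-of-the-envelope check on that comparison (the episode size and download times come from the quote; the computed rates are only illustrative), the implied sustained throughput works out to roughly 20 Mbps versus 1.5 Mbps:

```python
# Throughput implied by Schirrmeister's example.
# Assumes the quoted figures: a 150MB episode, ~1 minute on 5G vs. ~13 minutes on 4G.

episode_mb = 150                    # episode size in megabytes (from the quote)
episode_mbits = episode_mb * 8      # convert to megabits

for label, minutes in [("5G", 1), ("4G", 13)]:
    seconds = minutes * 60
    mbps = episode_mbits / seconds  # sustained rate needed to finish in that time
    print(f"{label}: ~{mbps:.1f} Mbps sustained")

# Roughly 20 Mbps vs. 1.5 Mbps -- the question in the quote is whether that
# difference is worth ~$120 per year to a typical subscriber.
```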
So can the carriers really make money off 5G? At this point, it’s not clear. “Even if they raise [monthly fees by] $20 bucks later, that’s not the world, but it makes you think that although the carriers are obviously excited about 5G, do they think they can make any money from it,” Schirrmeister said. “How expensive is it and what would it be for the consumer? It looks as if much of the rollout may become something very application-specific. It may not be a consumer thing. So if I wanted my industrial production line to have 5G connections for the very low latencies, that’s what it would revolve around. These custom applications may play a significant role in the rollouts, but we may not see that aspect of it ourselves.”
What is likely is that, of the ‘legs’ of the 5G triangle, “we may only see the high bandwidth, not the low latency or support for a million devices per kilometer,” Schirrmeister said. “We would only see those later.”
That plays right into the confusion over which aspect or use case people are talking about when they say 5G, said Earl Lum, principal analyst at EJL Wireless Research. “They all mean a different one. Is it the smartphone? Fixed wireless? Is it sub-6-gig? Is it millimeter wave? Is it a different use case?”
Ericsson’s June 2019 mobility report predicted mobile 5G subscriptions worldwide would reach only 10 million by the end of 2019, rising to 1.8 billion—20% of all mobile broadband subscriptions—by the end of 2024.
Carrier mobile services are the more obvious examples of commercial 5G, but Qualcomm, OEMs and other providers such as Qorvo and Anokiwave—which makes mmWave front-end ICs, mmWave silicon core ICs and active antenna ASICs—have focused on ICs for both 5G development and more traditional microwave customers. Qorvo’s 5G lineup includes phase shifters, front-end modules and MIMO components, and the company has shipped more than 100 million 5G wireless infrastructure devices.
Only the most basic services show up at the start, but even those allow spectrum aggregation to give a specific application guaranteed bandwidth in the sub-6GHz range that mobile networks currently occupy, and to connect IoT or other low-power devices on 700MHz frequencies using the same physical footprint and the same network framework.
“With phased arrays and beamforming, it’s no longer an omnidirectional universe,” Smee said. “5G is still in the infancy of its commercial deployment, but this is a real pivot-point for 5G.”
Building momentum
Regardless, 5G is moving forward. “Nothing is going to stop 5G,” said Ben Thomas, director of technical marketing at Qorvo, during a recent presentation. However, he acknowledged there will be a learning curve in adapting to multifrequency networks.
“The biggest challenge to our engineering teams and across the industry is simply the higher complexity,” Thomas said. “As we move up to the millimeter wave bands like 28 and 38GHz, we recognize that it’s driven by a very different use case.”
Sub-6GHz frequencies are easiest to adapt for mobile network tests, for example, but they offer only a fraction of the capacity and speed of frequencies at 28GHz and above. The trick is to know which applications can make the best use of mmWave connections to improve their own performance, or to handle backhaul or other tasks for apps on lower frequencies.
“With millimeter wave, there are challenges—propagation and short range, but the goal is really about network capacity and high-density areas,” Thomas said. “To support these bands we have to significantly change the design and testing we do in these environments…Combine that with more carrier aggregation combinations, dual uplink, and more complex waveforms and modulation schemes, and quite frankly you’re looking at an exponential impact on the RF section.”
On the other hand, being able to do 5G commercially at all surprised some experts with decades of experience building mmWave communications systems and radar for the military.
“Two or three years ago, I would have said there is no chance,” said Andrew Zai, senior principal engineer at Raytheon and chair of the 5G Summit at IEEE’s recent IMS in Boston.
“In MIMO the construction techniques are still very labor intensive and you need to do all sorts of tuning on the hybrid module,” Zai said. “In phased arrays, there are so many states you need to check. If you want to check the array you have to check all your different channels. That would involve testing each element individually and checking the amplitude coming out to make sure the phase shifts out of every element. That’s a lot of testing.”
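The test volume Zai describes adds up quickly. A minimal sketch of the bookkeeping, assuming (purely for illustration) a 64-element array, 6-bit phase shifters and eight gain settings per element—none of these figures come from the article:

```python
# Illustrative count of per-element calibration measurements for a phased array.
# Array size, phase-shifter resolution and gain states are assumed example values.

elements = 64            # antenna elements in the array (assumption)
phase_bits = 6           # phase-shifter resolution in bits (assumption)
gain_states = 8          # attenuator/gain settings checked per element (assumption)

phase_states = 2 ** phase_bits                 # 64 discrete phase settings
per_element = phase_states * gain_states       # amplitude/phase checks per element
total_measurements = elements * per_element

print(f"{per_element} measurements per element, "
      f"{total_measurements} for the whole array")   # 512 per element, 32,768 total
```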
Coming down the pike in many 5G rollouts is the effort and cost not only of building out a network of repeaters to give the network reach, but also of building out some level of edge computing equipment to hold and move the data that will be consumed at super-speed by users of very-low-latency network applications.
“Trying to get data from wherever it is stored to the point of use is going to be a problem,” said Ajit Paranjpe, CTO at Veeco. “You don’t get millisecond latency if you have to go to the cloud for everything, so there needs to be an infrastructure of servers or storage spread around the edge so the data can be moved close to where it will be needed before it is required.”
It’s pretty obvious that the 5G networks themselves have to be built out in a hierarchy that puts the highest-powered devices the farthest away—possibly in the same locations as 4G cell towers. Millimeter waves attenuate so quickly, though, that chains of femtocells and picocells reaching from base-station access points to someplace very close to end users or subscribers have to be installed.
That will require very high-density provisioning of 5G networks, however, with femtocells perhaps every 100 meters and picocells at even shorter distances.
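How dense that gets is easy to estimate with a rough sketch. The 100-meter femtocell spacing comes from the paragraph above; the square-grid layout and the 50-meter picocell spacing are assumptions made only for illustration:

```python
# Rough small-cell counts per square kilometer for a simple square-grid layout.
# 100m femtocell spacing is from the text; 50m picocell spacing is an assumption.

def cells_per_km2(spacing_m: float) -> float:
    """Sites needed to tile 1 km^2 if each cell covers a spacing-by-spacing square."""
    return (1000.0 / spacing_m) ** 2

print(f"Femtocells at 100m spacing: ~{cells_per_km2(100):.0f} per km^2")   # ~100
print(f"Picocells at 50m spacing:   ~{cells_per_km2(50):.0f} per km^2")    # ~400
```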
“There are more efficient ways to approach it, such as the WiFi mesh Google has been using with some of its products as opposed to repeaters, in which case the deployment could be very rapid with millimeter wave,” said Paranjpe. “But there is still the question of how you get that much data through the backhaul system, which would use a higher-powered device and possibly optical fiber.”
Energy use
Once in place, usage should grow rapidly.
“The good news is that it could spread very quickly to customers, as 4G did, because once it is available, everybody is going to want to have it, and it won’t take long to build into their phones,” Paranjpe said. “The bad news is that you’ll end up doing a lot of 5G downstream to your phone and 4G upstream due to capacity and energy limitations, and it could take a while to build in the features and to make anyone want to put an enterprise application on it. 5G is brand new. Consumer services are okay, but you don’t want to put something mission-critical on it, so you won’t see enterprise services for a while.”
Case in point: “We heard a very rough estimate that the number of base stations for mmWave would be 10X compared to 4G LTE,” said Rajeev Rajan, senior director of solutions at ANSYS. “We already know almost 70% of the LTE operating cost is coming from base stations. If you have to do 10X the number of base stations, they really have to do their homework in terms of trying to optimize the components—not just for the cost, but for the energy bill, too. You have to look at everything in the signal chain to optimize the energy efficiency.”
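A crude way to see why that math worries operators is to scale the figures in the quote (70% of LTE operating cost from base stations, 10X the site count) under the simplifying—and purely illustrative—assumption that each mmWave site costs as much to run as an LTE site:

```python
# Crude operating-cost scaling implied by Rajan's figures.
# Assumes, for illustration only, equal per-site running cost for mmWave and LTE.

base_station_share = 0.70   # fraction of LTE opex from base stations (from the quote)
other_share = 1.0 - base_station_share
site_multiplier = 10        # 10X the number of base stations for mmWave (from the quote)

opex_multiplier = other_share + base_station_share * site_multiplier
print(f"Total opex would scale by roughly {opex_multiplier:.1f}x")   # ~7.3x
# Hence the pressure to cut per-site component cost and energy use sharply.
```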
Power increases with complexity, too. “When you design a more complex algorithm, no doubt the power it will consume is more,” said Peter Zhang, R&D manager in Synopsys’ Solutions Group. “When you have all this activity, something has to power all of those things. It will take more power.”
Not everyone is as concerned about 5G draining the battery of a device like a smartphone. “The system can find the next level network down,” said Neill Mullinger, product manager, vertical market solutions group at Mentor. “We’ll always have the 3G or 4G option for certain places where it is really difficult for 5G and mmWave to penetrate people, trees, buildings or whatever. The other thing that is going to be an issue to deal with is the power consumption. If you’ve got 10X the data going to your device, it is going to use up 10X the power. It may not be quite that linear, but it certainly is going to require a lot more in terms of being able to do power profiling of your design under a lot of different scenarios to see how that’s going to be affected.”
That’s just part of the power picture. “Even without looking at optimization of a particular chipset, about 40% of the energy in a data center is used by coolers and chillers,” said ANSYS’ Rajan. “There will be gigabytes and exabytes of rich multimedia created and consumed and transferred across those networks. The base stations are packed with phased-array antennas and FPGAs. It can be a big problem. My numbers may not be precisely right, but approximately for every 2 watts of energy you consume, you generate about 3 watts of thermal energy. So for every 50 watts of energy in a base station, you get 150 watts of thermal energy in the box. That is a lot of energy in that single box, with no cooler and no fan in there. You can end up with a design for a base station that is nothing but a giant heat sink. It is a massive problem for them.”
The effort to make 5G applicable to everything from low-power IoT networking to high-profile, high-bandwidth mobile networking may keep it from being pigeonholed into specific roles. But it also makes it difficult to know where to start in determining everything from the frequencies to be used to the density of repeaters built into the network.
The culprit for many other problems—cost and layout of 5G networks, and thermal issues especially—is the rapid loss of power in mmWave radio signals, which limits the range and connectivity of 5G networks, said Ahmed Khalil, director of design engineering at Analog Devices. “The loss increases 20dB for every decade of frequency. For millimeter wave we’re talking about 30GHz, and about 20dB of extra loss, which you can’t afford to take lightly, especially when you go up to a frequency where the semiconductors are not as good at supplying energy as at lower frequencies. It means we won’t get 3 kilometers of coverage. It will be more like 300 meters, if that, and that affects everything else.”
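Khalil’s 20dB-per-decade figure falls straight out of the free-space path-loss formula. A quick sketch, using only the free-space term (real deployments add blockage, rain and atmospheric losses on top), with frequencies and distances chosen to match his 3km-to-300m example:

```python
import math

def fspl_db(freq_hz: float, dist_m: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3.0e8
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / c)

# Going from 3GHz to 30GHz at the same distance adds 20dB of loss...
print(fspl_db(30e9, 3000) - fspl_db(3e9, 3000))     # 20.0

# ...so for the same link budget, range shrinks by the same factor of 10:
print(fspl_db(3e9, 3000))    # loss at 3GHz over 3km  (~111.5 dB)
print(fspl_db(30e9, 300))    # same loss at 30GHz over 300m
```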
The level of attenuation means designers have to plan a shrinking hierarchy of smaller cells—from macrocells down through micro, pico and femto cells—each getting one layer closer to the end customer, until the smallest might cover just a single indoor room.
Heat and cost
That attenuation leads to two other major problems—heat and cost. The heat builds up due to the inefficient conversion of power to RF signal in base-station power amplifiers. The cost accrues as carriers or owners of private 5G networks realize they have to buy not a few cell-network signal repeaters to get better coverage in a particular area, but enough femtocells or picocells to place at least one access point everywhere, inside and out, every 100 to 300 feet.
Getting more gain from the base station can reduce the number of network repeaters required, but the only way to add gain is to add power, and that adds twice as much heat as power.
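That “twice as much heat as power” rule of thumb follows from power-amplifier efficiency. A minimal sketch, assuming (for illustration only, not a figure from the article) an amplifier that converts roughly a third of its DC input into RF output:

```python
# Heat dissipated by a power amplifier as a function of its efficiency.
# The ~33% efficiency and 40W output are assumed example values.

def pa_heat_watts(rf_out_w: float, efficiency: float) -> float:
    """DC power drawn minus RF power delivered, i.e. what ends up as heat."""
    dc_in_w = rf_out_w / efficiency
    return dc_in_w - rf_out_w

rf_out = 40.0          # watts of RF output per channel (example value)
eff = 0.33             # assumed PA efficiency

print(f"{pa_heat_watts(rf_out, eff):.0f} W of heat for {rf_out:.0f} W of RF")
# At ~33% efficiency, roughly 80 W of heat for every 40 W of RF -- about 2x.
```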
“In a typical system you might have 64 channels as opposed to four on the RF power envelope and in the envelope of the antenna,” Lum said. “In 4G remote radio units the 4 x 40 watt range equals 160 watts, which is nominal. On a massive MIMO system it could be 160 watts, 200 watts, maybe higher. And you don’t know what contribution, if any, the beamformer makes to the heat buildup.”
Huawei and one or two other manufacturers had water-cooled base stations on exhibit at the MWC in Barcelona this spring, Lum said. Cool as it was, however, it was a bad idea. “You don’t want to put anything out there with a system that can stop working.”