The technology challenges are daunting, the unknowns plentiful, and they span entire ecosystems.
Cellular technology is about to take a giant leap forward, but the packaging, assembly, and testing of the chips used in 5G millimeter wave and the forthcoming 6G ecosystem will be significantly more complicated than anything used in the past.
So far, most 5G devices are still working at sub-6 GHz frequencies. A massive rollout of mmWave technology over the next few years will significantly speed up the movement of data, allowing transfer rates of up to 10 Gbps under ideal conditions. With 6G, which is expected sometime later this decade, data speeds will increase by up to two orders of magnitude compared with millimeter wave — 1 terabit per second versus 10 Gbps for 5G millimeter wave.
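The peak-rate arithmetic above can be sanity-checked in a few lines of Python. The 50 GB file size is an arbitrary illustration, and real-world throughput will be well below these ideal-condition peaks:

```python
# Rough check of the claimed speedup, and what it means for a large transfer.
# Rates are the peak figures quoted above, not real-world throughput.
gbps_5g_mmwave = 10.0     # 5G mmWave peak, ideal conditions
gbps_6g = 1000.0          # 1 Tbps target for 6G

speedup = gbps_6g / gbps_5g_mmwave
print(f"6G vs. 5G mmWave peak rate: {speedup:.0f}x")  # 100x, i.e. two orders of magnitude

file_gb = 50.0  # illustrative download size
for label, rate in [("5G mmWave", gbps_5g_mmwave), ("6G", gbps_6g)]:
    seconds = file_gb * 8 / rate  # gigabytes to gigabits, divided by Gbps
    print(f"{label}: 50 GB in {seconds:g} s")
```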
The fundamental challenge with both 5G mmWave and 6G is that signals are prone to interference from weather and from anything solid, such as windows and walls. So while the actual range of these signals can be 1,000 feet or more, the effective range in which the signal is strong enough to be useful is somewhere between 300 and 500 feet. With 6G frequencies, that effective range is even shorter. 6G signals can travel as far as 5G signals, but their strength falls off faster, so the usable range is shorter.
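Part of this falloff can be illustrated with the free-space path loss formula, which shows how much more loss a link incurs at higher carrier frequencies for fixed-gain antennas. The specific frequencies below (3.5 GHz, 28 GHz, 140 GHz) are representative assumptions for sub-6 GHz 5G, 5G mmWave, and a candidate 6G sub-THz band, not figures from the article, and in practice atmospheric absorption and blockage add further loss on top of this:

```python
import math

def fspl_db(freq_hz: float, dist_m: float) -> float:
    """Free-space path loss in dB (Friis formula)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / c)

# Compare loss at 100 m for representative carrier frequencies.
for label, f in [("sub-6 GHz (3.5 GHz)", 3.5e9),
                 ("5G mmWave (28 GHz)", 28e9),
                 ("6G sub-THz (140 GHz)", 140e9)]:
    print(f"{label}: {fspl_db(f, 100):.1f} dB at 100 m")
```

Each doubling of frequency costs another 6 dB, which is why the higher bands start at a deficit before weather or walls are even considered.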
There are engineering workarounds to propagate these signals over longer distances, but quality of service is much more difficult to maintain than with 4G LTE or sub-6 GHz 5G, particularly when the transmitter, the receiver, or both are in motion. For 5G mmWave to reach its full potential, it will require an estimated 60 small cell sites per square mile in a high-density area. With 6G, that number will be higher, although how much higher isn't clear at this point.
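The 60-cells-per-square-mile estimate is consistent with the effective ranges quoted above. A back-of-the-envelope sketch, assuming each cell usefully covers a circle of roughly 400 feet in radius (the midpoint of the 300-to-500-foot range) with no overlap, lands in the same neighborhood:

```python
import math

# Back-of-the-envelope check of the ~60 small cells per square mile figure.
effective_radius_ft = 400.0           # midpoint of the 300-500 ft effective range
sq_mile_ft2 = 5280.0 ** 2             # square feet in one square mile
cell_area_ft2 = math.pi * effective_radius_ft ** 2

cells_needed = sq_mile_ft2 / cell_area_ft2
print(f"Cells per square mile: {cells_needed:.0f}")
```

Real deployments need overlap for handoff and capacity, which pushes the count above this idealized lower bound.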
Still, all of this has broad implications for the target use cases. For example, mmWave and 6G may work well in a controlled environment, such as inside a stadium, a corporation, or an industrial facility. They also may be useful for fixed point-to-point communication, such as the build-out of the edge. But a mobile handset in heavy traffic would be much more susceptible to interruptions, and a continuous connection would be much harder to maintain in a suburban or rural area than with sub-6 GHz 5G or 4G LTE.
Chip and package challenges
Chips for all of these devices are getting more complex, as well. In the past, it was common to have very different chip architectures in base stations and handheld devices. But the challenge now is to maintain a connection to a signal, which requires beamforming around objects and other potential sources of interruption. It also requires building antenna arrays into packages to reduce the distance signals need to travel, the power required to drive those signals, and the associated latency. That means the antennas need to be built into the package, and with 6G they will need to be exposed more than with 5G.
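Beamforming with an in-package antenna array works by applying a progressive phase shift across the elements so their signals add coherently in one direction. A minimal sketch of the array factor for a uniform linear array (element count, spacing, and steering angle are illustrative choices, not parameters from the article) shows how an 8-element array concentrates energy toward its steering angle:

```python
import math

def array_factor_db(n_elem: int, spacing_wl: float,
                    steer_deg: float, theta_deg: float) -> float:
    """Normalized array factor (dB) of a uniform linear array steered to steer_deg.

    spacing_wl is element spacing in wavelengths; theta_deg is the
    observation angle, both measured from broadside.
    """
    psi = 2 * math.pi * spacing_wl * (
        math.sin(math.radians(theta_deg)) - math.sin(math.radians(steer_deg)))
    # Coherent sum of n phasors with progressive phase psi, normalized to 1.
    re = sum(math.cos(k * psi) for k in range(n_elem))
    im = sum(math.sin(k * psi) for k in range(n_elem))
    mag = math.hypot(re, im) / n_elem
    return 20 * math.log10(max(mag, 1e-12))  # clamp to avoid log(0) at deep nulls

# An 8-element, half-wavelength-spaced array steered 30 degrees off broadside:
for theta in (0, 30, 60):
    print(f"{theta:3d} deg: {array_factor_db(8, 0.5, 30, theta):7.1f} dB")
```

The peak sits at the steering angle (0 dB) and falls off sharply away from it, which is what lets a link "beamform around" an obstruction by steering toward a reflected path.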
The chips themselves will require a build-up of multiple very thin layers, either in redistribution layers (RDLs) or using a laminate process. Both are prone to mechanical issues such as warpage, and it's not yet clear which approach will become the process of choice.
“The Achilles’ heel for RDL build-up for high-speed RF is that you can’t easily stack vias, and that’s one of the highest impedance issues for the channel,” said Curtis Zwenger, vice president of R&D at Amkor Technology. “If you have to dog-bone vias in an RDL structure, it’s a pretty significant return loss problem. That’s where organics have an advantage, because you can stack vias as part of the mSAP (modified semi-additive process) buildup. With 6G, we’re working on both laminate and RDL side-by-side to see which one will win in the end. RDL has challenges because it’s such a thin buildup. It can be 35 microns total for a three or four-layer buildup, so you lose the structural integrity that a substrate usually provides. Then, it’s taken over by a molded portion of that module, and there are not a lot of knobs you can turn. But for laminate, we’ve spent a lot of time doing simulations for warpage at the second level of assembly. All the circuitry for the baseband, the up/down conversion, and analog beamforming can be in the molded structure, and then the antenna is on top. That can be like a package-on-package.”
That’s only part of the picture. “The interconnect technology is critical,” Zwenger said. “You have to match those impedance lines and make sure every antenna is receiving the same signal and doesn’t have any signal attenuation loss. A lot of work needs to be done on the front-end simulation before you can even test it empirically, so you’re relying just on your simulation to help you make these very big decisions.”
The chip industry is just beginning to understand the implications of 6G, based on the limited rollout of 5G mmWave. “We’re feeling our way out of what frequency is best for what type of connection,” said Chen Chang, senior director for strategic business development at National Instruments. “Seven or eight years ago, when 5G just started, everyone thought millimeter wave was going to take over the world and solve everyone’s problems. We’re collectively learning our lesson right now. We still need the mid-bands and low-bands to get the coverage, and those frequencies continue to improve in efficiency, throughput, and how many users they can sustain simultaneously. All those innovations by the industry basically hit the sweet spot of what consumers are willing to pay for a constant connection, but they also need gigabits of throughput at any given time, and they need a reliable and constant surplus almost anywhere they want to be. This is really starting to drive a lot of the deployment.”
This is essentially a three-tier approach, with the fastest data downloads in urban centers and stadiums, relatively fast data in suburban areas, and sub-1 GHz connectivity in rural areas. But there could be more options available in the future, as well.
“An interesting technology to keep an eye on is satellite,” said Chang. “This is essential in a remote area where you don’t have any coverage. For safety reasons an emergency connection is absolutely needed, but it doesn’t have to be constant.”
More selective use cases
Location will continue to play a role, but so will how these devices actually are used. Those use cases can vary greatly by application, by population density, and by region. For example, a Research and Markets report predicts that mmWave technology will take off first in North America and Europe, followed closely by Singapore, Japan, Taiwan, and South Korea. The research house noted that Denmark, Malta, and France already have adopted the technology.
ASE introduced 5G mmWave applications in its smart factory in 2020 for automatic inspection of production lines using AI and automated guided vehicles. Cameras were embedded into smart unmanned vehicles to enable inspection and surveillance on the production floor. The company said that coverage from a Wi-Fi base station typically is not extensive enough to cover large manufacturing areas, but with mmWave it was able to minimize latency and improve wireless connectivity so those vehicles could move seamlessly around the factory floor. The mmWave network also allows "simultaneous transmission of high-resolution images captured during the maintenance process for back-end analysis." That, in turn, allows the company to make rapid adjustments, increasing the efficiency of equipment maintenance.
Within any factory or industrial setting, there may be multiple approaches for moving data around, some based on speed or security, others on cost and physical constraints. "With Industrial Ethernet, you need to run a lot of cable inside your factory," said Tim Nguyen, senior director at Renesas Electronics. "That's why you see a lot of combinations in these industrial applications, with Industrial Ethernet and different protocols like EtherCAT, as well as 5G/6G. That gives you much more flexibility."
Security concerns
But that flexibility also requires a good understanding of which data should be sent using which technology. There are security implications for each of those.
“When you look at Wi-Fi, that’s semi-promiscuous once you’re outside of four walls,” said Erik Wood, senior director of technical marketing for product-security at Infineon. “When you move to 5G and what could be 6G, now you’re just moving that radius of circles closer and further, back and forth. But as soon as it’s outside my four walls, I have to treat it the same as if the attacker was sitting in a foreign country at a computer terminal and attacking that way, because I can’t really define it once it’s outside my four walls. I can’t set a different expectation for counter-measures and technical requirements that support those counter-measures, whether it’s a mile away or 10,000 miles away.”
Others point to similar concerns. “We have so many WiFi, Bluetooth, LTE, 5G systems right now,” said Raj Jammy, chief technologist at MITRE Engenuity. “We walk around with these things in our pockets. We have them at home, in our cars — they’re all over the place. That’s another opportunity for people to send signals through those antennas and intercept some of those chips, and have them transmit things as data is moving. Instructions also are moving between those chips. So there are clear vulnerabilities out there, and each company has some idea about the vulnerabilities they have seen already. But it’s usually the unknown unknowns that we have to worry about, and we do know there are many unknown unknowns lurking out there.”
Timetables
It’s not clear exactly when 6G will roll out, but industry sources indicate it will happen sometime around 2029 or 2030. 5G mmWave is just starting to be introduced, and it is spurring everything from new materials and equipment for testing these chips and systems, to new ways to automate the design of these chips and systems. The big question now is where companies will place their bets on technologies, and whether those bets will pan out.
“The rollout of 6G is somewhere around the end of this decade,” said Niels Faché, vice president and general manager of PathWave Software solutions at Keysight Technologies. “There are lots of companies working on it. We support that early phase of the market. Sometimes when you bet on a technology, it doesn’t go well. But in this market, you definitely know who the big players are and we know how to align with them. And if you want to be in this market, you have to be in early — well before the standard is finalized. A lot of our customers already are building prototypes. And for us, we know the frequency ranges, the bandwidth, and we have enough information to start designing our instruments because those are very long development cycles. There’s no way we can sit back and wait to see how this plays out. When the market is ready to take off, we have to be ready, too.”
On the design side, the biggest challenge is that these technologies are in a constant state of change, which makes it hard to develop good models and methodologies. "The spec is often changing for 5G, and now 6G," said Jean-Marie Brunet, vice president and general manager for hardware-assisted verification at Siemens EDA. "As soon as you change the spec, you need to simplify it and have a model so you can apply that modification to the spec and see virtually how this thing reacts. But it's more than just a chip that needs to be verified. It's across a full ecosystem. We're seeing a lot of activity around the automotive space, for instance. What happens if it's raining versus not raining? If the weather changes, will the signal behave differently?"
Conclusion
There are a lot of variables involved in moving huge quantities of data, no matter which communications technology is involved, and at this point a lot of unanswered questions. Moving that data wirelessly using higher frequencies has a whole slew of challenges that need to be solved before this technology's benefits can be fully realized.
“With some of these advanced packages, reliability is critical,” said Amkor’s Zwenger. “In automotive, with the advent of electrification and then ultimately ADAS, they’re going to rely on these networks that are supposedly highly reliable to sustain this autonomous environment we’re expecting, as well as artificial intelligence, AR/VR, and all the high-speed cloud services. That’s going to put a big stress on the infrastructure as a whole.”
But these are only the known or suspected challenges. As with any new technology, it takes time to even comprehend what can go wrong, and with this kind of heavy usage and growing volumes of data, it’s a safe bet that there will be many other issues that crop up in the lab, the fab, and in the field for years to come.