Chip Challenges In The Metaverse

Thermal and performance needs will require complex architectures and technology changes, not all of which are available today.


The metaverse is pushing the limits of chip design, despite uncertainty about how much raw horsepower these devices ultimately will require to deliver an immersive blend of augmented, virtual, and mixed reality.

The big challenge in developing these systems is the ability to process mixed data types in real time while the data moves uninterrupted at lightning speed. That requires the integration of systems, as well as systems of systems. Seamless operation will require advancements at the chip level and beyond:

  • Processors must deliver enough performance to avoid jumpiness in signals. At the same time, the system must be architected in a way that does not drain batteries too quickly (a rough power budget sketch follows this list). Achieving that balance requires a mix of processing elements, many of which will need to be custom designed for specific tasks and data types, with prioritized data paths to move that data quickly to wherever it is needed.
  • Headsets must run cool enough to be worn next to the face for extended periods of time. A smart phone may become too warm to hold comfortably if the screen is active for an extended period of time. A device worn next to the face requires an even lower heat level than the phone, even though the amount of computing may be much higher.
  • Massive bandwidth is required to move image and video data back and forth. That generally can be achieved with a high-speed wireless connection in a fixed setting such as a home or office. But in mobile applications, these devices may be searching for signals because the higher frequency required to carry those signals is prone to interruption by everything from weather to other moving objects.
  • Architectures will need to be flexible enough to incorporate changes in communications protocols and other standards that are still evolving, which will require some level of programmability. That could impact performance if it’s not done right. And making those changes in software could slow systems considerably and require more processing, which in turn would generate more heat.
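
To make the battery and thermal constraints above concrete, consider a minimal back-of-envelope power budget. Every number in the sketch below (battery capacity, runtime target, and the comparisons in the comments) is an illustrative assumption, not a specification from any shipping headset.

```python
# Illustrative power budget for a head-worn device (all values assumed).
battery_capacity_wh = 5.0   # assumed headset battery, in watt-hours
target_runtime_h = 3.0      # assumed continuous-use target, in hours

avg_power_budget_w = battery_capacity_wh / target_runtime_h
print(f"Average power budget: {avg_power_budget_w:.2f} W")  # ~1.67 W

# A smartphone SoC can draw several watts at peak and has a relatively large
# chassis to spread that heat. A headset pressed against the face has a
# smaller battery and a lower tolerable surface temperature, so its silicon
# must deliver more work per joule and per degree of allowable heating.
```

Under assumptions like these, the entire device, including displays, cameras, radios, and compute, has to average well under 2 W, which is one reason so much of the processing is expected to be offloaded or handled by custom accelerators.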

A device that meets all these requirements necessitates “an evolution in hardware and efficiency,” said Amol Borkar, director of product management, marketing, and business development for Tensilica Vision and AI DSPs at Cadence. “This space is still nascent. The battery-operated headsets today are mostly prototypes. This device will likely have the same compute capabilities as modern-day smart phones, but considering the proximity to the user’s face, the power profile will need to be a lot lower. I would assume the same with emissions.”

In fact, in some cases it may be tethered to a smart phone. “There are a lot of questions about how much you want to do on a device like glasses, versus how much you want to do on some connected element,” said Steve Woo, distinguished inventor and fellow at Rambus. “One possibility is that your phone could be a hub where some of the processing can be offloaded. It’s a bit of a better environment than something like glasses. But it’s still a battery-powered device, and if you happen to be next to something that’s plugged into the wall, maybe you can offload some of the processing there, too. But response time is going to be so important that you’ve got to have everything done as close to the device as possible.”

Others cite similar constraints. “We think of the metaverse as anything that enhances mobile phones,” said Pablo Fraile, director of the Home Market Segment at Arm. “There are a number of wearable directions this technology could take. AR and VR are the extremes of that spectrum, and the constraints for each of those extremes are very different. We also think about the metaverse being deployed in different environments — home, office, outside or inside the car — so connectivity is going to be absolutely essential. And that connectivity is not just between the user and the network, but between the different devices that people are going to carry around, because you may have different wearable devices at the same time.”

Big vision, lots of chips
A common perception is that the metaverse primarily will be used for gaming or an animated 3D version of social media. But it potentially has much broader uses in industrial and corporate applications, and each of those will have very different technology requirements and chip architectures.

Dave Keller, president and CEO of TSMC North America, referred to the metaverse during a recent presentation, noting that AR already has been widely used by its customers and internally for training. “During the lockdown, we used HoloLens to work remotely with our equipment vendors in the U.S. and in Europe,” he said, adding that the metaverse will take this one step further. “Last year, the metaverse had everyone buzzing about converging real and virtual into the digital era. This will require 5G connectivity, cloud, and edge computing.”

It also will require an enormous amount of power. “You will need about 1,000-fold more compute power to live in the metaverse,” said Adam White, president of Infineon‘s Power and Sensor Systems Division. He said that with AR/VR, the big question is how to generate enough power to make it all work.

“Our servers are not yet capable of handling the metaverse,” said Shawn Slusser, senior vice president of sales, marketing and distribution for Infineon Americas. “And then there is this whole sensor, IoT kind of environment that has to feed into the metaverse. It’s like an ecosystem, and that ecosystem hasn’t been built yet. It will take some time.”

Arm’s Fraile agrees. “We’ve got to go through a learning phase, both in terms of how to use these things and how to build them,” he said. “That’s not to say the challenges are way bigger than anything we’ve seen before. But this is incredibly hard technology to put together. And getting these things to synchronize and look realistic is going to be hard work. It’s not just about connectivity. It’s also about data management and data compression. All those technologies still need to emerge, to some extent, and there is a large body of research in those areas.”

Included in all of this is a huge number of semiconductors, many of which will be highly customized.

“Once this starts to build into a reasonably sized market, then you start to see more and more dedicated innovation,” said Rambus’ Woo. “You can see this with smart watches, which are now their own category and warrant their own unique development. You’ll see the same thing with AR and VR as you find the right applications and more and more people start adopting it and addressing issues. Memory will be a big challenge for that kind of implementation.”

Integrating these devices so they can smoothly exchange information adds another challenge. While initially this will build upon existing communications infrastructures and approaches, it will stress those systems in new ways as the amount of data moving across wired and wireless infrastructure balloons.

AR/VR developers have been waiting since the 28nm node for processors to deliver enough performance at low enough power to make this a possibility. What they didn’t anticipate is how dramatically compute architectures would change over that time. So instead of waiting just for 5nm or 3nm processors, the biggest players in this market are now designing their own chips to prioritize the flow of data, using heterogeneous architectures that incorporate some form of AI, much faster I/O, as well as some advanced-node compute elements.

This is essential, given the processing and resolution requirements of metaverse devices. “We’re talking about two display drivers, 4K x 4K,” said Hezi Saar, senior director of product marketing at Synopsys. “They should have a high bit rate, RGB 30 bits, 90 Hz minimum in terms of frequency. So you’re looking at about 40 or 50 gigabits per second per eye. That provides very high resolution and does not create any motion sickness or anything like that. You’ll need cameras to see the environment for you, connectivity to an XR processor, and connectivity to the display drivers, as well.”
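
Saar’s per-eye figure follows directly from those parameters. A quick check, treating “4K x 4K” as 4096 x 4096 uncompressed pixels (an assumption), lands in the same range:

```python
# Raw display bandwidth per eye, using the figures quoted above.
pixels_per_frame = 4096 * 4096   # "4K x 4K" panel, assumed uncompressed
bits_per_pixel = 30              # RGB at 10 bits per channel
refresh_hz = 90                  # minimum refresh rate cited

bits_per_second = pixels_per_frame * bits_per_pixel * refresh_hz
print(f"Raw per-eye bandwidth: {bits_per_second / 1e9:.1f} Gb/s")  # ~45.3 Gb/s
```

Link overhead or display stream compression would shift that number, but it shows why the interfaces between the XR processor and the display drivers are such a focal point.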

Saar noted that much of the innovation in this area likely will come from an XR accelerator that can integrate many functions. “The XR accelerator that connects the external world or the cameras does interpolation of cameras, takes the information from the movies or gaming or whatever from the XR processor, and overlays that in. This is a heavy XR-accelerated SoC, and for that we do see designs coming in. They require very-low-power abilities on USB, MIPI receivers, transmitters for the camera, and multiple ports for display. Low power is important. Heat is an issue. Power is always an issue. But we do see the market moving ahead.”

Many of these chips will be customized for the particular environment or use case, and most of the companies that are vying for leadership in this space are designing their own silicon for the heavy lifting. In Meta’s case, for example, the company has opted for a 3D-IC because it said the compute intensity will not be as high for augmented reality as for virtual reality, where the entire background has to be constantly refreshed. As a result, the thermal issues that have slowed full 3D-IC adoption will not be a major factor, and distances that signals need to travel will be shorter, with less resistance and capacitance.

Opportunity filled with unknowns
As with most new market opportunities, the metaverse is an evolutionary concept with non-evolutionary implications. Its lack of definition is roughly equivalent to the edge computing concept, which most chipmakers saw as a broad opportunity, but one that took nearly a decade to unfold. The edge is still being built out, and it’s not clear who will be the big winners in that space.

But the edge also will play an integral role in the metaverse, and vice versa. “Today, we are seeing many of these devices composed mostly of the display screen, cameras, mics and other sensors in the headset, but all the compute typically happens in a ‘belt pack’ or compute pack. Over time, we will see this all merged into a standalone headset. Also, we expect headsets will become a lot more ergonomic, aesthetic, and non-invasive,” said Borkar. “We have been so spoiled by our smartphones that the expected battery life would need to be in a similar range of 8 to 12 hours without a recharge when these AR/VR devices become mainstream. It is unlikely an average person would be happy to use these devices if they have to charge them frequently or be required to carry around a power bank.”

These devices also will need a scalable architecture, as well as the ability to support multi-core clusters and multithreading, Borkar said. “This is important when workload distribution is needed to address different performance requirements of algorithms,” he said. “For example, you can have lightweight AI networks running in real time with high FPS, and at the same time, another lightweight AI network also running in real time but at a much lower FPS, and then a third network that is compute heavy but operates infrequently or in a burst mode. The architecture needs to be able to juggle these workloads efficiently between multiple clusters or architectures at runtime to provide the best power and performance profile for the user.”
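
As a rough sketch of the kind of runtime juggling Borkar describes, the example below greedily places three hypothetical workloads (two lightweight real-time networks at different frame rates and one heavy, infrequent job) onto whichever cluster is currently least loaded. The workload names, cluster count, and cost figures are invented for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    fps: float    # how often the network must run, in Hz
    cost: float   # assumed compute cost per inference (arbitrary units)

# Hypothetical workloads matching the pattern described above:
# two lightweight real-time networks at different rates, one heavy burst job.
workloads = [
    Workload("hand_tracking", fps=90.0, cost=1.0),          # lightweight, high FPS
    Workload("eye_tracking", fps=30.0, cost=1.0),            # lightweight, lower FPS
    Workload("scene_understanding", fps=1.0, cost=50.0),     # heavy, infrequent
]

num_clusters = 2
cluster_load = [0.0] * num_clusters   # sustained load per cluster (units/s)

# Greedy placement: send each workload to the currently least-loaded cluster.
for w in sorted(workloads, key=lambda w: w.fps * w.cost, reverse=True):
    target = min(range(num_clusters), key=lambda i: cluster_load[i])
    cluster_load[target] += w.fps * w.cost
    print(f"{w.name:>20s} -> cluster {target} (load now {cluster_load[target]:.0f})")
```

A real scheduler would also weigh power states, data locality, and deadlines, but the basic idea is the same: match each network’s rate and compute profile to the cluster that can absorb it most efficiently at that moment.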

Speaking of FPS, Borkar notes that creating a device that doesn’t make people nauseous is a complicated endeavor. “There are several factors with regard to refresh rate,” said Borkar. “There is refresh rate on the display, head tracking and rendering pipeline, and overall latency. On the display side, it is often recommended that each display screen should display 1080p at a minimum with a refresh rate of 90 Hz. Feeding into that, the head-tracking algorithms must be able to track the user’s head movement seamlessly and accurately. Today’s SLAM algorithms using sensor fusion from cameras, IMU (inertial measurement unit) and other inputs can accurately track 6DoF positioning at a minimum of the camera frequency, which is typically 60-90 FPS. The algorithms can be further optimized to produce pose estimates at 1 kHz by relying only on IMU data. Most graphics pipelines typically run at 60-90 Hz as well.”

“Therefore, to consume this 1 kHz pose information, advanced image processing techniques of space-warps and time-warps are implemented to create intermediate images (without rendering entirely new images), giving the appearance of a very fast rendering pipeline. Last is the latency, which is usually measured as MTP (motion-to-photon) latency, or the time from the measured movement to it being registered on the screen. All these factors coming together are important to prevent a user from feeling nauseous,” he said.
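
A simplified latency budget helps show why the 1 kHz pose stream and late-stage warping matter. All of the stage timings below are assumptions chosen only to illustrate the arithmetic; real pipelines will differ.

```python
# Illustrative motion-to-photon (MTP) budget in milliseconds (all assumed).
stages = {
    "imu_sample_and_pose": 1.0,   # 1 kHz IMU-driven pose estimate
    "app_and_render": 11.1,       # roughly one 90 Hz frame of rendering
    "compositor_warp": 1.0,       # time-warp / space-warp pass
    "display_scanout": 5.5,       # about half a 90 Hz frame, on average
}

# Without a late warp, the displayed pose is the one sampled before rendering.
mtp_without_warp = sum(v for k, v in stages.items() if k != "compositor_warp")
print(f"MTP without late warp: {mtp_without_warp:.1f} ms")   # ~17.6 ms

# With a late warp, the nearly finished frame is re-projected with the most
# recent pose just before scan-out, so the pose-to-photon path skips the
# render stage.
mtp_with_warp = (stages["imu_sample_and_pose"]
                 + stages["compositor_warp"]
                 + stages["display_scanout"])
print(f"MTP with late warp:    {mtp_with_warp:.1f} ms")       # ~7.5 ms
```

The point is not the exact numbers but the structure: re-projecting an almost-finished frame with the freshest pose shortens the path from measured motion to displayed photon without rendering an entirely new image, which is the role Borkar describes for space-warp and time-warp.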

Synopsys’ Saar predicts the majority of metaverse-related semiconductors will be used in smart eyewear devices. Those devices likely will be used as an extension of a smartphone, at least in the near future, he said. The tandem nature of an eyewear-smartphone combination resolves some of the technical challenges of metaverse devices. “It will project things to the eye so you can see that you have a message or that somebody’s calling,” said Saar. “The device itself can take some pictures, but not much beyond that. Most of the time it’s powered down completely and then turns on only when there is an interrupt. That means all the computing power comes from the phone, and the connection from the phone to the goggles is mostly for information and image capturing. I think of this as a lightweight metaverse.”

Conclusion
To be sure, there is much about the metaverse’s underlying hardware that remains unknown and unpredictable. Frank Schirrmeister, vice president of solutions and business development at Arteris IP, pointed out that the tech world has yet to reach consensus on even basic questions about what the metaverse actually is. Tentative answers lead to even more questions. “How do you collect the data? How do you secure the data? How do you make sure that people don’t record things they don’t want to record?”

Regardless, Schirrmeister says the optimization of the data center component of the metaverse will be critical, along with building the appropriate infrastructure for hyperconnectivity. The monetization models might look more like something from the defense contracting world. “It’s no longer just a phone or device through which you want to get content to consumers, but you want to make it part of the bigger contract and perhaps a bigger service,” he said. “It could be one big project where you basically sell the hardware, you sell the installation, the building, you sell the software customization, and then you even sell the maintenance of data throughout that process. You sell this as a big bundle project. That’s a very different world.”

Related Reading
Wearable Electrotactile Rendering System With High Spatial Resolution, Rapid Refresh
A new technical paper titled “Super-resolution wearable electrotactile rendering system” was published by researchers at City University of Hong Kong (CityU) and Tencent Technology’s Robotics X Laboratory.
Building The Metaverse, Part Two: The Technology
The metaverse will touch every aspect of technology development, from personal devices to cloud networks.
Edge-AI Hardware For Extended Reality
Building The Metaverse, Part One: The Vision


