Smart Glasses, Augmented Reality, And Time-of-Flight Imaging

3D ToF can bring differentiating features to AR glasses.

By Edmund Neo and Rolf Weber

Augmented reality (AR) is frequently discussed as a source of new, exciting, vision-related products for consumer and industrial applications. One popular AR implementation uses AR glasses. While the global market for smart augmented reality glasses was only 255,600 units in 2020, market researchers predict robust growth to a volume of 8.8 million units by 2026 [1]. The key to successful adoption and a realistic user motion experience is seamless, accurate identification of the glasses' position and orientation. Time-of-flight (ToF) imaging is poised to play a significant role in this market acceptance. In fact, combined with AR, it could create the next big thing after smartphones and smart watches.
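As background, ToF imaging estimates per-pixel distance from the round trip of emitted light, either directly from the round-trip time or, in continuous-wave sensors, from the phase shift of modulated light. A minimal sketch of the underlying arithmetic (illustrative values only, not tied to any specific sensor):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def depth_from_round_trip(t_seconds: float) -> float:
    """Direct ToF: distance = c * round-trip time / 2."""
    return C * t_seconds / 2.0

def depth_from_phase(phase_rad: float, f_mod_hz: float) -> float:
    """Indirect (continuous-wave) ToF: distance from the phase shift of
    modulated light, unambiguous up to c / (2 * f_mod)."""
    return C * phase_rad / (4.0 * math.pi * f_mod_hz)

# A round trip of ~13.34 ns corresponds to roughly 2 m of distance.
print(round(depth_from_round_trip(13.34e-9), 2))  # → 2.0
```

The nanosecond scale of these round trips is why ToF imagers need fast, sensitive pixels; the payoff is a full depth map per frame rather than a single range reading.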

AR today

While virtual reality (VR) is popular among gamers, AR could become ubiquitous because everyone could use it, just like a smart phone and a smart watch. Today, several AR glasses are available, including:

  • Snap Spectacles 3 smart glasses with 3D videos, 3D photos, 3D Effects, and more ($375)
  • Vuzix Blade Upgraded Smart Glasses ($799.99 with prescription optional)
  • Facebook and Ray-Ban Stories (starting at $299)
  • Magic Leap lightweight, wearable computer (Magic Leap 2 coming soon)
  • Varjo XR-3 mixed reality headset
  • Lenovo ThinkReality A3 Smart Glasses ($1499.99)
  • Xiaomi Smart Glasses (announced Sept. 14, 2021)

Today’s common AR glasses features include:

  • Acting as display and control for applications, smartphone, and/or TV on the go
  • Wi-Fi/Bluetooth connection
  • Microphone/speakers (take calls, listen to music)
  • RGB camera (recording, conference call)
  • Iris recognition as password for glasses

Other, not so common, features include:

  • OLED display (video conference call, simple augmented reality)
  • Laser projection of the image directly into the eye
  • Inertial measurement unit (IMU) for simultaneous localization and mapping (SLAM)
  • 3D-ToF
  • Hand tracking
  • Thought tracking

Many people might remember Google Glass, initially available May 15, 2014. It suffered from several issues, including privacy concerns and the unattractive appearance of glasses bulky enough to house the technology plus the batteries needed for adequate battery life. Other issues included weight, processing power requirements, and cost.

While improvements are being made to address these previously recognized issues, adding time of flight can provide the extra functionality needed to turn an interesting concept into a must-have product. With the improved sensitivity of ToF imagers and the reduced power consumption of higher-performing microcontrollers, designer sunglasses with AR and ToF functionality are already offered for a few hundred dollars.

Potential AR use cases

There are several compelling use cases for AR glasses. Since they can act as a display, they can do everything a phone can do, with the added convenience of not having to hold the phone and tie up one or both hands. With a ToF imager, night vision could be added to the glasses, and a thermal camera could add heat-sensing capability. Today's compelling use cases for AR include object recognition and object explanation. With object recognition, an object (or a person) can be cut out of a scene and placed in front of a blurred (the bokeh effect) or an artificial background. AR has also proven its value in learning and in displaying repair manuals/instructions for service personnel, police, firefighters, security, and other first responders.
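The cutout/bokeh use case is a natural fit for depth data: instead of guessing the subject boundary from color, the depth map separates near from far directly. A hypothetical sketch (the function name and 1.5 m threshold are illustrative, not from any product):

```python
import numpy as np

def cutout_by_depth(rgb: np.ndarray, depth_m: np.ndarray, near_m: float = 1.5):
    """Depth-based subject cutout. rgb is HxWx3, depth_m is HxW in meters.
    Returns the image with the background zeroed out, plus the mask."""
    mask = depth_m < near_m            # True where the near subject is
    out = np.zeros_like(rgb)
    out[mask] = rgb[mask]              # copy only foreground pixels
    return out, mask
```

In a real product the background would be blurred or replaced rather than zeroed, but the depth threshold does the work that color-based segmentation struggles with.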


Fig. 1: 3D-ToF sensing adds depth and position detection information to AR.

While some applications do not need position detection, adding it makes AR glasses more attractive and desirable to the consumer. Other capabilities could include zoom vision, taking pictures, and recording video of objects of interest, even while driving (without the driver-distraction aspect), so the user has a recording and can reference the situation prior to an accident.

When a pending incident occurs, such as a car unsafely approaching the vehicle, even from behind, the glasses could capture that event and provide the legal proof needed to confront the offending driver. Also in the vehicle, AR glasses may help drivers see text without looking down while driving. With a voice-to-text feature, they can see the text directly in front of them through the glasses. This could help resolve the problem that, despite all the great Bluetooth and hands-free speakers on the market, drivers still try to text while driving.

Perhaps one of the more compelling recently reported use cases involves ToF sensing to detect hidden spy cameras [2]. Applications include locations such as hotel rooms, temporary rentals, restrooms, and more. Using ToF sensing, researchers designed and implemented an app to automatically detect and localize hidden cameras in real time. With the threat to individual privacy being a global problem, this could be a killer app.
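One intuition behind such detection (a simplified sketch, not the actual pipeline from reference [2]): a camera lens can retro-reflect the ToF sensor's own infrared illumination, showing up as a small cluster of unusually bright pixels in the ToF amplitude image. The function below is a hypothetical illustration of that first filtering step:

```python
import numpy as np

def bright_spot_candidates(amplitude: np.ndarray, k: float = 5.0) -> np.ndarray:
    """Flag pixel coordinates whose return amplitude is far above the
    scene average (mean + k standard deviations). Illustrative only;
    a real detector would also check depth consistency across angles."""
    threshold = amplitude.mean() + k * amplitude.std()
    return np.argwhere(amplitude > threshold)
```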

With all these possible additions, AR glasses should be a great platform for the future. They take AR glasses well beyond gaming and apply them to everyday life. In a few years, the phone form factor could go away, replaced by a smart watch or pocket fob with the display in the AR glasses. While some people do not like to wear glasses, sunglasses seem to overcome that barrier. Glasses that adapt to sunlight and then automatically adjust to indoor vision requirements (another feature) could solve that problem.

The value of depth data

For those not yet ready to embrace the potential capabilities that 3D-ToF can bring to augmented reality, consider this "never say never" example: Elon Musk's long-standing position against LiDAR as a required, or even useful, sensor in Tesla vehicles for advanced driver assistance systems (ADAS) and autonomous driving. His vehement opinion was that radar and cameras would be sufficient and that Tesla did not need the 3D sensing LiDAR provided.

While others, including Waymo and numerous potential suppliers of LiDAR sensors and systems, moved forward to make LiDAR viable for vehicle use, Musk remained noted for his reluctance to pursue the technology. He was even described as waging a war against LiDAR [3]. This year, Musk apparently resolved the conflict, partnering with Luminar, a LiDAR supplier, to test and develop LiDAR technology [4]. This shows that no matter what position has been established, once a compelling reason or argument is presented, direction-changing decisions can be made.

In AR glasses, the ToF approach's non-visible illumination source, the illuminator, could enable a night vision feature. The RGB camera that handles normal vision/camera purposes does not function in the dark. With 3D-ToF supplementing the camera, the user can walk in the dark and still use the other AR glasses features.

Another important AR feature is object recognition. RGB cameras do this by edge detection, which works only with appropriate lighting conditions and sufficient contrast between objects. With its depth information, 3D-ToF provides a more robust solution to these two limitations of RGB cameras.
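A toy illustration of that limitation (assumed values): two objects with nearly identical brightness but different distances. An intensity-based edge detector finds no usable boundary, while a depth discontinuity locates it immediately:

```python
import numpy as np

# One row of pixels crossing an object boundary: brightness is almost
# flat, but the depth map jumps from 1 m to 3 m at the object edge.
intensity = np.array([100, 101, 100, 101, 100, 101], dtype=float)
depth_m   = np.array([1.0, 1.0, 1.0, 3.0, 3.0, 3.0])

intensity_edges = np.abs(np.diff(intensity)) > 10   # contrast threshold
depth_edges     = np.abs(np.diff(depth_m)) > 0.5    # depth-jump threshold

print(intensity_edges.any())  # → False: no usable contrast edge
print(depth_edges.any())      # → True: boundary found at the depth step
```

Real segmentation pipelines are far more elaborate, but the depth cue remains valid in low light and low contrast, which is exactly where RGB edge detection fails.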

Wanted: Visionaries to see the future

With rapidly improving processing power, lower power consumption, and longer battery life available, now is the right time to consider the cool applications for AR glasses. A Superman or "kitchen sink" class of AR glasses could provide the next big thing for attracting users.

Some of the possibilities have been presented here and will hopefully inspire more, and even provide the impetus to implement them. Now is the time for today's, and even more importantly tomorrow's, AR glasses visionaries to consider how 3D-ToF can differentiate their products for future customers.

References

[1] https://www.strategyr.com/market-report-smart-augmented-reality-ar-glasses-forecasts-global-industry-analysts-inc.asp

[2] S. Sami et al., “LAPD: Hidden Spy Camera Detection using Smartphone Time-of-Flight Sensors,” https://dl.acm.org/doi/10.1145/3485730.3485941

[3] Brad Templeton, “Elon Musk’s War On LIDAR: Who Is Right And Why Do They Think That?” https://www.forbes.com/sites/bradtempleton/2019/05/06/elon-musks-war-on-lidar-who-is-right-and-why-do-they-think-that/?sh=13d6f17f2a3b

[4] https://www.theverge.com/2021/5/24/22451404/tesla-luminar-lidar-elon-musk-autonomous-vehicles

Edmund Neo is senior product manager of Power and Sensor Systems at Infineon Technologies.

Rolf Weber is principal engineer for Time of Flight Applications at Infineon Technologies.


