Knowledge Center

Sensor Fusion

Combining input from multiple sensor types.

Description

Sensor-dependent systems, such as automobiles and robots, now rely on many sensors that use multiple sensing approaches (such as radar, lidar, and cameras) in concert. Each sensing mode is based on different physical principles and serves a distinct purpose. Those purposes sometimes overlap, but each sensor must function reliably on its own as well as in combination with the others.

Sensor fusion is the practice of bringing all the reported information from these different sensors together to form a complete model.

How this is handled is still evolving, and there are many proprietary solutions. Some approaches collect raw data from every sensor and process it centrally on high-performance AI compute. Others use smaller, lower-power inferencing engines tailored to a particular sensor's output, so the data has already been interpreted by the sensor modules before being combined centrally. The latter approach is sometimes referred to as object fusion.
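As a minimal sketch of the object-fusion idea, suppose each sensor module has already interpreted its raw data and reports an estimate of some quantity (say, range to an object) together with a variance expressing its uncertainty. Inverse-variance weighting is one standard way to combine such estimates. The function name and the numbers below are illustrative, not taken from any particular platform.

```python
# Object-level fusion sketch: each sensor module reports an already-interpreted
# (estimate, variance) pair, and the central stage fuses them by
# inverse-variance weighting. All names and values are illustrative.

def fuse_estimates(estimates):
    """Fuse a list of (value, variance) pairs by inverse-variance weighting."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused_value = sum(w * val for w, (val, _) in zip(weights, estimates)) / total
    fused_variance = 1.0 / total  # the fused estimate is tighter than any input
    return fused_value, fused_variance

# Example: radar, lidar, and camera each report range to the same object (m).
readings = [(41.8, 0.25), (42.1, 0.04), (42.6, 0.49)]  # (estimate, variance)
value, variance = fuse_estimates(readings)
```

The fused value lands between the individual estimates, pulled toward the most certain sensor (here the lidar reading, which has the smallest variance), and the fused variance is smaller than any single sensor's.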

Another aspect of sensor fusion is determining whether sensors are reporting accurately. If conditions hamper the performance of one sensor type but not others (fog, for example, creates major problems for camera-based sensors but poses little trouble for radar), can greater reliance be put on the sensors reporting with the best performance, or does the degraded sensor output create potential safety issues?
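One simple way a fusion stage can handle a degraded sensor is confidence gating: a reading whose self-reported confidence falls below a floor is excluded from fusion and flagged, so a higher-level safety monitor can decide whether the remaining coverage is acceptable. The threshold, field names, and sensor labels below are illustrative assumptions, not taken from any particular stack.

```python
# Confidence-gating sketch: readings below a confidence floor (e.g., a
# fog-degraded camera) are excluded from fusion and reported as degraded.
# Threshold and names are illustrative.

MIN_CONFIDENCE = 0.3  # below this, a reading is excluded from fusion

def gate_readings(readings):
    """Split readings into a usable set and a sorted list of degraded sensors."""
    usable = {s: r for s, r in readings.items() if r["confidence"] >= MIN_CONFIDENCE}
    degraded = sorted(set(readings) - set(usable))
    return usable, degraded

readings = {
    "radar":  {"range_m": 41.9, "confidence": 0.92},
    "lidar":  {"range_m": 42.1, "confidence": 0.88},
    "camera": {"range_m": 55.0, "confidence": 0.12},  # fog: unreliable estimate
}
usable, degraded = gate_readings(readings)
```

Here the camera reading is dropped rather than averaged in, which avoids corrupting the fused estimate; whether driving on radar and lidar alone is safe is exactly the judgment the question above leaves open.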

Architectures & hardware

One of the big challenges in sensor fusion is competing software architectures. These range from the AUTOSAR Adaptive Platform and open-source software such as Autoware, to proprietary solutions from Tesla, Waymo, and Daimler, to commercial platforms from Nvidia and Intel.

There also are competing hardware platforms from companies such as Nvidia, Intel, Visteon, and Aptiv. Each of these companies is building its own hardware platform, so there is no standard at present. Proprietary AI chips, hardware accelerators, and interface standards further complicate the landscape.

Multimedia

CXL Vs. CCIX
Simultaneous Localization And Mapping
Where Is The Edge?
Inferencing At The Edge
Tech Talk: Sensor Design