Data Fusion Scheme For Object Detection & Trajectory Prediction for Autonomous Driving


A new research paper titled "Multi-View Fusion of Sensor Data for Improved Perception and Prediction in Autonomous Driving" has been published by researchers at Uber. The abstract is reproduced below.

Multi-View Fusion of Sensor Data for Improved Perception and Prediction in Autonomous Driving


Abstract "We present an end-to-end method for object detection and trajectory prediction utilizing multi-view representations of LiDAR returns. Our method builds on a state-of-the-art Bird's-Eye View (BEV) network that fuses voxelized features from a sequence of historical LiDAR data as well as rasterized high-definition map to perform detection and prediction tasks. We extend the BEV network ... » read more