Multi-View Fusion of Sensor Data for Improved Perception and Prediction in Autonomous Driving


Abstract "We present an end-to-end method for object detection and trajectory prediction utilizing multi-view representations of LiDAR returns. Our method builds on a state-of-the-art Bird's-Eye View (BEV) network that fuses voxelized features from a sequence of historical LiDAR data as well as rasterized high-definition map to perform detection and prediction tasks. We extend the BEV network ... » read more

Giving Cars A Bird’s-Eye View


Will the world be a better place to live in with autonomous cars driving around us? Or will it be unsafe and scary? Maybe someone was asking such a question even when the first steam-powered automobile capable of human transportation was built in 1769 [1]! As a person who likes driving, I wouldn’t want a ‘fully’ autonomous car, but I would like to get some assistance ...