
Radar

Description

Radar is an acronym for RAdio Detection And Ranging, a technology that is both well entrenched and well understood by most engineers. In the past, radar was often pushed aside due to its limited resolution, but increasingly it has become a poster child for what’s changing in vehicles. Unlike cameras and lidar, which are based on optical principles, radar is based on radio wave propagation.

A classic radar processing chain starts with a multi-antenna front end (multiple input, multiple output, or MIMO) that transmits and receives radio waves. From the received signals, range FFTs (fast Fourier transforms), Doppler FFTs, and angles (azimuth and elevation) are extracted and processed on a processing block, such as a DSP or hardware accelerator. A 3D point cloud is then generated, which tells the vehicle where it is with respect to the surrounding environment and identifies the various objects around it within that frame.
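
As a rough illustration of that chain, the NumPy sketch below runs a range FFT, a Doppler FFT, a simple detector, and an angle FFT over a simulated ADC cube. The array shapes, the plain threshold standing in for a real CFAR detector, and the function name `process_radar_frame` are assumptions for illustration, not any vendor’s actual pipeline.

```python
# Minimal sketch of a classic FMCW radar processing chain (illustrative only).
import numpy as np

def process_radar_frame(adc_cube: np.ndarray) -> np.ndarray:
    """adc_cube: complex samples shaped (rx_antennas, chirps, samples_per_chirp)."""
    # 1. Range FFT: frequency along each chirp maps to target distance.
    range_fft = np.fft.fft(adc_cube, axis=2)

    # 2. Doppler FFT: phase change across chirps maps to radial velocity.
    doppler_fft = np.fft.fftshift(np.fft.fft(range_fft, axis=1), axes=1)

    # 3. Non-coherent integration across antennas, then a simple threshold
    #    detection (a production chain would use CFAR here).
    power = np.abs(doppler_fft).sum(axis=0)
    detections = np.argwhere(power > power.mean() + 6 * power.std())

    # 4. Angle FFT across the antenna axis at each detection gives azimuth;
    #    range, Doppler, and angle bins together form the point cloud.
    points = []
    for doppler_bin, range_bin in detections:
        spectrum = np.fft.fftshift(np.fft.fft(doppler_fft[:, doppler_bin, range_bin], n=64))
        angle_bin = int(np.argmax(np.abs(spectrum)))
        points.append((range_bin, doppler_bin, angle_bin))
    return np.array(points)

# Example: an all-noise cube of 4 antennas, 128 chirps, 256 samples per chirp.
frame = np.random.randn(4, 128, 256) + 1j * np.random.randn(4, 128, 256)
print(process_radar_frame(frame).shape)
```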

Radar is a cost-effective way to complement cameras. In a primary compute channel, camera and radar multimodality can enhance perception performance in challenging conditions. At SAE Level 3 and above, high-definition radar (and/or lidar) also can be used for redundancy. In a few years, SAE Level 3- and Level 4-capable cars may ship with as many as nine radars, including those used for interior sensing.

Depending on the type of radar instantiated on a vehicle, the ADAS functions it enables will differ (summarized in the sketch after this list). For example:

  1. Long-range radar (LRR) enables functions like object detection at longer distances (e.g., 300m over a narrow angular region) and can support automatic emergency braking (AEB), collision warnings, and adaptive cruise control.
  2. Medium-range radar (MRR) can typically detect objects up to 150m and has a wider angular region, which helps with cross-traffic alerts or detecting vehicles approaching at intersections.
  3. Short-range radar (SRR) has a very wide angular region but cannot see very far. It’s used for functions closer to the vehicle, such as cyclist and pedestrian detection, rear collision warnings, and lane change assist.
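
To make the classification above concrete, here is a hypothetical lookup table in Python. The 300m and 150m figures come from the list; the SRR range and the field-of-view numbers are illustrative assumptions, since these vary by vendor.

```python
# Hypothetical lookup of radar classes and the ADAS functions they enable.
from dataclasses import dataclass

@dataclass(frozen=True)
class RadarClass:
    name: str
    max_range_m: int       # nominal maximum detection range
    azimuth_fov_deg: int   # approximate angular region (assumed values)
    functions: tuple

RADAR_CLASSES = (
    RadarClass("LRR", 300, 20, ("object detection", "AEB", "collision warning",
                                "adaptive cruise control")),
    RadarClass("MRR", 150, 60, ("cross-traffic alert", "junction assist")),
    RadarClass("SRR", 50, 120, ("pedestrian/cyclist detection",
                                "rear collision warning", "lane change assist")),
)

def classes_covering(distance_m: int) -> list:
    """Return the radar classes whose nominal range covers a given distance."""
    return [rc.name for rc in RADAR_CLASSES if rc.max_range_m >= distance_m]

print(classes_covering(120))  # ['LRR', 'MRR'] with the figures assumed above
```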

Radar is a key component of ADAS applications, which are growing both in popularity and by mandate, even though the mandates don’t specify how ADAS is to be implemented. Implementations could range from camera-only systems to a mixture of cameras and radar. The number of radars in vehicles varies, depending on the OEM and the application.

Scalable radar schemes also are evolving alongside traditional ones. Radar is a mature technology, but for automotive use, and especially for autonomous driving, radar alone is not sufficient, so there is an ongoing effort to increase its performance. Radar’s benefits stem from the fact that it transmits electromagnetic waves and measures their reflections, deriving range from the delay and velocity from the Doppler shift. And because it is electromagnetic, it is not sensitive to light, fog, or rain. But because the information it returns is sparse compared to camera or lidar data, the industry has put a lot of effort into developing next-generation 4D radar, which measures height (elevation) as well.
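
The physics behind “delay and Doppler” is compact: range follows from the round-trip travel time of the wave, and radial velocity from its Doppler shift. A minimal worked sketch, assuming a 77 GHz carrier (a common automotive radar band):

```python
# Range from round-trip delay and radial velocity from Doppler shift.
# The 77 GHz carrier is an assumption, not tied to any specific sensor.
C = 299_792_458.0           # speed of light, m/s
F_CARRIER = 77e9            # carrier frequency, Hz
WAVELENGTH = C / F_CARRIER  # ~3.9 mm

def range_from_delay(delay_s: float) -> float:
    """Round trip: the wave travels to the target and back, hence the /2."""
    return C * delay_s / 2

def velocity_from_doppler(doppler_hz: float) -> float:
    """f_d = 2 * v / wavelength for a monostatic radar, solved for v."""
    return doppler_hz * WAVELENGTH / 2

# Example: a 1 microsecond delay puts a target at ~150 m,
# and a ~5.1 kHz Doppler shift corresponds to ~10 m/s closing speed.
print(range_from_delay(1e-6))         # ~149.9 m
print(velocity_from_doppler(5134.0))  # ~10.0 m/s
```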

How radar will hold up over time with increasing intelligence in vehicles remains to be seen, particularly with a growing number of AI-driven advanced features making their way into automotive architectures. But it serves as a proxy for how much change is still to come in vehicles.

Radar, cameras, and lidar increasingly will be connected into other systems, all of which will need to be adjusted depending upon whether vehicles offer driver assistance, limited self-driving capabilities, or full autonomous driving. Touch screens, for example, need to be managed very differently at each of those levels.

Sensor fusion involves integrating multiple types of sensors into a single chip or package and intelligently routing data to wherever it is needed. The primary goal is to bring together information from cameras, radar, lidar, and other sensors in order to provide a detailed view of what’s happening inside and outside of a vehicle.
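
As a toy illustration of object-level (late) fusion, the sketch below pairs camera and radar detections by azimuth, taking range and velocity from the radar (its strength) and the class label from the camera (its strength). The data structures, gating threshold, and nearest-neighbor association are illustrative assumptions; production systems use far more sophisticated association and tracking (e.g., Kalman filters).

```python
# Toy object-level fusion of camera and radar detections (illustrative only).
from dataclasses import dataclass

@dataclass
class RadarDet:
    azimuth_deg: float
    range_m: float
    velocity_mps: float

@dataclass
class CameraDet:
    azimuth_deg: float
    label: str

def fuse(radar: list, camera: list, gate_deg: float = 2.0) -> list:
    """Pair each camera detection with the nearest radar detection in azimuth."""
    fused = []
    for cam in camera:
        best = min(radar, key=lambda r: abs(r.azimuth_deg - cam.azimuth_deg),
                   default=None)
        if best and abs(best.azimuth_deg - cam.azimuth_deg) <= gate_deg:
            fused.append({"label": cam.label,             # from the camera
                          "range_m": best.range_m,        # from the radar
                          "velocity_mps": best.velocity_mps,
                          "azimuth_deg": cam.azimuth_deg})
    return fused

print(fuse([RadarDet(10.2, 45.0, -3.1)], [CameraDet(10.0, "pedestrian")]))
```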

Meanwhile, software-defined vehicles may need software-defined radar.

Hardware-agnostic software-defined radars are one of the key elements traditional carmakers need in order to meet the automatic emergency braking (AEB) mandate finalized in 2024 by the U.S. Department of Transportation’s National Highway Traffic Safety Administration (NHTSA), known as FMVSS No. 127.

Software-defined radars offer potentially lower-cost, scalable solutions for meeting FMVSS No. 127. These hardware-agnostic algorithms are promising because they aren’t designed to run on a specific piece of hardware. Given that the new regulation applies to entire fleets, from low- to high-end models, software-defined radars could answer some of the cost concerns that nag OEMs.
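
One way to read “hardware-agnostic” is as a stable software interface with swappable hardware back ends. The sketch below is a minimal expression of that idea; the class and function names are hypothetical, not any vendor’s API.

```python
# Hypothetical sketch of a hardware-agnostic radar stack: the perception
# code is written against an abstract front-end interface, so the same
# software can run on different vendors' chips. Names are illustrative.
from abc import ABC, abstractmethod
import numpy as np

class RadarFrontEnd(ABC):
    """Stable contract the software-defined stack is written against."""
    @abstractmethod
    def read_frame(self) -> np.ndarray:
        """Return one raw ADC cube shaped (rx_antennas, chirps, samples)."""

class VendorAFrontEnd(RadarFrontEnd):
    def read_frame(self) -> np.ndarray:
        # Stand-in for a vendor-specific driver call.
        return np.zeros((4, 128, 256), dtype=complex)

def aeb_pipeline(front_end: RadarFrontEnd):
    """Same detection code regardless of which front end is plugged in."""
    frame = front_end.read_frame()
    # ... range/Doppler/angle processing as sketched earlier ...
    return frame.shape

print(aeb_pipeline(VendorAFrontEnd()))  # swapping vendors needs no code change
```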

Transitioning from hardware-based radar to software-based radar represents a big change for a hardware-centric industry whose supply chain is built on a vertical structure. In the current ecosystem, semiconductor companies function as tier twos. Tier ones develop “a box” that contains tightly integrated software and hardware, and OEMs select the box they want.

But with hardware-agnostic software, this hierarchy of relationships begins to unravel. Software-defined radar vendors are eager to work with different tiers across the whole automotive ecosystem. Software ownership could become complicated, however.

Multimedia

Sensor Fusion Challenges In Automotive

New Approaches To Sensors And Sensing