Sensors Enable ADAS

Advanced driver assistance requires a careful and complex balance of hardware, software and security.

Under the hood, cars of today look nothing like those of a few decades ago. There are sophisticated safety and drivetrain monitoring features, software for interpreting and interacting with the outside world and modifying the inside environment, and a host of features that might have seemed impossible or even ridiculous in the past. And there’s much more to come.

Advanced driver assistance systems (ADAS) are at the heart of these advancements, and there is a tremendous amount of work happening in the industry today aimed at providing drivers with unprecedented levels of safety, comfort, and infotainment. A number of sensors make ADAS possible, including image and camera sensors for vision-based features, ultrasonic sensors for short-range features such as parking assist, and radar sensors.

All are required, given the variety of use cases. On a bright, sunny day, vision-based pedestrian detection works very well. At night, or when it is raining or foggy, a scheme that relies only on image sensors does not, and that is where radar- or LiDAR-based sensing comes into play. In conditions such as rain or snow, pedestrian detection and cruise control depend instead on long-range or short-range radar. These are all ADAS functions, but they employ different chips: a radar-based SoC in some cases, an image-processing SoC in others, and often both in the same vehicle, depending on the application.
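To make that concrete, here is a minimal, purely illustrative C++ sketch of how a perception stack might weight camera input against radar input as conditions change. The condition flags, weights, and function names are hypothetical and are not drawn from any production ADAS system.

// Hypothetical sketch: weighting camera versus radar confidence by driving
// conditions. The flags, weights, and names are illustrative only and are
// not taken from any production ADAS stack.
#include <iostream>

struct Conditions {
    bool daylight;
    bool precipitation;  // rain, snow, or fog
};

struct SensorWeights {
    double camera;
    double radar;
};

// Favor the camera in clear daylight; lean on radar when vision degrades.
SensorWeights selectWeights(const Conditions& c) {
    if (c.daylight && !c.precipitation) {
        return {0.7, 0.3};   // clear day: vision carries most of the load
    }
    if (!c.precipitation) {
        return {0.3, 0.7};   // clear night: radar takes over
    }
    return {0.1, 0.9};       // rain, snow, or fog: radar carries detection
}

int main() {
    Conditions night_rain{false, true};
    SensorWeights w = selectWeights(night_rain);
    std::cout << "camera weight: " << w.camera
              << ", radar weight: " << w.radar << "\n";
    return 0;
}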

Ron DiGiuseppe, senior strategic marketing manager for the solutions group at Synopsys, noted that ADAS is where much of the action is in semiconductor design these days. It is also the fastest-growing application within automotive, according to market researchers at IHS Automotive, who forecast that ADAS functions, including adaptive cruise control, blind spot detection and lane departure warning, will grow strongly, some at 25% per year or more.

But ask five different Tier 1 vendors developing these systems and they will each define ADAS differently. “Usually they manipulate to make it look as if they own all the market share, so some include ultrasonic, some include vision surround, and some don’t,” said Andy Macleod, director of automotive marketing at Mentor Graphics.

In support of ADAS, other big changes have occurred within vehicle network architectures, said Andrew Patterson, director of automotive business development at Mentor Graphics. One is A2B, an audio bus technology that allows audio signals from phones, MP3 players, and warning messages to travel over a single cable in the vehicle, so that if a pedestrian detection alert or a lane departure warning is raised, the driver can be notified immediately.

Another technology being implemented in vehicles for ADAS is Ethernet, particularly in its twisted-pair form, given its high bandwidth for collecting video images, its ability to transmit data at high speed, and its low cost to implement.

Fundamentally, ADAS depends on receiving signals accurately, quickly, and with low power, but not everything has to be done at long distance. Some of these communications span only inches or a few feet, which means a variety of technologies is needed, ranging from near-field communications to mid-range and long-range communications, each with a different level of accuracy.

“If you take NFC out of a chip and create a separate chip set, with active load modulation, you can go one or two meters,” said Jawad Haider, product marketing manager at Marvell. “This is a big difference from using your smart phone at the airport, where you walk through the screening area and it doesn’t match up. If you can pair this with a secure element outside the chip, you can run different certifications.”

Part of the solution here involves different antenna technologies for different purposes, as well. Antennas have come a long way over the years, and even further in the past couple of years. “They used to be a very long coil,” said Haider. “You can now build one or more into a smart watch.”

Sensor fusion
Given that there are now multiple sensors in a vehicle, different types of data are arriving and must be sorted out to give the driver the safety information they need. Here, sensor fusion is the name of the game. Because there are different kinds of radar and image sensors, there is an effort to centralize these ADAS functions, which in many applications are distributed around the car across radar ECU modules, image-processing ECU modules and back ECU modules.

“That’s a lot of distributed SoCs, and there is a trend to consolidate them in one module, in which case you have a lot of sensor data from different sensors, and to handle all of that sensor data input — that’s sensor fusion,” said DiGiuseppe. “We want to consolidate the data.”

One strategy here is to offload that work from the host processor to a separate embedded processor, so the host doesn’t have to manage the raw sensor data.
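As a rough illustration of that consolidation, the following C++ sketch shows a central module fusing range estimates from distributed radar and camera front ends with a simple confidence-weighted average. The data structures, confidence values, and weighting scheme are assumptions for illustration only; a real fusion stack uses far more sophisticated filtering.

// Minimal sensor-fusion sketch: a central module merges range estimates
// from distributed radar and camera front ends using a confidence-weighted
// average. All types, values, and names here are illustrative assumptions.
#include <iostream>
#include <vector>

struct Detection {
    double range_m;     // estimated distance to the object, in meters
    double confidence;  // 0.0 .. 1.0, reported by the sensor front end
};

// Confidence-weighted average of per-sensor range estimates.
// Returns -1.0 if no sensor reported a usable detection.
double fuseRange(const std::vector<Detection>& detections) {
    double weighted = 0.0;
    double total = 0.0;
    for (const Detection& d : detections) {
        weighted += d.range_m * d.confidence;
        total += d.confidence;
    }
    return total > 0.0 ? weighted / total : -1.0;
}

int main() {
    std::vector<Detection> inputs = {
        {24.8, 0.9},  // radar: robust in rain, coarse lateral resolution
        {23.5, 0.4},  // camera: degraded by the same rain
    };
    std::cout << "fused range: " << fuseRange(inputs) << " m\n";
    return 0;
}

In practice a Kalman filter or similar estimator would replace the simple average, but the consolidation pattern is the same: many front ends feeding one central estimator.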

Pierre-Xavier Thomas, design engineering group director at Cadence, agreed. “As vehicles grow in complexity, there will be increasing integration of heterogeneous sensors, especially in the application stack where, going forward, you see more and more applications that are integrating different types of sensors for different types of applications for things like safety and comfort. With this now comes a system challenge of putting together the sensors, and system optimizations of those sensor integrations, where that puts some constraints on the sensors because you have to integrate them specially.”

At the same time, quality and accuracy must be considered alongside power, because while more processing power is attractive, there are still the limits of physics and the cost of the car. So the challenge becomes what can be done today to make enough of a difference in the application while still being feasible to build. And with increasingly complex software crunching the data, sensor fusion becomes a very sophisticated system and software problem to solve, he said.

“From an architecture standpoint, often when you talk about the sensor, you have some digital signal processing that is very close to the sensor, and that is actually processing the signal and doing a lot of intelligent things — sometimes exactly for better resolution, but also lower power, because you want those sensors to transmit and receive lower energy signals. But because they are lower energy, you need to be able to do more signal processing in order to condition the signal and get the information you want. Usually this digital signal processing aspect of it is very close to the sensor, and then in the car, they are dispatched in the right locations from where you want to send something. Then, those signals coming from different sensors are driven into a central unit, where you have the sensor fusion at a higher level of the stack where you make some decisions, so this is where there is another layer of software. This is also where there are a lot of safety requirements because this is where the decisions are made. Many times today, it is the car OEM developing the sensor fusion software because it provides some user experience and they choose how they want to configure the system. Once they figure out how to get more sensors into the car, then they have to decide what to do with that information, balanced by the cost impact considerations,” Thomas explained.
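The near-sensor conditioning Thomas describes can be as simple as smoothing a noisy, low-energy return before it is forwarded to the central fusion unit. The sketch below shows one hypothetical way to do that with a short moving-average filter; the window size and sample values are invented for illustration.

// Illustrative sketch of near-sensor conditioning: a short moving-average
// filter smooths a noisy, low-energy radar return before it is forwarded
// to the central fusion unit. Window size and sample data are assumptions.
#include <cstddef>
#include <deque>
#include <iostream>

class MovingAverage {
public:
    explicit MovingAverage(std::size_t window) : window_(window) {}

    // Add one raw sample and return the smoothed value so far.
    double update(double sample) {
        samples_.push_back(sample);
        sum_ += sample;
        if (samples_.size() > window_) {
            sum_ -= samples_.front();
            samples_.pop_front();
        }
        return sum_ / static_cast<double>(samples_.size());
    }

private:
    std::size_t window_;
    std::deque<double> samples_;
    double sum_ = 0.0;
};

int main() {
    MovingAverage filter(4);
    // Made-up return amplitudes from a low-power transmit burst.
    const double raw_returns[] = {0.21, 0.35, 0.18, 0.30, 0.27};
    for (double raw : raw_returns) {
        std::cout << "conditioned sample: " << filter.update(raw) << "\n";
    }
    return 0;
}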

Further, Adam Sherer, product management group director for automotive safety in the Systems Verification Group at Cadence, said that sensor fusion for ADAS systems is a complex mixed-signal problem with ASIL-D safety requirements. “This implies the need for careful requirements development, system modeling, and verification traceability to ensure high initial and lifetime system quality.”

In many ways this functions like a complex computing environment. As cars are increasingly connected and required to do more, they need all of the standard parts to make that happen. Memory is one of those. But that memory also needs to be able to withstand extreme temperatures, and it needs to be secure.

“What’s changing is the need to do a lot of real-time computation in automobiles,” said Jen-Tai Hsu, vice president of engineering at Kilopass Technology. “Storage is required for those applications, which makes it more and more important to have non-volatile memory. But that memory also needs to withstand harsh temperatures. When you look at OTP (one-time programmable) versus embedded flash, OTP can withstand much higher temperatures. It’s also more secure, and with smart car applications, security is the ultimate concern.”

All of this computing and connectivity adds some new wrinkles to automobiles as well, notably around security. One area that has not been well thought out is the effect of obsolescence.

“For a cell phone, most of them get replaced in two to four years,” said Paul Kocher, president and chief scientist for Rambus’ Cryptography Research division. “For durable goods and automobiles, you’re looking at a 20- to 30-year lifespan in many cases. That means whatever technologies get put in, at least from a hardware perspective, cannot be inexpensively changed over that lifespan. And also, the software is not easy to maintain. The development teams and the tools are quite expensive to keep together. So there are a lot of challenges and threats when you put a device online. One is that your device can be breached from the outside. The other is that your device can pose a threat to other devices over the network.”

How great that risk will become is still unknown, and ADAS is only at the starting line for this new technology.

“There are a number of benefits to putting your car online,” Kocher added. “You can see maps online and you can get firmware updates. But you also have a set of risks that you create. Ultimately, if those risks exceed the benefit of connectivity, then the online product is not going to be better than the traditional one.”


