Self-driving and almost self-driving cars are on the road. Now they need to be refined and tested.
Wikipedia describes ADAS (advanced driver assistance systems) as systems developed to automate, adapt, or enhance vehicle systems for safety and better driving. Safety features are designed to avoid collisions and accidents by alerting the driver to potential problems, or by implementing safeguards and taking over control of the vehicle. Adaptive features may automate lighting, provide adaptive cruise control, automate braking, incorporate GPS/traffic warnings, connect to smartphones, alert the driver to other cars or dangers, keep the driver in the correct lane, or show what is in blind spots.
It is a fast-growing industry that promises to save many lives. The current state of the technology, and expectations for its future, led Volvo to outline a vision that no one would be killed or injured in a new Volvo by 2020.
Since ADAS technology is built on much of the same electronics and software foundation found in mobile and consumer devices, it is no surprise that Silicon Valley has become the epicenter of ADAS development. Google is known for its self-driving cars, with a stated mission to enable everyone to get around easily and safely, regardless of their ability to drive. Living and working in the center of Silicon Valley, I have seen Google's self-driving car prototypes on the road many times, but it seems they won't be available for purchase for a while. In the meantime, multiple car companies have been rolling out "intermediate" autonomous driver assistance systems. Tesla released its Autopilot functionality in October 2015, and ever since, YouTube has been flooded with footage of Tesla owners trying out the technology for themselves.
I have experienced the advantages of ADAS technology with my 2015 Subaru Outback. While not as spectacular as the Tesla, Subaru's EyeSight technology is absolutely great. Two cameras, one on the left and one on the right of my rearview mirror, alert me to objects that are too close to my car, but only if I am not already braking, which makes it much less obtrusive than what I have witnessed in other vehicles. The adaptive cruise control is also very helpful, especially during long drives or in heavy highway traffic: the system lets my feet rest while the car does the braking and accelerating.
So how do all these ADAS systems get developed and tested? As with many software-driven electronics products, prototyping is at the heart of getting the system right. Both virtual and physical prototyping are used by semiconductor companies, tier-one suppliers, and OEMs. Safety is integral to these systems and carries massive testing requirements. Virtual prototyping provides a target for early software development and the ability to perform fault injection testing. Physical (FPGA-based) prototyping enables software development and system validation in the context of real-world interfaces.
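To make the fault injection idea concrete, here is a minimal sketch of what such a test might look like against a simulated sensor model. All of the names here (DistanceSensor, brake_decision, the fault modes) are hypothetical illustrations, not part of any real virtual-prototyping API; real ADAS software and fault models are vastly more complex.

```python
class DistanceSensor:
    """Stand-in for a simulated forward-distance sensor in a virtual prototype."""

    def __init__(self, readings):
        self.readings = list(readings)
        self.fault = None  # None, "stuck", or "dropout" (injected fault mode)

    def read(self, i):
        if self.fault == "dropout":
            return None              # injected fault: sensor returns no data
        if self.fault == "stuck":
            return self.readings[0]  # injected fault: value frozen at first sample
        return self.readings[i]      # nominal behavior


def brake_decision(distance_m, threshold_m=10.0):
    """Toy safety logic under test: brake when the object is close or data is missing."""
    if distance_m is None:
        return True  # fail safe: no sensor data means brake
    return distance_m < threshold_m


def run_fault_injection(readings, fault):
    """Run the safety logic over a reading sequence with one fault mode injected."""
    sensor = DistanceSensor(readings)
    sensor.fault = fault
    return [brake_decision(sensor.read(i)) for i in range(len(readings))]


# Nominal run: the car brakes only once the object is closer than 10 m.
print(run_fault_injection([50.0, 20.0, 8.0], fault=None))      # [False, False, True]
# Dropout fault: the logic must fail safe and brake on every cycle.
print(run_fault_injection([50.0, 20.0, 8.0], fault="dropout"))  # [True, True, True]
```

The point of a campaign like this is the second run: by injecting faults that would be dangerous or impractical to reproduce on physical hardware, a virtual prototype lets you verify early that the safety logic degrades gracefully.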
Visit the Synopsys booth (4-360) at Embedded World, Feb. 23-25, to see our prototyping solutions and how they help develop and test ADAS systems. We will show demos of our VDK for NXP's S32V200 ADAS SoC and of our HAPS-80 physical prototyping system running an embedded vision processor to detect speed signs.