What needs to be tested, and what’s the best way to make that happen?
While self-driving vehicles are beta-tested on some public roads in real traffic situations, the semiconductor and automotive industries are still getting a grip on how to test and verify that vehicle electronics systems work as expected.
Testing can be high stakes, especially when done in public. Some of the predictions about how humans will interact with autonomous vehicles (AVs) on public roads are already coming true, but human creativity is endless. There have been attacks on Waymo test vehicles in Arizona, a DUI arrest of a Tesla driver asleep at 70 mph on a freeway, and hacks using oranges and aftermarket gadgets to trick Tesla’s Autopilot into thinking the driver’s hands are on the wheel. But are those unsafe human behaviors any more dangerous than the drumbeat of technology hype, unrealistic marketing, and a lack of teeth in regulating the testing of AVs on public roads, in the factory, and in the design lab?
No wonder some people are angry at being put at increased risk by the unpredictability of experimental vehicles being tested on public roads. So are there ways to test these vehicles without putting people at risk? Verifying the hardware gets most of the attention, but in an autonomous vehicle it is the software interacting with that hardware that defines the car’s behavior.
Throwing rocks at the problem
Sometimes working the engineering problems of automotive verification may feel as effective as throwing rocks at a Waymo car.
“With or without regulations, we have to rely on the car companies, which are running this grand public experiment—and I’m not seeing any public engagement,” said Philip Koopman, associate professor of electrical and computer engineering at Carnegie Mellon University, who specializes in safety testing and validation of autonomous vehicles. “If they’re being transparent as they say they are, there should be reports saying, ‘Our vehicles had X many more bumps than human-driven ones and here’s why.'”
Standards and definitions that apply across the whole automotive market would help, but chipmakers are still working out efficient ways to test complex systems with tools designed for low-level verification, which scale poorly to flows that cross multiple execution platforms and processors, according to Dave Kelf, vice president of marketing for Breker Verification Systems.
The kind of closed-loop verification likely to be required for AV component testing is beyond the reach of traditional test methodologies and discrete verification. Modeling those processes from the top down could make component-level testing more efficient and scale upward to integrate with functional safety testing of the whole vehicle, but it will take time to get there.
“Right now the focus of the big EDA companies is mostly on hardware that can run software and make sure it’s safety-compliant, not looking at the software itself, which gets more complex as you get more into machine learning,” Kelf said. “At some point it will be more efficient to look at it from the top using a Portable Stimulus scenario that matches a set of requirements you can pass down through verification, which would be more efficient than starting at the bottom and trying to get more abstract as you scale up.”
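As a rough illustration of that top-down idea, the sketch below shows how a system-level scenario could be decomposed into component-level test actions that each trace back to a requirement. It is a simplified stand-in in Python, not Breker’s actual Portable Stimulus flow, and the scenario, action names, and requirement IDs are hypothetical.

```python
# Illustrative only: a system-level scenario decomposed into
# component-level actions, each traceable to a requirement ID.
from dataclasses import dataclass, field

@dataclass
class Action:
    name: str          # component-level verification action (hypothetical)
    requirement: str   # requirement it traces back to (hypothetical ID)

@dataclass
class Scenario:
    name: str
    actions: list = field(default_factory=list)

    def to_tests(self):
        # Each action becomes a test that can be retargeted to whatever
        # execution platform is available (simulation, emulation, silicon).
        return [f"test_{a.name} [covers {a.requirement}]" for a in self.actions]

emergency_braking = Scenario(
    "emergency_braking",
    actions=[
        Action("radar_detects_obstacle", "REQ-SENSE-012"),
        Action("fusion_flags_collision_risk", "REQ-FUSE-007"),
        Action("brake_ecu_commands_stop", "REQ-ACT-003"),
    ],
)

for test in emergency_braking.to_tests():
    print(test)
```

The point of working this way is traceability: every generated test can be reported against the requirement it covers, rather than coverage being reconstructed bottom-up after the fact.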
As a vehicle moves up the scale of autonomy, the requirements go up for everything related to ADAS, according to Derek Floyd, director of business development at Advantest. The biggest part of that is integration: adding more types of sensors to feed the ADAS, bringing the various powertrain sensors together so the ADAS or another microcontroller can keep power use efficient, or encrypting performance data so the whole car can’t be hacked, for example.
“You’d want to be able to train the electronics for more efficient control and preventive maintenance, so the machine-learning logic and decision-making wouldn’t just be focused on what to do with the steering wheel to handle a particular turn,” Floyd said. “Managing performance and power would be an integral part of that decision. As for personal comfort for drivers, performance integration—when it comes to testing there’s not a lot of difference in what you’re looking for. You’re still driven by a focus on safety, but they are looking for parts per billion in terms of expected failure rate, so test consistency and confidence has to be pretty high.”
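To put “parts per billion” in perspective, here is some back-of-the-envelope arithmetic; the chip count, fleet size, and failure rate are illustrative assumptions, not Advantest’s numbers.

```python
# Illustrative arithmetic: why parts-per-billion failure rates matter.
chips_per_vehicle = 3_000     # safety-relevant ICs per car (assumption)
vehicles = 10_000_000         # fleet size (assumption)
failure_rate_ppb = 10         # failures per billion chips (assumption)

expected_failures = chips_per_vehicle * vehicles * failure_rate_ppb / 1e9
print(f"Expected failing chips across the fleet: {expected_failures:.0f}")
# Even at 10 ppb, a 10-million-vehicle fleet with 3,000 chips per car
# still sees roughly 300 failed parts, which is why test escape rates
# have to be driven so low and test confidence has to be so high.
```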
Simulation and testing of how an AV drives isn’t nearly that concrete or detailed yet, so progress on standard performance metrics and standard specifications has been relatively slow so far, according to Chad Partridge, CEO of simulation-testing provider Metamoto.
Most AV developers are moving as quickly as they can on a complex set of problems, including developing the functional and safety testing methodologies and technology needed to demonstrate that a particular solution actually works.
“There is some convergence on test tracks, and a lot of groups have said they’re interested in sitting down to figure out a consistent direction,” Partridge said. “The greatest progress I’ve seen is from SAE, which is circulating a number of documents about specifications and new techniques for test and verification.”
But industry standards need to play catch-up. “There are standards out there: JEDEC standards, ISA norms, IEEE standards, and so on. But some of them were defined 30 years ago,” said Roland Jancke, head of the department for design methodology for Fraunhofer’s Engineering of Adaptive Systems Division. “In the functional safety domain, it’s set in these standards that you need to give a failure rate for your component. But the failure rate that these suppliers are giving comes from tables that have been set up some 30 years ago. So they are no longer valid or no longer meaningful, and therefore these standards also need to be improved, need to be adapted to the technologies that we have today.”
Hiding behind older standards is sometimes a tactic. “The technology provider can always say, ‘Well, we did all we have to do according to the standards,’ but it’s still failing because the standard is not really what we need today,” Jancke said. All the industry players involved in automotive, including semiconductor designers and manufacturers, need to work together on improving standard test procedures as well. And there is progress. “There is work on that from the OEM side, and also from the foundry side. They are sitting together and improving these standards.”
Testing doesn’t stop when AVs finally hit the road in whatever form. “The other critical need is to ensure that ICs are regularly tested over the lifetime of the part to ensure any degradation is detected and can be mitigated to avoid unsafe situations from arising,” said Brady Benware, senior marketing director for the Tessent product group at Mentor, a Siemens Business. “Especially challenging will be the move to level 4 and 5 autonomous driving, which will eliminate driver fallback in the case of a failure of the autonomous drive system.”
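What that lifetime testing might look like from the software side is sketched below, assuming each safety-relevant IC exposes some built-in self-test (BIST) hook. The device names and functions are placeholders for illustration, not Mentor’s Tessent APIs.

```python
# Hypothetical periodic in-field self-test loop; all names are placeholders.
import time

SAFETY_ICS = ["radar_mcu", "brake_ecu", "central_compute"]  # assumed devices

def run_bist(device_id: str) -> bool:
    """Trigger the device's built-in self-test and return pass/fail."""
    return True  # placeholder; real code would use the IC's test-access hardware

def degrade_to_safe_state(device_id: str) -> None:
    """Mitigation: disable the affected function or switch to a backup."""
    print(f"{device_id}: degradation detected, switching to safe state")

def periodic_self_test(interval_s: float = 3600.0) -> None:
    while True:
        for dev in SAFETY_ICS:
            if not run_bist(dev):
                # Degradation found in the field: mitigate before it becomes unsafe.
                degrade_to_safe_state(dev)
        time.sleep(interval_s)
```

At Level 4 and 5 there is no driver to fall back on, so a failed self-test has to trigger an automatic mitigation, such as switching to a redundant unit or bringing the vehicle to a safe stop.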
[Image: SAE J3016 levels of driving automation. Source: SAE International, https://www.sae.org/standards/content/j3016_201806/]
The failover challenge
One thing that remains unclear is how failover systems will be classified and tested. Under ISO 26262, the goal is to be able to guide a failing autonomous vehicle off the road safely, the so-called “graceful failover,” and that may pull in other systems within a vehicle that are not considered safety-critical today. This is particularly true for infotainment systems, which could serve as a backup control source, albeit a less robust and less powerful one. Automakers have been pushing in this direction because it reduces the number of redundant parts, but these systems have never been tested at the level of other safety-critical systems.
“With vehicles moving quickly toward autonomous driving, the consumer wants to have the same security, reliability, and high performance of the connection in the car that they have at home or in the office, and semiconductor providers are working overtime to meet these expectations,” said Avinash Ghirnikar, director of technical marketing, Connectivity Business Group at Marvell. He said 802.11ax will have a strong role in this regard, because it addresses connectivity inside the cabin as well as to the outside world. “This can meet the requirements for in-vehicle infotainment inside the cabin, or it can also meet the requirements in a telematics unit, which is actually wireless connectivity outside of the car. The challenges in an automotive environment are huge, and that’s where MIMO also comes into play. MIMO provides a much more robust connection between two endpoints. If you have MIMO in your car, all of your media distribution and so on also gets to the next level of reliability and robustness.”
But does that now have to be tested at the same level as other safety-critical systems? And how does one go about testing the integrity of over-the-air updates for any of these systems, which will be required because autonomous and semi-autonomous systems need to communicate with each other using compatible algorithms?
“The first thing you need to consider is whether you are even updating the correct device,” said Martin Scott, CTO of Rambus. It gets more difficult from there, too. “All of this has to be secure, resilient and low cost.”
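A minimal sketch of the two checks Scott describes, confirming the update is aimed at the correct device and that the payload is authentic, might look like the following. The manifest layout, key handling, and choice of Ed25519 signatures are assumptions for illustration, not any particular vendor’s scheme.

```python
# Hypothetical OTA update check: right device + authentic payload.
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def verify_update(manifest: dict, payload: bytes, device_id: str,
                  hw_rev: str, oem_public_key: Ed25519PublicKey) -> bool:
    # 1. Are we even updating the correct device?
    if manifest["target_device"] != device_id or manifest["hw_rev"] != hw_rev:
        return False
    # 2. Does the payload match what the signed manifest describes?
    if hashlib.sha256(payload).hexdigest() != manifest["payload_sha256"]:
        return False
    # 3. Was the manifest really signed by the OEM's key?
    signed_fields = "|".join(
        [manifest["target_device"], manifest["hw_rev"], manifest["payload_sha256"]]
    ).encode()
    try:
        oem_public_key.verify(bytes.fromhex(manifest["signature"]), signed_fields)
    except InvalidSignature:
        return False
    return True
```

Resilience and cost, the other constraints Scott mentions, mostly show up elsewhere, for example in keeping a known-good image to roll back to if a verified update still fails to boot.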
These systems also need to be architected to include those updates. “This is a mechanically focused space,” said Lakshmi Manyam, vice president of Arm’s automotive unit. “There are body electronics, mechanical seats, but there has not been much evolution in software. With in-vehicle infotainment and new ADAS and autonomous features, you need to build headroom into the processing. They need to be designed with over-the-air updates in mind. You will see that come into the market in the future.”
Conclusion
Much of this is brand-new technology, and that in itself raises all sorts of issues. Verification and testing using existing methodologies, tools and approaches have worked well for semiconductors over the past few decades, but how well they work with systems that are under extreme stress, and with AI systems that learn as they go, isn’t known at this point.
“This is the beginning of creating an understanding of these issues,” said Burkhard Huhnke, vice president of automotive strategy at Synopsys. “It might be a core competence of car companies in the future. This is changing. There are concrete examples of car companies creating processor models using virtual prototypes. They have started to use hardware-software co-design for automotive electronics. But methodologies like testing of algorithms need to be highly automated if we’re to get to the next level, which is self-testing and repair.”
—Susan Rambo and Ed Sperling contributed to this report.