Why there is growing concern about L3 and L4 autonomy.
There is a growing consensus in the semiconductor industry that SAE Level 3 and Level 4 autonomy will be full of unexpected hazards.
At a number of recent conferences in Silicon Valley, experts from all parts of the semiconductor industry have voiced concern about those middle steps between assisted driving and full autonomy. This isn’t the public position taken by carmakers and Tier 1s. The race to full autonomy is underway, and the first one to the finish line wins big.
But how to get there isn’t so clear, despite a well-organized and highly detailed blueprint. In fact, there is a fundamental disconnect between the makers of electronics and the carmakers, and that divide appears to be deepening.
As anyone familiar with the various SAE levels of driving automation knows, a Level 3 vehicle can steer, accelerate, brake and pass other cars without human involvement, and it is supposed to be able to maneuver around obstacles. In some cases, the driver of a Level 3 car can take their hands off the steering wheel. A Level 4 vehicle, meanwhile, should be able to drive itself on some roads. Level 5, of course, is total autonomy, where there might not even be a steering wheel.
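As a rough sketch of how those distinctions might be expressed in software, the hypothetical C++ snippet below maps a few SAE levels to who is driving and who serves as the fallback. The type names, fields and notes are illustrative assumptions only, not taken from SAE J3016 or any production codebase.

```cpp
// Hypothetical sketch: a coarse mapping of SAE levels to responsibilities.
// Names, fields and notes are illustrative only, not from SAE J3016 itself.
#include <cstdio>

struct LevelProfile {
    int         level;             // SAE level number
    bool        systemDrives;      // system handles steering and speed in its design domain
    bool        driverIsFallback;  // human must be ready to retake control on request
    const char* note;
};

constexpr LevelProfile kProfiles[] = {
    {2, true,  true,  "driver supervises continuously, hands on or hovering"},
    {3, true,  true,  "driver may disengage, but must retake control when asked"},
    {4, true,  false, "system is its own fallback, but only on some roads/conditions"},
    {5, true,  false, "full autonomy; a steering wheel may not even be present"},
};

int main() {
    for (const auto& p : kProfiles)
        std::printf("Level %d: %s\n", p.level, p.note);
    return 0;
}
```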
There are several problems with this approach. First, individual subsystems do not make a system, no matter how well they are planned out. An electronic control unit for avoiding objects on the road may work perfectly well with another ECU that controls acceleration and deceleration, but there are many variables these systems cannot account for. For one thing, carmakers have been so secretive about their technology that one OEM's implementation may not behave the same as another's. And even if both comply with the same standards, those differences may be significant.
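To make that integration gap concrete, here is a hypothetical sketch, not any OEM's actual code, of two ECUs that each behave sensibly in isolation but disagree about the sign convention of a shared signal. The functions, units and numbers are assumptions chosen for illustration.

```cpp
// Hypothetical sketch of the integration gap: two ECUs that each pass their
// own tests, but make different assumptions about the same signal.
#include <algorithm>
#include <cstdio>

// Vendor A: hazard-avoidance ECU requests braking as a positive deceleration (m/s^2).
double hazardAvoidanceRequest(double distance_m, double closing_speed_mps) {
    if (closing_speed_mps <= 0.0) return 0.0;             // not closing on the obstacle
    return (closing_speed_mps * closing_speed_mps) / (2.0 * distance_m);
}

// Vendor B: longitudinal-control ECU expects a signed acceleration command,
// where braking is *negative*. A positive request is treated as "speed up".
double longitudinalCommand(double accel_request_mps2) {
    return std::clamp(accel_request_mps2, -8.0, 2.0);     // clamps, but cannot fix the sign
}

int main() {
    double request = hazardAvoidanceRequest(25.0, 15.0);  // 4.5 m/s^2 of braking needed
    double applied = longitudinalCommand(request);        // interpreted as +4.5 -> accelerate
    std::printf("requested braking: %.1f, applied accel: %.1f\n", request, applied);
    return 0;
}
```

Both modules meet their own specifications; the hazard only appears when they are wired together, which is exactly the kind of system-level gap the subsystem view misses.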
Second, failures happen. No matter how robust a system is, even if it is tested at 175°C and simulated under every condition known in this galaxy, it can still fail. It's not clear yet that everything is in place to fail as gracefully as the ISO 26262 guidelines indicate. Dirty or degraded sensors, impurities in materials, poor packaging choices and a long and complex supply chain create many points of potential failure. In a hands-off driving scenario, that can be lethal.
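One way to read the "fail gracefully" requirement is as a plausibility monitor that moves the vehicle toward a minimal-risk condition rather than continuing to trust a suspect sensor. The sketch below is a minimal illustration under assumed states and thresholds; ISO 26262 does not prescribe this code or these numbers.

```cpp
// Minimal sketch of graceful degradation: distrust an implausible sensor and
// step down to a safer operating state. Thresholds and states are assumptions.
#include <cmath>
#include <cstdio>

enum class DriveState { Nominal, Degraded, MinimalRisk };

struct SensorSample {
    double range_m;      // reported distance to nearest object
    double variance;     // self-reported measurement noise
    bool   heartbeatOk;  // did the sensor respond within its deadline?
};

DriveState evaluate(const SensorSample& s, DriveState current) {
    // Missing heartbeat or a physically implausible reading: stop trusting the sensor.
    if (!s.heartbeatOk || std::isnan(s.range_m) || s.range_m < 0.0)
        return DriveState::MinimalRisk;          // e.g. hand back control or pull over
    // Noisy but plausible: keep driving with wider margins and lower speed.
    if (s.variance > 1.5)
        return DriveState::Degraded;
    // Never silently recover from MinimalRisk without an explicit reset.
    return current == DriveState::MinimalRisk ? current : DriveState::Nominal;
}

int main() {
    DriveState st = DriveState::Nominal;
    st = evaluate({40.0, 0.2, true}, st);   // clean sample -> Nominal
    st = evaluate({40.0, 3.0, true}, st);   // dirty lens, high noise -> Degraded
    st = evaluate({-1.0, 0.2, true}, st);   // implausible range -> MinimalRisk
    std::printf("final state: %d\n", static_cast<int>(st));
    return 0;
}
```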
That leads to the third point of failure, which is human unpredictability. Not all cars on the road will be at the same level of autonomy, and not all of them will have the same features. Even cars with the same features may not be of the same quality or equally responsive. And if a driver is allowed to sit back, text, read or listen to music, their reaction time for taking over will be significantly longer. So what happens when a car runs a red light or skids on black ice? In theory, a Level 3/4 car should be able to avoid these problems. But now add in dirty sensors, poor communication with other vehicles or infrastructure, or any other imaginable obstacle, from a rockslide to a stray particle flipping a bit in a 7nm circuit or disrupting a critical signal.
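Two of the mitigations implied here can be sketched in a few lines: a hard deadline on the takeover request, and a per-bit majority vote across redundant copies of a critical command so a single flipped bit cannot change the decision. The function names and numbers below are assumptions for illustration, not a real safety budget.

```cpp
// Illustrative sketch under assumed numbers: (1) a takeover request with a hard
// deadline, and (2) a 2-out-of-3 vote so one flipped bit cannot flip the command.
#include <cstdint>
#include <cstdio>

// Per-bit majority vote across three independently stored copies of a command word.
uint32_t voteTMR(uint32_t a, uint32_t b, uint32_t c) {
    return (a & b) | (a & c) | (b & c);
}

// True if the human confirmed control within the deadline; otherwise the system
// must execute its own fallback (slow down, stop in lane, etc.).
bool takeoverWithinDeadline(double driver_response_s, double deadline_s) {
    return driver_response_s <= deadline_s;
}

int main() {
    const uint32_t BRAKE_CMD = 0x0000000F;
    uint32_t copyA = BRAKE_CMD;
    uint32_t copyB = BRAKE_CMD ^ 0x4;      // single-event upset flips one bit
    uint32_t copyC = BRAKE_CMD;
    std::printf("voted command: 0x%08X\n", voteTMR(copyA, copyB, copyC));

    // A distracted driver may need far longer than the budget assumes.
    bool ok = takeoverWithinDeadline(/*driver_response_s=*/7.0, /*deadline_s=*/4.0);
    std::printf("driver took over in time: %s\n", ok ? "yes" : "no");
    return 0;
}
```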
While most car accidents are the result of human error, simply handing off driving to a collection of electronic subsystems doesn't necessarily eliminate, or even reduce, the number or severity of accidents. Adding more safety features such as lane-departure warning and hazard avoidance will certainly help, but that intermediate step is a lot trickier than it appears. And despite the marketing literature, the logic in vehicles has a long way to go before it can truly comprehend that.
Let's be honest about the true driving force behind self-driving cars. It has little if anything to do with reducing accidents or increasing safety. The true driving force is pure profit for the auto industry. Ultimately, carmakers will lobby for governments to mandate that everyone has to buy self-driving cars. Pure profit, plus the profit from maintaining all of the buggy electronics.
In spite of the 100-year history of commercial aviation, 70+ years of artificial horizons and instrument flying, 40 years of computerized autopilots, and careful training and licensing of pilots, Boeing 737 MAXes still crashed because of failed sensors. And they think carmakers and drivers will do better than that in two years?
The true driving force behind any industry, company or business is profit.