Regulations Trail Autonomous Vehicles

Chipmakers will continue to benefit, but it will be a long road before autonomous vehicles dominate

Fragmented regulations and unrealistic expectations may be the biggest hurdles for chipmakers selling into the market for self-driving cars during the next few years.

Carmakers and the semiconductor industry have made tremendous progress building real-time vision systems and artificial intelligence into relatively traditional automobiles during the past decade or so. But federal and state regulators have been far less effective at creating new rules or updating existing ones to address vehicles with increasingly sophisticated driver-assist, vision, infotainment and automatic-response systems.

“Auto safety is not a box-check,” said Roger Lanctot, director of automotive connected mobility for Strategy Analytics. “The technical challenge involved is greater than the general public has been led to believe. But on the other hand, it is also fairly difficult to certify a vehicle as roadworthy without a reference point you can use to evaluate autonomous systems.”

China is probably the most aggressive country in promoting testing of autonomous vehicles on public roads, according to many industry experts. The European Union is also trying to create a consistent framework of rules, but it has not yet caught up to Germany, whose framework includes compromises such as allowing drivers to take their hands off the wheel while holding the carmaker responsible for accidents.

“In China, regulators can charge into a fully autonomous city and avoid a hybrid situation, where you have autonomous vehicles and those driven by people,” said Burkhard Huhnke, vice president of automotive strategy at Synopsys. “We’re already seeing real tests with autonomous vehicles. You’re also seeing this in a Grand Challenge type of environment, where those are 100% test cars. But the most difficult is the hybrid world, because the average time a car stays in the market is eight years. Within the next eight years you will see more and more of these autonomous vehicles coming.”

But how and where they actually roll out isn’t entirely clear. There are still a lot of issues to iron out, including agreeing on a variety of rules across the carmakers themselves. “If your maximum speed is 10% lower than a competitor’s car, or if your vehicles accelerate full throttle versus smoothly, there will be problems,” Huhnke said. “And right now there are no answers to these problems.”

So while there are major advances in autonomous vehicle technology, the initial applications may be much more limited. “The initial applications will likely be on highways and in geo-fenced areas,” said Ty Garibay, CTO at Arteris IP. “There also might be a role for this in places like retirement facilities, military bases, resorts and country clubs. In dense city centers, you may need infrastructure changes.”

This is particularly true in the United States, where regulators have been slower to adopt rules, partly due to concerns about driver and passenger safety and partly due to the structure of the regulatory agencies involved.

Fragmented sense of order
U.S. auto regulation is split between the states, which are responsible for defining who can drive and what driving behaviors are acceptable, and the U.S. Dept. of Transportation (DoT), which sets and monitors the safe design of cars and roadways.

Self-driving cars muddy the waters by combining “car” and “driver” into a single entity to be regulated. And the technology has rolled out so quickly that regulators were unprepared.

The U.S. Dept. of Transportation’s National Highway Traffic Safety Administration (NHTSA) has been working for years on a vehicle-to-vehicle communications system designed to reduce fatalities by allowing cars to warn each other on approach, and to negotiate passage to avoid collisions. Automakers and other groups within the DoT started suggesting several years ago that new regulations might be in order, but NHTSA officials had trouble walking away from the work they’d already done.

“They’ve been working for years on vehicle-to-vehicle (V2V) communication, which hasn’t happened and is being replaced by autonomous driving to a large extent,” Lanctot said. “They should be focused on automation, but the DoT is a big government agency. It can’t pivot. It already has research money and resources dedicated to V2V, so it has a hard time focusing on self driving.”
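To make the V2V concept concrete, here is a minimal sketch of the broadcast-and-react logic such systems envision: each car periodically shares its position and velocity, and a receiver estimates whether two paths will converge soon enough to warrant a warning. The message fields, thresholds and function names below are illustrative assumptions, not the actual SAE J2735/DSRC message set used in real deployments.

```python
# Illustrative sketch only: the message fields and thresholds below are
# assumptions for demonstration, not the actual SAE J2735 / DSRC format
# used in real V2V deployments.
from dataclasses import dataclass
import math

@dataclass
class BasicSafetyMessage:
    vehicle_id: str
    x_m: float        # position east of a shared reference point, meters
    y_m: float        # position north of a shared reference point, meters
    vx_mps: float     # velocity east, meters/second
    vy_mps: float     # velocity north, meters/second

def time_to_closest_approach(a: BasicSafetyMessage, b: BasicSafetyMessage) -> float:
    """Seconds until the two vehicles are nearest each other on current courses."""
    rx, ry = b.x_m - a.x_m, b.y_m - a.y_m              # relative position
    vx, vy = b.vx_mps - a.vx_mps, b.vy_mps - a.vy_mps  # relative velocity
    speed_sq = vx * vx + vy * vy
    if speed_sq < 1e-9:                                # effectively no relative motion
        return float("inf")
    return max(0.0, -(rx * vx + ry * vy) / speed_sq)

def should_warn(a: BasicSafetyMessage, b: BasicSafetyMessage,
                warn_seconds: float = 4.0, warn_radius_m: float = 3.0) -> bool:
    """Warn if the vehicles will pass within warn_radius_m in under warn_seconds."""
    t = time_to_closest_approach(a, b)
    if t > warn_seconds:
        return False
    ax, ay = a.x_m + a.vx_mps * t, a.y_m + a.vy_mps * t
    bx, by = b.x_m + b.vx_mps * t, b.y_m + b.vy_mps * t
    return math.hypot(bx - ax, by - ay) < warn_radius_m

# Two cars approaching the same intersection from perpendicular directions.
car_a = BasicSafetyMessage("A", x_m=-40.0, y_m=0.0, vx_mps=15.0, vy_mps=0.0)
car_b = BasicSafetyMessage("B", x_m=0.0, y_m=-42.0, vx_mps=0.0, vy_mps=15.0)
print(should_warn(car_a, car_b))   # True: both reach the intersection in roughly 2.7 s
```

The hard part in practice is not this geometry but getting every vehicle to broadcast compatible, trustworthy messages at low latency, which is exactly the standardization work NHTSA has struggled to finish.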

Instead, the agency issued an advisory policy in September 2016, entitled Automated Driving Systems. The stated goal was to reduce the 34,092 automotive-related fatalities recorded in 2015 by addressing the 94% of accidents it blamed on human error. Rather than law or enforceable policy, however, the paper represented “an initial step to further guide the safe testing and deployment of [highly automated vehicles].”

So far, 22 states and the District of Columbia have passed laws governing some aspect of self-driving cars, and another 10 state governors have issued executive orders addressing the issue, according to a Brookings Institution analysis.

NHTSA updated that original guidance in September 2017, scaling back its discussion of privacy, security and V2V, and explicitly promising to avoid regulation that was restrictive or that could limit the growth of the market or technical advancement in any way.

It did encourage states to consider legislation and policies governing autonomous cars, and to work together to make those rules consistent but not restrictive, using a National Conference of State Legislatures database of traffic laws to identify patterns. That may leave the U.S. without a single, consistent source of testing and certification for autonomous cars, but it hasn’t dimmed the enthusiasm of its fans.

“When you put a Tesla in autopilot mode, the computer is driving the car and it’s very accurate—just phenomenal,” said Alok Sanghavi, marketing manager at Achronix. “That will get better, especially with sensor fusion. But we’re still not at [SAE] Level 3 in terms of autonomous driving, so we have a ways to go, and you can decide whether you’d still like to have your hands on the steering wheel while you’re on the highway. It would be like trusting any other technology when it’s new, but trusting it with your life.”
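Sensor fusion itself is conceptually simple, even if production systems are not. The sketch below shows the scalar core of the idea, combining two noisy range estimates weighted by how much each sensor is trusted; the sensor noise figures are illustrative assumptions, not real device specifications.

```python
# Minimal sketch of variance-weighted sensor fusion: combine a radar range
# and a camera-derived range, trusting each in proportion to how noisy it is.
# The noise figures are illustrative assumptions, not real sensor specs.

def fuse(measurement_a: float, var_a: float,
         measurement_b: float, var_b: float) -> tuple[float, float]:
    """Return the fused estimate and its variance (inverse-variance weighting)."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * measurement_a + w_b * measurement_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Radar says the car ahead is 48.0 m away (variance 0.25 m^2);
# the camera says 50.0 m (variance 1.0 m^2). The fused estimate lands
# closer to the radar and is more certain than either sensor alone.
estimate, variance = fuse(48.0, 0.25, 50.0, 1.0)
print(round(estimate, 2), round(variance, 2))   # 48.4 0.2
```

This inverse-variance weighting is the one-dimensional kernel of what a Kalman-filter update does across many sensors and states in a real perception stack.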

Driver-assist systems, emergency braking and other safety features are showing up in production vehicles faster than fully autonomous driving functions, which is appropriate given both their acceptance in the market and the maturity of the technology, according to Steve Woo, vice president of systems and solutions and distinguished inventor at Rambus.

“With machine learning and autonomous driving, we’re just scratching the surface, trying to classify objects and make decisions,” Woo said. “That’s important for autonomous driving, but it’s still very early days to make a lot of definitive decisions.”

Still, there are plenty of common expectations that should apply even to very immature technologies, such as the Internet of Things or automated/connected vehicles.

“Whether it’s a sensor or an autonomous car, if it fails you want it to fail safely,” said Vic Kulkarni, vice president and chief strategist of ANSYS’ semiconductor business unit. “That’s a lot more critical in the automotive space, where you’re also operating in real-time, but you also need standards for configuration because you need all the chips to be properly managed through the system.”
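The “fail safely” idea can be illustrated with a toy watchdog pattern: if a sensor stops reporting within its deadline, the controller falls back to a conservative action rather than acting on stale data. The class, deadline and action strings below are hypothetical and intended only to show the shape of the pattern, not production automotive code.

```python
import time

# Toy illustration of a fail-safe pattern, not production automotive code:
# if the sensor misses its reporting deadline, fall back to a conservative
# action instead of acting on stale data. The deadline is an assumption.
SENSOR_DEADLINE_S = 0.1   # assumed 100 ms reporting deadline

class RangeSensorMonitor:
    def __init__(self):
        self.last_reading = None
        self.last_update = None

    def report(self, range_m: float) -> None:
        self.last_reading = range_m
        self.last_update = time.monotonic()

    def read(self):
        """Return the latest range, or None if the sensor has gone stale."""
        if self.last_update is None:
            return None
        if time.monotonic() - self.last_update > SENSOR_DEADLINE_S:
            return None          # stale data: treat the sensor as failed
        return self.last_reading

def choose_action(monitor: RangeSensorMonitor) -> str:
    reading = monitor.read()
    if reading is None:
        return "fail_safe: alert driver and decelerate gently"
    if reading < 10.0:
        return "brake"
    return "maintain speed"
```

Real safety standards such as ISO 26262 formalize this kind of behavior at the system level, but the principle is the same: a missing or implausible input must map to a defined safe state.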

No regulations, few standards
A lack of safety standards or implementation guidelines may not have contributed directly, but there were a few surprises in a report about the Uber Technologies test vehicle that struck and killed a woman in Tempe, Ariz., on March 18. The vehicle’s vision systems detected the woman 6 seconds before impact, but the system did not determine that an emergency braking maneuver was required until 1.3 seconds before striking her, according to a preliminary report on the accident published by the National Transportation Safety Board (NTSB) on May 24.

The 2017 Volvo XC90 had a factory-installed automatic braking system on board, but the test vehicle wasn’t able to follow through on its decision to brake because automatic emergency braking was disabled by Uber’s software to reduce the potential for erratic driving. The car was equipped with radar, LiDAR, navigation sensors and a total of 10 cameras, including the system’s forward- and side-facing cameras, plus aftermarket cameras providing views through the windshield and rear window, and one pointed in at the passenger cabin. That last camera showed the operator looking down and toward the center of the vehicle several times before the crash.

The operator told the NTSB she was monitoring the performance of the self-driving system, which was designed to have a driver intervene in an emergency. In this case, the driver turned the wheel less than a second before impact, and hit the brakes less than a second afterward, according to the NTSB.
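A back-of-the-envelope calculation shows how little margin a 1.3-second window leaves. The speed and deceleration figures below are assumptions chosen for illustration, not values taken from the NTSB report.

```python
# Back-of-the-envelope check on the 1.3-second window cited by the NTSB.
# The speed (40 mph) and deceleration (7 m/s^2, roughly a hard stop on dry
# pavement) are illustrative assumptions; the article does not state them.
MPH_TO_MPS = 0.44704

speed_mps = 40 * MPH_TO_MPS          # ~17.9 m/s
window_s = 1.3
decel_mps2 = 7.0

distance_covered = speed_mps * window_s                  # if nothing brakes at all
stopping_distance = speed_mps ** 2 / (2 * decel_mps2)    # from the moment braking starts

print(f"Distance covered in 1.3 s with no braking: {distance_covered:.1f} m")       # ~23.2 m
print(f"Full stopping distance from {speed_mps:.1f} m/s: {stopping_distance:.1f} m") # ~22.8 m
```

Under those assumptions, the car needs nearly the entire remaining distance to stop even if hard braking begins the instant the decision is made, leaving essentially no margin for actuation delay or a human handoff, which is why earlier detection and earlier braking decisions matter so much.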


Fig. 1: (Left) Location of the Uber crash, showing the paths of the pedestrian in orange and the Uber test vehicle in green. (Right) Post-crash view of the Uber test vehicle, showing damage to the front right side. Source: NTSB

Only one in five American drivers said they would trust a self-driving vehicle, while 73% said they would be afraid to ride in one, according to a AAA phone survey of 1,014 adults released May 22. A similar survey taken in late 2017 found 63% of American drivers were afraid of self-driving cars.

NHTSA has not created firm standards or regulations to try to prevent such incidents, but it has conducted defect investigations and ordered some driver-assistance products recalled. It is far from systematic or comprehensive in that effort, however, and “does not have a comprehensive plan that sets clear goals, that establishes when and how it will act, or that indicates how it will monitor progress,” according to a November 2017 report from the Government Accountability Office (GAO).

Autonomous vehicles do pose complications that could require changes to local roads or laws, however, the GAO found. Autonomous systems tend to create traffic hazards by obeying posted speed limits rather than matching the speed of surrounding traffic, for example, and automated driving systems have trouble identifying the intentions of pedestrians at crosswalks.

The DoT responded to the GAO report by agreeing to develop a framework of requirements or regulations to address automated vehicles, the first iteration of which will become public in 2019.

There have been a number of bills in Congress to address the lack of specific NHTSA regulations, most recently the Safely Ensuring Lives Future Deployment and Research in Vehicle Evolution Act (SELF DRIVE Act). The bill was passed by the House in September 2017 over the objections of the National Governors Association, which complained the federal act would encroach on state authority. The Senate was working on its own bill in December, called the AV START Act, which stalled over the objections of members of the Senate Commerce Committee who were concerned about allowing cars on the road with no human behind the wheel.


Fig. 2: Various levels of autonomy in vehicles. Source: SAE International

—Ed Sperling contributed to this report.


