Progress And Chaos On Road To Autonomy

Development of autonomous vehicle technology is happening everywhere, but not as quickly as the hype would suggest.

Progress in the development of fully autonomous vehicles is incremental and slow, but not for lack of effort.

Research and development in self-driving cars is under way all around the globe, from the biggest automotive manufacturers and their Tier 1 suppliers to companies not traditionally involved in the automotive industry. Add to that fleets of startups working on sensor technologies and researchers in academia, and it resembles an earth-bound space race extending over multiple technological fronts and use cases.

Three distinct types of companies are competing in ADAS and automated driving, says James Hines, director of automotive research at TechInsights. First are the traditional automotive manufacturers, the OEMs and the Tier 1 suppliers. Next are what he calls emerging smart mobility companies, such as Uber, Lyft, and Waymo. And then there are the technology providers, the semiconductor companies, including Nvidia, Intel/Mobileye, and NXP Semiconductors.

Which companies will win the checkered flag in the race to develop autonomous vehicles? It is likely to be a team of contenders, each providing an essential element of the technology. One thing we know for sure: the finish line is “at least 10 years down the road,” says Hines. “The gestation period in automotive is very long.”

The effort is also incredibly complicated, frequently secretive, highly competitive, and geographically dispersed, none of which helps solve some of the most glaring problems in assisted and autonomous vehicles.

“Right now the recall rate is about 5% on electrical issues,” says Burkhard Huhnke, vice president of automotive at Synopsys. “This is unacceptable. We have to increase the robustness of these systems, and we have to begin with the smallest part, which is the semiconductor.”

This is easier said than done, of course, and not just for technological reasons.

“Safety islands already exist on an SoC, but if they’re not integrated well there is no availability to other systems,” Huhnke says. “We also have a problem in software. The OEM receives soft code from the Tier 1s, but they’re not aware that it’s from 50 different sources. A lot of that is open-source code. How do you ensure quality for that? That’s a huge problem because you cannot fix it early enough.”

Supply chains
The crowd of disparate participants infuses uncertainty throughout the supply chain. “The challenge is that you’re designing an autonomous vehicle. You don’t have the carmaker going to the Tier 1s, going to the semiconductor house, and then to the supply chain,” says Andrew Macleod, director of automotive marketing at Mentor, a Siemens Business. “All of a sudden, you’ve got startups and universities all over the U.S. and all over the world, with very specialized bits of IP, [and] you’ve got large companies in other segments that are moving into automotive.”

That makes managing a supply chain extremely difficult. “How do you develop a hugely complex autonomous vehicle, looking at all the IC simulation — we’re talking about simulating at the vehicle level — and share all that data through the supply chain? That’s a huge problem,” says Macleod. “If you want to design a safe, reliable autonomous vehicle, it can’t be done in isolation. You can’t just go design the powertrain, the electric powertrain, go design some sensors, and go design something else, and then expect the carmaker to integrate it.”

Sharing all the data across the supply chain, “that’s a technology trend driven by a market trend,” he adds. “These are the sorts of challenges.” Once they can be worked out, “everyone in the supply chain can make money.”

Macleod has been talking to customers over the past six months about what they need for self-driving car technology. The questions they most often ask are “How do we develop these autonomous vehicles?” and “How do we close that chasm from R&D to actually turning this into a valuable business?”

He points to simulation, from IC design through developing a sensor fusion box, as key to advancing connected-car technology. Vehicle-level simulation also can be enabled with edge cases and use cases, Macleod notes. “We can create all kinds of scenarios,” he says. Simulation technology can help the industry “move from R&D to mass production.”

One thing to remember when designing these systems is the longevity of the final product versus that of the components used in the test vehicle. For instance, graphics processing units can be used in test vehicles, yet those GPU chips typically won’t last for the 10 to 15 years needed for the self-driving cars of the future, according to Macleod. “If it’s under the hood, it’s got high temperature, and electrical noise, and heavy vibration,” he says. Simulation can help with the system integration needed for autonomous vehicles, which could include putting LiDAR sensors in the headlights.

All of this can help shave dollars off the final price, which is a key issue for autonomous vehicles. “If you look at the aviation industry, redundancy is the key there,” says Synopsys’ Huhnke. “But adding redundancy can be quite expensive, and if the average cost of a car is $15,000, you can’t afford that. You’ve already got $3,000 worth of software in there, and electronics are becoming a huge amount of the car. It’s not just the hardware in a supercomputer. You need reliable software, as well.”

How all of that gets put together to create an affordable, reliable system isn’t clear yet. The basic architectural design philosophy of autonomous cars is still up in the air. “The whole autonomous architecture is being considered as either a centralized system, with a single brain doing the sense and perception processing, or a de-centralized system with more intelligence at the sensors (edge),” says Robert Day, Arm’s director of automotive solutions and platforms. “As more autonomous driving becomes a reality and our cars begin to make more decisions, the amount of compute power required in these vehicles is growing at a rapid pace.”

Compute power has to handle the multiple sensors used around the car to get the most accurate picture of “what the car is ‘seeing’ (LiDAR, radar, cameras, ultrasonic), and then the information is fused together and processed using machine learning and neural networking in order to make the decision as to whether to take action,” says Day. “The processors carrying out this level of compute need to provide the right balance between power and performance for these big tasks, and the energy efficiency and size required to meet thermal and size constraints within the vehicle.”

“Finally, with the large amounts of software required for Level 3 autonomy (hundreds of millions of lines of code), common frameworks and platforms are being developed using a combination of open-source and commercial/proprietary software,” Day adds.
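To make the sense-fuse-decide loop Day describes concrete, here is a minimal, deliberately simplified sketch. It is not any vendor’s actual pipeline: the sensor names, confidence weights, and braking threshold are illustrative assumptions, and a simple weighted average stands in for the machine-learning models used in practice.

```python
# Illustrative sketch only -- not any vendor's production pipeline. Sensor
# names, confidence weights, and the braking threshold are assumptions.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "lidar", "radar", "camera", or "ultrasonic"
    distance_m: float  # estimated distance to the nearest object ahead
    confidence: float  # the sensor's own confidence in the detection, 0..1

def fuse(detections):
    """Fuse per-sensor distance estimates into one confidence-weighted value."""
    total_weight = sum(d.confidence for d in detections)
    if total_weight == 0:
        return None
    return sum(d.distance_m * d.confidence for d in detections) / total_weight

def decide(fused_distance_m, speed_mps, reaction_margin_s=2.0):
    """Brake if the fused object distance falls inside the assumed safety margin."""
    if fused_distance_m is None:
        return "no_action"
    return "brake" if fused_distance_m < speed_mps * reaction_margin_s else "no_action"

# Example frame: three sensors report an object ahead at slightly different ranges.
frame = [
    Detection("lidar", 24.0, 0.9),
    Detection("radar", 26.5, 0.8),
    Detection("camera", 23.0, 0.6),
]
print(decide(fuse(frame), speed_mps=15.0))  # prints "brake" at this speed and range
```

The point of the sketch is the structure, not the math: independent sensor estimates are combined into a single picture of the road before a control decision is made, which is where the power, thermal, and size constraints Day mentions come into play.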

Just keeping track of all of those pieces is incredibly complex. “Each company has their own notion of how they list their IP, whether it’s by foundry, by category or by process node,” says Ranjit Adhikary, vice president of marketing at ClioSoft. “That’s already hard enough in other markets, but in automotive it’s become much more complex. It now involves what techniques you use, because if parts have to last 10 or 15 years, then you have to run under more stringent PVT conditions. It has to survive in Alaska and in the desert. Most of these parts aren’t warranted for more than three or four years.”

Adhikary adds that it’s no longer a matter of choosing IP or parts off a list. It’s now a matrix that is in almost constant flux. “It’s definitely not intuitive anymore,” he says.

Semiconductors
Semiconductor vendors are highly active in kitting out autonomous vehicles with a variety of microchips, many of them system-on-a-chip devices incorporating artificial intelligence and machine learning technology.

The automotive environment puts special stress on semiconductors. “The move from prototype (server in the trunk) to mass production is changing the architectural requirements of processors in the vehicle to be more sensitive to three key areas: power, thermal, and size constraints,” says Arm’s Day.

Arm is partnering with other companies on automated driving technology and is currently very involved with many companies “designing Level 3 and above autonomous systems, from automakers such as Audi and Toyota to Tier 1 suppliers like Denso,” Day notes. “Of course, a number of our semiconductor ecosystem partners are producing ADAS/autonomous SoCs, such as Renesas, NXP, and Nvidia.”

What’s important to note here is just how many pieces there are, and how many companies are working on their own sets of challenges. Marvell, for example, is focused on the communication within and outside the car.

“Customers expect the same experience that they have at home or at work to be in the car, as well, with regards to security, reliability, and speed,” says Avinash Ghirnikar, director of technical marketing for Marvell’s Connectivity Business Group. “Their latest and greatest smartphone, which has 11ax, works great at home, it works great in the office, but when they bring it into the car it sort of slows down to a crawl. Why does that happen? This is one of the reasons that 11ax will play a critical part. Delivering this kind of a seamless WiFi experience is really becoming challenging in an automotive environment. There are many cars on the road, and there are things that you have to deal with in the automotive cockpit.”

That’s only part of the picture, too. “Another thing that has prevented a media experience in the car comparable to the one at home is that 5GHz usage has not been uniform, because of all of these restrictions that are there worldwide,” says Ghirnikar. “What we have done specifically to address this issue, given we have two 5GHz radios in our device, is dedicate one of them to radar detection. So as the device is operating, it always has a list of frequencies available where there is no radar. If you’re on a certain 5GHz frequency and you detect a radar, you can instantly switch to one of these non-radar frequencies and, through band steering, tell all of your clients to move to that frequency, and you’re good to go.”
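The mechanism Ghirnikar describes is essentially dynamic frequency selection combined with client band steering. A minimal sketch of that control logic follows; it is a schematic illustration, not Marvell’s firmware or API, and the class, method names, and channel numbers are assumptions.

```python
# Schematic sketch of the radar-avoidance behavior described above -- not
# Marvell's actual firmware. Class, method names, and channels are illustrative.
class DfsChannelManager:
    def __init__(self, channels):
        # Channels the dedicated monitoring radio has verified as radar-free.
        self.radar_free = set(channels)
        self.active = None

    def select_channel(self):
        """Pick any channel currently known to be radar-free."""
        self.active = next(iter(self.radar_free))
        return self.active

    def on_radar_detected(self, channel):
        """Drop the affected channel; if it was in use, move the AP and its clients."""
        self.radar_free.discard(channel)
        if channel == self.active and self.radar_free:
            new_channel = self.select_channel()
            self.steer_clients(new_channel)

    def steer_clients(self, channel):
        # A real access point would use 802.11v BSS Transition Management (or
        # similar band-steering techniques) so associated clients follow the move.
        print(f"Steering clients to 5GHz channel {channel}")

mgr = DfsChannelManager(channels=[36, 40, 100, 104])
mgr.select_channel()
mgr.on_radar_detected(mgr.active)  # radar seen on the active channel -> switch and steer
```

The design point is that the second radio keeps the list of clear channels warm, so the access point can switch immediately instead of waiting out the channel-availability check that DFS normally imposes.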

Once again, this is a multi-technology, multi-company, and even multi-country handoff. And it’s one that is potentially highly disruptive across multiple markets.

“The implications of this change are incredible,” says K. Charles Janac, president and CEO of ArterisIP. “You can imagine a world where there are no stoplights and cars communicate with each other, cities are eco-fenced, and you can only take self-driving taxis. If you’re a legacy car company, your dealers have gotten all their money from maintenance, your assembly people are skilled in mechanical assembly, and your purchasing people have long-term relationships that now have to be fractured.”


Fig. 1: Audi’s Aicon concept car. Source: Audi

LiDAR
Alongside the disruption in markets and supply chains, new technologies are being developed in parallel with existing ones. LiDAR is a case in point. The radar-like technology, which is based on light waves, may be just as important as advanced computer vision and automotive radar in realizing the concept of automated driving.

Proof of its value can be seen in the design wins and acquisitions LiDAR startups are racking up. Innoviz has picked up BMW as a customer for its InnovizOne sensor. The Israeli startup has been working on MEMS-based solid-state LiDAR technology for two years. Its InnovizPro sensor is a stand-alone unit that can be added to existing vehicles. ON Semiconductor acquired Cork, Ireland-based SensL Technologies, a developer of LiDAR sensors, silicon photomultipliers, and single-photon avalanche diodes.

Many startups are focusing on developing software and LiDAR sensors, incorporating AI, deep learning, and convolutional neural networks, according to Egil Juliussen, director of research and principal analyst for automotive technology at IHS Markit in the U.S. “The software really is the hang-up,” he says.

The area of LiDAR sensing is exploding right now, says TechInsights’ Hines. “There’s a lot of startup activity in that space, a few established players that have viable working products today, and a whole bunch of others developing capabilities in the LiDAR area.”

More than 50 LiDAR startups are active in the industry, and it’s obvious that most of them will not survive the coming competition to provide automotive manufacturers with the LiDAR sensors for autonomous vehicles. Some LiDAR vendors are diversifying into applications other than automotive, such as drones, mapping, and robotics for factory automation.

Yole Développement sees the automotive LiDAR market increasing from $726 million last year to $5 billion in 2023, for a compound annual growth rate of 43%. The market research firm notes that more than $800 million has been invested in LiDAR companies in the past two years.

“There’s room for four or five, maybe six in the U.S. market. There’ll be two or three in China, maybe a couple in Japan, and maybe a couple in Europe,” Juliussen says. Yet, these are also still the “early days” of automotive LiDAR, asserts Hines, with “a handful of real products.”

Use cases
Industry analysts see self-driving cars for ride-hailing applications and autonomous semi-trailer trucks as the vanguard of autonomous driving, likely to be among the first vehicles to achieve nearly full or complete autonomy. The trucking industry has a chronic shortage of long-haul drivers, which autonomous semis may help alleviate.

Self-driving semis are attracting entries from Daimler Trucks, Tesla, and Sweden’s Einride. TuSimple, a Chinese-American startup, last year took in a $55 million round of private funding, bringing its total funding to $83.1 million. Nvidia is an investor in and a supplier to TuSimple, which is testing three Peterbilt trucks on highways in Arizona and has logged more than 15,000 Level 4 autonomy testing miles.

The next-generation mobility industry might represent an economic opportunity worth $5 trillion to $7 trillion, according to some estimates. Millennial adults will easily adapt to the new economic models offered by autonomous technology, since they are frequent users of today’s ride-hailing services, Mentor’s Macleod asserts. The Boston Consulting Group estimates about 35 million drivers around the world will use car-sharing services by 2021.

Aptiv and Lyft have signed a multiyear agreement to offer a driverless ride-hailing fleet of 30 vehicles in Las Vegas, Nevada. Drive.ai is gathering a fleet of autonomous vehicles in the Dallas-Fort Worth metroplex, ferrying passengers in the community of Frisco, Texas. The Mountain View, Calif.-based startup has opened an office in Frisco to supervise a six-month trial program.

Autonomous technology won’t be confined to land-based vehicles. Buffalo Automation, a spinoff from the University at Buffalo, is commercializing its AutoMate system for self-driving boats, with the help of $900,000 in seed funding from private investors. The startup is aiming at cargo ships and recreational boats, using artificial intelligence technology to coordinate its cameras and sensors.

Unmanned aerial vehicles that operate autonomously are already with us. Skydio is selling the R1 Frontier Edition drone, introduced in February, to the public for $2,499. The Redwood City, Calif.-based startup offers a flying machine for shooting videos, based on Nvidia’s Jetson AI supercomputer platform, with 13 cameras.

Other startups are working in the autonomous drone field, as is the University of Michigan, and the Defense Advanced Research Projects Agency (DARPA) is soliciting ideas for its Offensive Swarm-Enabled Tactics program.

San Francisco-based Marble, a developer of food delivery robotics, raised $10 million in Series A funding, bringing its total private funding to $15 million. The startup is looking to enable e-commerce companies to deliver their products without involving FedEx, UPS, or the U.S. Postal Service. Its delivery systems have an on-board LiDAR sensor.

IHS Markit’s Juliussen notes that 53 companies have permits from California’s Department of Motor Vehicles to conduct testing of self-driving cars, and some of their data is publicly available through the DMV. “Waymo is far ahead of everyone else,” he comments.

The Waymo subsidiary of Alphabet Inc. has reported that it goes 5,600 miles between “disengagements” – that is, the times when the engineer in the vehicle has to take over steering or otherwise assume control. Other self-driving test vehicles have much shorter distances between their disengagements on California highways, roads, and streets, the IHS Markit analyst notes.
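The metric itself is straightforward arithmetic: total autonomous miles driven in the reporting period divided by the number of reported disengagements. A minimal illustration, using round figures of the order Waymo reported to the California DMV for 2017 (the exact values are in the public reports):

```python
# Illustrative only; round figures of the order in Waymo's 2017 California DMV
# disengagement report, not exact values.
autonomous_miles = 352_000
disengagements = 63
print(autonomous_miles / disengagements)  # roughly 5,600 miles per disengagement
```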

Waymo is “probably in the lead,” he says, given its number of miles devoted to testing. While the DMV’s disengagement data is of interest, “that doesn’t tell the whole story,” TechInsights’ Hines notes.

Waymo will start testing driverless ride-hailing in Phoenix later this year, he adds, covering 100 square miles of the metropolitan area. “This clearly is the way the industry is moving,” Juliussen says.

Waymo, General Motors, and other companies involved in self-driving technology will do test driving in the southern U.S., in Atlanta and other locales, because of the more favorable weather conditions. Uber and Lyft are active in developing the software for autonomous vehicles.

“This model then will probably shift to China, probably 2020s timeframe, maybe later, because they’re not quite as far, and it’s much tougher to drive in most of China,” Juliussen says. “It’s going to be much slower in Europe. Assuming it’s successful in the U.S., they’ll probably do similar things there.”

For now, driverless vehicle testing mostly involves startups running tests on fixed routes. “They’ll have a specific route that the cars can try to do. You’ve seen the testing in a campus type of environment, University of Michigan, and they say they’re going to do some more,” says Juliussen. He says some airports are talking about fixed routes, as well. “When you come from a mass-transit-centric area, this is more the way you’ve got to go, particularly since the taxi unions have more power.”

“You’re not going to see as much driverless ride-hailing in Europe, at least not initially,” says Juliussen. “They’ve got a more restrictive legal area. The U.S. is lucky, because some states are able to do that. Most other countries have national laws, so you can’t just do that in part of the country.”

IHS Markit forecasts 51,000 autonomous vehicles will be sold in 2021, increasing to almost 1 million units in 2025 and to more than 33 million AVs in 2040.

“China will surpass the U.S., fairly early,” Juliussen predicts.

Arm’s Day sees things happening now. “Great strides are being made toward the next levels of autonomy,” says Day. “Level 3 autonomous driving is now reaching high-end production vehicles, such as the 2019 Audi A8. We’re increasingly seeing companies move from prototype to production vehicles, and … Level 5 robotaxis are on the horizon.”

Conclusion
The connected-car ecosystem is growing and thriving. Automated driving technology is on the rise along a number of fronts. There will be business failures and forced acquisitions along the way, confusion about what pieces go together and how easily, how to integrate the different parts, and what technology can be shared. There is even debate about which companies ultimately will rule this market.

Still, the vision for autonomous vehicles is becoming clearer day by day, even if the path forward is littered with disparate parts, evolving standards, and a long list of questions about what is considered good enough to roll out commercially.

—Ed Sperling and Ann Steffora Mutschler contributed to this report.


