How Many Senses Do You Need To Drive A Car?

Automotive computing, sensing, and data transport requirements are growing enormously.


The recent AutoSens conference in Detroit left me questioning whether I should turn in my driver’s license. Across all the discussions about the relative advantages of RGB cameras, ultrasound, radar, lidar, and thermal sensors, the attending OEMs gave a unanimous answer: “We probably need all of them in some form of combination” to make autonomy a reality in automotive. Together, these sensors are much better than my eyes and ears. Progress in Advanced Driver Assistance Systems (ADAS) and autonomous driving is speedy, but taken as a whole, we are not there yet. So I am keeping my license for some more time.

I traveled to Auto City to speak on a panel organized by Ann Mutschler that focused on the design chain aspects, together with Siemens EDA and GlobalFoundries. Ann’s write-up “Automotive Relationships Shifting With Chiplets” summarizes the panel well. The conference was a great experience as the networking allowed talking to the whole design chain from OEMs through Tier 1 system suppliers, Tier 2 semis and software developers, to Tier 3s like us in semiconductor IP. Given that the panel had a foundry, an IP vendor, and an EDA vendor, we quickly focused our discussions on chiplets.

Concerning the sensing aspects, Owl.Ai’s CEO and co-founder, Chuck Gershman, gave an excellent presentation summarizing the problem the industry is trying to solve: 700K annual pedestrian fatalities worldwide, a 59% increase in pedestrian deaths in the US over the last decade, and 76% of those fatalities occurring at night. Government regulations for pedestrian nighttime safety are coming worldwide. Owl.Ai and FLIR showcased thermal camera-related technologies, motivated by the fact that only 1 out of 23 vehicles passed all tests in a nighttime IIHS evaluation using cameras and radar, and that RGB image sensors cannot see in complete darkness (just like me, I should say, but I am still keeping my driver’s license).

Source: Owl.Ai, AutoSens 2023, Detroit

Chuck nicely introduced the four critical phases of the sensing chain: “detection” – is something there? – “classification” – is it a person, a car, or a deer? – “range estimation” – how far away is the object in meters? – and “acting” – warning the driver or intervening automatically. I liked Owl.Ai’s slide above, which maps the different sensing methods to their use cases and flaws. And in the discussions I had during the conference, the OEMs agreed that multiple sensors are needed.
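To make those four phases concrete, here is a minimal, purely illustrative Python sketch of such a perception-and-decision loop. All function names, thresholds, and the toy “frame” representation are my own assumptions for illustration, not Owl.Ai’s actual implementation:

```python
def detect(frame):
    # 1. Detection: is something there? (toy stub: keep bright regions)
    return [region for region in frame if region["intensity"] > 0.5]

def classify(obj):
    # 2. Classification: person, car, or deer? (toy lookup)
    return obj.get("kind", "unknown")

def estimate_range(obj):
    # 3. Range estimation: distance in meters (toy: inverse apparent size)
    return 100.0 / obj["size"]

def act(label, range_m, brake_threshold_m=30.0):
    # 4. Acting: warn the driver or intervene automatically
    if label == "pedestrian" and range_m < brake_threshold_m:
        return "brake"
    return "warn" if range_m < 2 * brake_threshold_m else "ignore"

def pipeline(frame):
    # Run every detected object through the remaining three phases.
    decisions = []
    for obj in detect(frame):
        label = classify(obj)
        range_m = estimate_range(obj)
        decisions.append((label, range_m, act(label, range_m)))
    return decisions
```

In a real system each stage would fuse multiple sensor modalities – exactly the point of the slide – but the control flow of detect, classify, estimate, act stays recognizable.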

Regarding the transition from L3 driving to L4 robotaxis, Rivian’s Abdullah Zaidi showed the slide below, outlining the different needs for cameras, radars, and lidars, as well as the compute requirements.

Source: Rivian, AutoSens 2023, Detroit

No wonder automotive is such an attractive space for semiconductors. Computing, sensing, and data transport requirements are just growing enormously. And mind you that the picture above does not mention other cameras for in-cabin monitoring.

Besides the computing requirements, data transport – core to my day-to-day work – is growing just as fast. In one of his slides, Mercedes-Benz AG’s Konstantin Fichtner presented that the DrivePilot system records 33.73 GB of trigger measurements per minute – 281 times the data consumed by watching a Netflix 4K stream for the same duration. That’s a lot of data to transport across networks-on-chips (NoCs), between chips, and between chiplets. And it, of course, raises the question of on-board vs. off-board processing.
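The comparison is easy to sanity-check with back-of-the-envelope arithmetic. Assuming a Netflix 4K (Ultra HD) stream consumes roughly 7.2 GB per hour – Netflix’s own guidance quotes up to about 7 GB per hour, so this figure is my approximation – the numbers line up:

```python
# Back-of-the-envelope check of the DrivePilot data-rate comparison.
# Assumption: a Netflix 4K (Ultra HD) stream uses roughly 7.2 GB per hour,
# i.e. 0.12 GB per minute.
drivepilot_gb_per_min = 33.73
netflix_gb_per_min = 7.2 / 60          # ~0.12 GB/min

ratio = drivepilot_gb_per_min / netflix_gb_per_min
print(round(ratio))                     # ~281x a 4K stream

# Equivalent sustained bandwidth the NoCs and die-to-die links must carry:
gbit_per_s = drivepilot_gb_per_min * 8 / 60
print(round(gbit_per_s, 1))             # ~4.5 Gbit/s
```

Roughly 4.5 Gbit/s of sustained recording traffic, before adding the raw sensor streams themselves, gives a feel for why on-chip and chip-to-chip interconnect is such a central design concern.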

Are we there yet? Not quite, but we are getting closer. On the last day of the conference, Carnegie Mellon University’s Prof. Philip Koopman sobered up the audience with his talk “Defining Safety For Shared Human/Computer Driver Responsibility.” His keynote walked the audience through the accountability dilemma when a highly automated vehicle crashes and made some constructive suggestions for updating state laws. In their recently published essay “Winning the Imitation Game: Setting Safety Expectations for Automated Vehicles,” Prof. Koopman and William H. Widen from the University of Miami School of Law suggest that legislatures amend existing laws to create a new legal category of “computer driver,” allowing a plaintiff to make a negligence claim.

The next day, as if to make that exact point, the universe served up the situation I captured in the picture below. Can you see what’s wrong here?

Source: Frank Schirrmeister, May 2023

Yep, a pedestrian ghost in the machine.

In the technology’s defense, a jaywalking pedestrian had crossed about 30 seconds earlier, so the system probably erred on the side of caution. But still, this was a good reminder that future sensors will hopefully be better than my eyes – and a thermal sensor would have helped here, too.

All soberness and glitches aside, let’s not forget the end goal: reducing traffic-related fatalities. And as I joked in my last blog on ISO 26262 safety, “How Safe Is Safe Enough”: “If aliens arrived and assessed how to reduce traffic-related deaths, they would most certainly take humans off the streets.”

Brave new autonomous world, here we come. And I am keeping my license. That 1997 Miata doesn’t drive itself!
