Big Changes Ahead For Inside Auto Cabins

New electronics to monitor driver awareness, reduce road noise, ensure no babies or pets are left in hot cars.


The space we occupy inside our vehicles is poised to change from mere enclosure to participant in the driving experience. Whether for safety or for comfort, a wide range of sensors are likely to appear that will monitor the “contents” of the vehicle. The overall approach is referred to as an in-cabin monitoring system (ICMS), but the specific applications vary widely.

“In-cabin sensing is coming online a lot, especially with the Hot Car Act that’s going through the Senate and other NCAP regulations in Europe and Japan,” said Vikram Patel, director of product marketing at Infineon.

Much of the attention will be on the driver. “Market activities are driving things like the ability to measure heart rate, driver focus, and driver acuity,” said Chris Clark, solutions architect for automotive software and security at Synopsys.

Other sensors will assess the status of any passengers. Together, they should help to avoid catastrophic situations caused by the driver or other riders. But a fine line must be drawn between useful interventions and a sense of creepy surveillance. And, as always, security is a critical consideration.


Fig. 1: A variety of sensors will monitor the interior of the vehicle, providing safety and comfort features. Source: Infineon

Monitoring the driver
Many situations and poor practices hamper today's drivers, resulting in accidents that otherwise would not have happened. So, many of the new sensors focus on driver attention in the form of a driver-monitoring system (DMS).

DMSes aren’t necessarily new in high-end cars, but they’ve been an add-on option. “When you look at the high-end Cadillac, they’ve had a drowsy-driver monitoring system for four years,” said Tom Wong, director of marketing, design IP at Cadence. “And they’ve had an eyeball-tracking camera in the car for four years, except that it’s an option and they’re not publicly promoting it.”

Regulations are making this an important feature. “In the next update of the EU New-Car Assessment Program (NCAP) safety rating system, DMSes are called out as one of the ‘primary safety’ features that will be required to get the cherished five-star safety rating,” said Robert Day, director of automotive partnerships, automotive and IoT Line of Business at Arm.

Government mandates are likely to be accompanied by insurance-company policies to reduce the cost of accidents. “That would lower your insurance premiums,” observed David Fritz, senior director for autonomous and ADAS at Siemens EDA.

Driving today isn’t difficult, but over the course of a long, monotonous drive, it can be hard to maintain attention. Semi-autonomous driving is even harder. The “driver” is expected to be attentive while not even doing the driving – something the human mind has a hard time doing.

Until full autonomy under all circumstances on all roads is a reality — something that will take years and may never be 100% — driver distractions are even more likely to be an issue than they are today.

For this reason, various sensors will work to ensure the driver is fully engaged in the process of watching the car drive so that they can take control when needed. “The car senses when I’m not touching the steering wheel,” noted Frank Schirrmeister, senior group director, solutions and ecosystem at Cadence. “It recognizes that you are no longer engaged.”

Reasons for moving back to manual driving include dealing with unexpected events and planned exits from autonomous mode, when the task is handed back to the human. If the human isn't ready to accept the handoff, then some fallback action – like pulling over to the side of the road – will be necessary.

The most obvious thing to look for is any sign that the driver is becoming drowsy. This can be done by cameras watching the driver’s eyes, with AI deciding when the driver appears at risk of drifting off.

One measure of driver drowsiness is the amount of time the driver spends blinking. “There’s one metric called ‘percent closed’ that measures the percent of the time your eyes are open and closed,” said Amol Borkar, director of product management and marketing, Tensilica vision and AI DSPs at Cadence. “They have a threshold that says, ‘If the time open is less than 40% or 30%, you’re impaired.’”
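For illustration, a PERCLOS-style check of this kind might look like the following minimal Python sketch. The window length, the 30% threshold, and the per-frame eye-state input are assumptions for the example, not any vendor's calibration.

```python
from collections import deque

class PerclosMonitor:
    """PERCLOS-style drowsiness check: fraction of time the eyes are open
    over a sliding window of camera frames. Window and threshold values
    are illustrative, not a standard calibration."""

    def __init__(self, window_frames=900, open_threshold=0.30):
        self.frames = deque(maxlen=window_frames)  # ~30 s at 30 fps
        self.open_threshold = open_threshold

    def update(self, eyes_open):
        """Record one frame; return True once the driver looks impaired."""
        self.frames.append(bool(eyes_open))
        if len(self.frames) < self.frames.maxlen:
            return False  # not enough history yet
        open_fraction = sum(self.frames) / len(self.frames)
        return open_fraction < self.open_threshold

# Usage: per-frame eye states would come from the camera's vision pipeline.
monitor = PerclosMonitor()
for eyes_open in [True] * 200 + [False] * 700:   # eyes open only 22% of window
    drowsy = monitor.update(eyes_open)
print(drowsy)  # True
```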

This has been broadened for a more holistic assessment. “They’ve gone beyond just looking at the eye blinking,” Borkar said. “They’re looking at face orientation, probably emotions, facial temperature, and gaze direction, coming up with a more comprehensive evaluation of the driver’s state.”

More subtle is the ability to detect whether the driver appears to have disengaged from the driving task. This can be done by monitoring the driver's pupils to determine where their attention is focused.

But today’s interiors require that attention move from the front to the mirrors and the dashboard – at least while the driver is in control. Attention occasionally goes to the center stack for changing the radio station or the volume. Turns must be preceded by glances in the direction of the turn to ensure a clear path. So algorithms will need to account for normal eye movements.
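One way to picture such an algorithm: tolerate brief, expected glances at the mirrors or center stack, and flag only sustained off-road gaze. The zones and time limits in this sketch are illustrative assumptions, not a production tuning.

```python
import time

# Illustrative per-zone glance tolerances in seconds. A real DMS would tune
# these and likely condition them on speed and turn-signal state.
GLANCE_LIMITS = {
    "road": float("inf"),    # eyes forward is always fine
    "mirror": 2.0,           # routine mirror checks
    "center_stack": 1.5,     # radio/volume adjustments
    "other": 1.0,            # lap, passenger, anywhere else
}

class GazeMonitor:
    """Flags only sustained off-road gaze, tolerating normal glances."""

    def __init__(self):
        self.zone = "road"
        self.zone_since = 0.0

    def update(self, zone, now=None):
        """Feed the current gaze zone (from the eye tracker); return True
        if the dwell time in that zone exceeds its tolerance."""
        now = time.monotonic() if now is None else now
        if zone != self.zone:
            self.zone, self.zone_since = zone, now
        limit = GLANCE_LIMITS.get(zone, GLANCE_LIMITS["other"])
        return (now - self.zone_since) > limit

monitor = GazeMonitor()
print(monitor.update("mirror", now=10.0))  # False: glance just started
print(monitor.update("mirror", now=13.0))  # True: 3 s on the mirror is too long
```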

Impairment is another important factor to look for. While cameras may do the main work here, audio can help. “You can use microphones to check whether the voice is more slurred today than it was yesterday or in the morning,” said Prakash Madhvapathy, product marketing director for Tensilica audio/voice DSPs at Cadence. “Then, chances are that they’re impaired due to alcohol or something else.”

Other ways of assessing the driver’s state of mind or health include looking at their posture and monitoring their heart rate. If those go too far out of normal, it may indicate a health problem.

Meanwhile, driver comfort can be assisted through facial recognition that sets up the vehicle for a known driver. “Nowadays, you can save profiles for seat adjustment, mirror positions, and your favorite radio station,” said Borkar. “But rather than pressing a button, the camera gets a good glimpse of your face, recognizes you, and authenticates you.”
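A minimal sketch of that recognize-then-personalize flow might look like this; the driver ID, profile fields, and actuator stand-ins are all hypothetical.

```python
# Hypothetical enrolled-driver profiles; the fields are illustrative.
PROFILES = {
    "driver_42": {"seat_mm": 310, "mirrors_deg": (12, -4), "radio": "FM 101.5"},
}

def apply_profile(driver_id):
    """Apply stored settings for a recognized, authenticated driver.
    Returns False for an unknown face, so settings are left alone."""
    profile = PROFILES.get(driver_id)
    if profile is None:
        return False
    # Stand-ins for actuator commands on a real vehicle bus.
    print(f"seat -> {profile['seat_mm']} mm")
    print(f"mirrors -> {profile['mirrors_deg']} degrees")
    print(f"radio -> {profile['radio']}")
    return True

# The ID would come from the camera's face-recognition stage.
apply_profile("driver_42")
```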

These sensors also can help with hands-free controls, but how that will work is still an open question for the long term.

Watching passengers
The status of passengers doesn’t have the same impact on safety as the status of the driver. But there is one big exception to this — the presence of people or animals that can’t take care of themselves, including children, pets, or anyone with mobility or other limitations. This becomes the role of the occupant-monitoring system (OMS).

One of the most heartbreaking things that can occur in a car is the death of a neglected infant when the temperature rises too high. Similar issues occur with pets. Cracking the window open can be insufficient on a hot summer day.

A new U.S. law, called the Hot Cars Act, is in Congress now. It would require a child safety alert system in new cars. The situation is similar in the EU. “From 2022, NCAP is also awarding vehicles for having an OMS function that detects children left alone in a vehicle,” said Arm’s Day.

From an implementation standpoint, this would require two things: detecting that someone is still in the vehicle after the driver has departed, and detecting that the temperature has exceeded a safe level.

The details of the requirements are evolving. “They’re still deciding what the OEMs have to do, whether it’s ‘detect,’ ‘detect and notify,’ or ‘detect and make safe and notify,’” said Infineon’s Patel. “The carmakers are exploring technologies. For example, if the radar detects a heartbeat in the back seat and the pressure sensor in the front driver seat says, ‘There’s nobody sitting in the front seat,’ you can call the owner, the OEM, or the cops, and then lower the window or turn on the AC if there’s enough gas or battery life left in the car.”
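Patel's example amounts to an escalation ladder: detect, then notify, then make safe. The sketch below renders that logic in Python; the sensor inputs, temperature threshold, and action names are assumptions for illustration, not any OEM's design.

```python
from dataclasses import dataclass

@dataclass
class CabinState:
    vehicle_locked: bool
    driver_seat_occupied: bool   # front-seat pressure sensor
    rear_heartbeat: bool         # in-cabin radar return
    cabin_temp_c: float
    energy_available: bool       # enough fuel/battery to run the AC

SAFE_TEMP_C = 28.0  # illustrative threshold

def hot_car_response(state):
    """Return escalating actions: detect, then notify, then make safe."""
    actions = []
    occupant_left_behind = (state.vehicle_locked
                            and not state.driver_seat_occupied
                            and state.rear_heartbeat)
    if not occupant_left_behind:
        return actions
    actions.append("alert owner via app")               # detect and notify
    if state.cabin_temp_c > SAFE_TEMP_C:
        actions.append("notify OEM / emergency services")
        if state.energy_available:
            actions.append("lower windows and run AC")  # make safe
    return actions

print(hot_car_response(CabinState(True, False, True, 41.0, True)))
```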

Children have been known to wriggle out of poorly installed seats, and adults in a hurry may neglect to strap the child in or may do so poorly. This means that a comprehensive approach is needed to detect someone in the back seats of the vehicle or even behind a seat on the floor.

“If a baby is not in view of the camera and is sleeping, then if you leave the baby in there, radar can detect that there is a heartbeat in the car and warn you immediately when you lock the car,” said Ted Chua, product marketing director in the Tensilica IP group at Cadence.

While a DMS might not be needed with full autonomy, OMSes are more likely to stick around. “OMS systems also will play a part in fully autonomous vehicles where there is no driver, but the passengers could be monitored for behavior such as vandalism or opening the door when the vehicle is moving, or for potential medical conditions,” said Day.

Day summarized a wide range of other OMS applications. “Passenger anger/sadness or unruly behavior could lead to driver distraction,” he said. “And there’s smart airbag and seatbelt detection; seat, HVAC, and gesture controls; and passenger identification.”

Another innovation in sensors involves shared cars. “The sensors can tell if the car was left clean (or not), if there is an altercation ongoing or damage inside the vehicle, or even if someone left their wallet, smart phone, or charging cord,” said Walter Wottreng, vice president of automotive business development at Synopsys. “And it can notify them to go back to the car to get it.”

Cameras, radar, but not lidar
Opinions as to which sensors will be used for which applications vary widely. Considerations include redundancy, where different sensors compensate for each other’s weaknesses, especially with respect to corner cases. Cost is also a primary factor in balancing which sensors to include.

Cameras appear to predominate based both on cost and on the relatively advanced state of artificial intelligence (AI) for vision applications. “The technology behind each camera includes processing of AI algorithms, e.g., eye tracking, facial detection, object detection, and so on,” said Wottreng.

But the type and positioning of cameras may vary. For instance, a driver-monitoring camera located in the steering column or instrument panel might be a narrow field-of-view (FoV) model, while an occupant-monitoring camera located either in the center stack, within the rear-view mirror, or in the dome light might be a wide-FoV version.

“If you just want front-sensing, everybody’s very comfortable with a single front camera, centrally located either in the dash or in the rearview mirror,” said Willard Tu, senior director of automotive at Xilinx.

Radar, meanwhile, is seen by some as complementing the functions of the camera. While multiple cameras may be used, a single radar unit can penetrate all of the interior. “Radar has the ability to see around things a little bit more, so that’s why they’re talking about short-range radar in the cabin,” said Tu.

Some are relying on radar more than others. “It can detect how many passengers are in the car, whether there’s an infant on the backseat, the posture of the driver, their heart rate, and so forth,” said Robert Schweiger, director of automotive solutions at Cadence.

It also can help to locate things in the vehicle, be they people, pets, or belongings. “Imaging radar could be fairly low resolution, but still give a little bit more, because 2D is not good enough. It needs to construct a little bit of the shape of objects in the vehicle,” he said.

A single set of radar returns can be analyzed by different algorithms to provide these different applications.

While lidar may have a role for viewing the exterior of the car, there appears to be little appetite for it on the inside. It’s currently very expensive, and it doesn’t serve any killer application itself. Unlike with external use, it’s not perceived as adding any value to what cameras and radar can already do.

In addition, there’s a safety question. When lidar is used outside the car, it strikes objects – and pedestrians – briefly in passing. That’s not the case inside the cabin. “If you are in the cabin, the human has not much space to move around,” said Chua. “So is the laser safe?”

There may be a role for lidar in cargo vehicles, however, which is an application that may have a better return on investment. “You could put a lidar sensor in the top of the cargo area, where you can scan and monitor and ask, ‘Is all this cargo still in the right place?’” noted Chua.

Audio can play a role
Microphones may be employed for a number of reasons. The one most often mentioned is detecting the cry of a baby left alone in a hot car. In this role, however, they would supplement other sensors, because there's no guarantee a baby in distress would cry out.

Road-noise cancellation is another possible application. This is more complex than it seems, since internal conversations and music, as well as important external sounds like sirens, must not be canceled.

“The wind sound that makes it into the cabin has to be canceled somehow,” said Madhvapathy. “For that purpose, they sprinkle microphones inside the cabin. They mount accelerometers on the chassis and the drivetrain to capture the vibration. They correlate the motion sensed by the accelerometers with the audio signal.”
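Correlating a vibration reference with the microphone signal is the textbook setup for adaptive noise cancellation. As a rough sketch of the principle (production systems use far more sophisticated, and increasingly AI-based, filters), here is a least-mean-squares (LMS) canceller running on synthetic signals:

```python
import numpy as np

rng = np.random.default_rng(0)
n, taps, mu = 20_000, 32, 0.01

# Synthetic signals: the accelerometer senses chassis vibration; the cabin
# mic hears that vibration (colored by the car body) plus the speech we
# want to keep. All values are illustrative.
vibration = rng.standard_normal(n)                # accelerometer reference
body_path = rng.standard_normal(taps) * 0.3       # unknown chassis-to-cabin response
noise_in_mic = np.convolve(vibration, body_path)[:n]
speech = np.sin(2 * np.pi * 0.01 * np.arange(n))  # stand-in for conversation
mic = speech + noise_in_mic

# LMS: adapt a filter on the vibration reference until its output matches
# the noise picked up by the mic; the residual is the cleaned audio.
w = np.zeros(taps)
out = np.zeros(n)
for i in range(taps - 1, n):
    x = vibration[i - taps + 1:i + 1][::-1]  # most recent reference samples
    out[i] = mic[i] - w @ x                  # error = speech estimate
    w += mu * out[i] * x                     # LMS weight update

print("noise power before:", round(float(np.mean(noise_in_mic ** 2)), 3))
print("noise power after: ", round(float(np.mean((out[n//2:] - speech[n//2:]) ** 2)), 3))
```

Because the speech is uncorrelated with the vibration reference, the filter converges toward the chassis-to-cabin response and removes only the road noise.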

There are also important sounds that originate outside the vehicle. “There are microphones inside that will be listening to external sounds, as well,” Madhvapathy said. “Not just sirens, but horns, a crash, or glass breaking.”

In theory, subtracting the vibrations from the sound should be sufficient to preserve external sounds. But identifying them provides a redundant way of emphasizing important sounds using AI. “If you try to use traditional DSP techniques, that takes a lot of programming effort,” noted Madhvapathy. “A lot of these algorithms are going the AI way.”

Microphones could be used in an always-on fashion. “Microphones don’t require a lot of energy to keep on all the time,” noted Tu. “We see that with our phones already.”

Other sensors, other functions
Temperature sensors already exist in cars. Beyond those, pressure sensors in the seats can determine the presence and weight of an occupant, helping to moderate (or cancel) the triggering of individual airbags in a crash.
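As a hedged illustration of that moderation, occupant classification from seat weight might look like the following; the weight bands and deployment modes are illustrative, not a real calibration (in the U.S., this behavior is regulated under FMVSS 208).

```python
# Illustrative occupant classes from a seat-pressure reading. Real systems
# are calibrated per seat and per vehicle, and are heavily regulated.
def airbag_mode(seat_weight_kg):
    if seat_weight_kg < 5:
        return "suppress"    # empty seat or light object
    if seat_weight_kg < 30:
        return "suppress"    # likely a child or child seat
    if seat_weight_kg < 50:
        return "low-power"   # small adult: depowered deployment
    return "full-power"

for w in (2, 18, 45, 80):
    print(w, "kg ->", airbag_mode(w))
```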

Another application comes into play when an autonomous vehicle must make quick evasive maneuvers that could involve high G forces.

“If we have an inertial sensor and we know the mass of the person, then we can do things like loosen or tighten seatbelts and soften or change the angle of the seat to make some of these high-performance maneuvers not only survivable, but actually comfortable,” said Fritz.
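A rough sketch of the kind of calculation Fritz describes: take the occupant's mass from the seat sensor and the predicted lateral acceleration of the maneuver, and derive a belt pretension from the implied restraining force. The scaling factor and clamp are illustrative assumptions.

```python
def belt_pretension_newtons(mass_kg, predicted_lateral_g):
    """Scale belt pretension with the restraining force implied by the
    maneuver: F = m * a. Constants are illustrative, not production values."""
    G = 9.81  # m/s^2 per g
    lateral_force = mass_kg * predicted_lateral_g * G
    # Apply a fraction of that force as pretension, clamped to actuator limits.
    return min(0.4 * lateral_force, 600.0)

# An 80 kg passenger ahead of a predicted 0.8 g evasive swerve:
print(round(belt_pretension_newtons(80, 0.8), 1), "N")
```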

The exact configuration of sensors is expected to vary widely, and higher-end vehicles would be expected to have more than low-end versions. Exactly which sensors are best for which applications and how much sensor overlap is needed for redundancy are not settled issues.

Costs will favor the use of cameras, which are relatively inexpensive. “Right now, the primary focus will be a single camera, plus your temperature, plus your microphone as a starting point,” said Tu. “And then you either add another camera or radar.”

All sensors have certain areas that they do well – and perhaps only they can do. “But all those sensors collaborating together will give better features or service or functions to the consumer,” Chua said.

The addition of other systems will require balancing the need for better performance against the cost of the additional sensors – both in hardware and, more importantly, in the software needed to fuse and decode all of the sensor data. Some see a rich combination of sensors as the way forward.

“You don’t have to be dependent on one sensor,” added Tu. “That’s where sensor fusion is the way of the future.”

But some foresee a more minimal future to ameliorate costs. “The OEMs are totally maniacal about costs,” said Fritz. “Radar isn’t going to do anything that a cheap little $2 camera can’t do. That’s the position we’re hearing from OEMs in Germany, North America, Japan, China, and Korea.”

As to where the cabin sensor fusion will take place, that’s not entirely clear at this point. “The computing typically utilizes application processors and a combination of machine learning, GPUs, and image signal processing,” said Day. “These are usually found in a standalone SoC dedicated to the DMS function, but there is also the thought that this functionality could be linked with the ADAS/autonomous central compute as a software workload taking feeds from the internal camera system.”

The cabin processing won’t necessarily be light. “The chips and systems doing sensor fusion will need pretty fast throughput to handle data streams,” observed Wottreng.

This can raise a question similar to that affecting external sensors, regarding whether to use high-level, feature-level, or low-level fusion. “In high-level fusion, the sensor and its associated processing unit do most of the tracking work, sending the object information to the central processing unit,” said Thierry Kouthon, technical product manager for security at Rambus. “With low-level fusion, the sensors send the raw data to the central processing unit that will do all the tracking and object identification. Feature-level fusion is an intermediate stage, whereby the sensor will perform some tracking work and send features to the central processing unit.”
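Kouthon's three levels can be pictured by what each approach actually sends to the central processing unit. A sketch with illustrative payload types:

```python
from dataclasses import dataclass

# What each fusion style ships to the central processing unit.

@dataclass
class RawFrame:            # low-level fusion: raw samples, central unit does everything
    sensor_id: str
    samples: bytes         # e.g. radar ADC data or camera pixels

@dataclass
class FeaturePacket:       # feature-level: sensor does partial work, sends features
    sensor_id: str
    features: list         # e.g. detected corners, range-Doppler peaks

@dataclass
class TrackedObject:       # high-level: sensor-side tracking, sends objects
    sensor_id: str
    label: str             # "adult", "child seat", ...
    position_m: tuple      # (x, y, z) in cabin coordinates
    confidence: float

# Bandwidth to the central unit shrinks as intelligence moves to the edge
# (RawFrame >> FeaturePacket >> TrackedObject), while central-compute load
# moves in the opposite direction.
print(TrackedObject("cabin_radar_1", "child seat", (0.3, -0.4, 0.6), 0.92))
```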

Security considerations
These sensor data flows raise both privacy and security concerns. With respect to outside attacks, sensor data sent out as part of the telemetry stream could be intercepted as a means of “bugging” the driver or some other occupant.

“Let’s say it’s an organized crime or a nation-state activity that’s trying to monitor information about an individual,” said Synopsys’ Clark. “Can I utilize that camera to see what that person is saying by using facial recognition and some type of software that tells me what words are being said?”

This becomes more difficult if raw video isn't transmitted, and it's likely that bandwidth would not be spent on that. But the bigger problem is that hackers will try to find ways to use or abuse these systems in ways never conceived when the vehicle's features were designed.

“Does that camera create a security vulnerability that didn’t exist at the design phase of that vehicle?” Clark asked.

To avoid that, any communications would need to be authenticated and encrypted, regardless of how much it might feel like overkill for some piece of innocuous-seeming data. Today there may be a few communication channels to and from the vehicle that need protection, but those are likely to be consolidated into one channel in the future. “We use the term ‘over-the-air updates,’” said Clark. “But it’s going to be the primary transport for any data in the vehicle.”
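As a sketch of what authenticated-and-encrypted means even for innocuous telemetry, here is an AES-GCM example using Python's cryptography package. The key handling and message format are illustrative; a real vehicle would anchor the key in a hardware security module.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# In practice the key lives in an HSM/secure element, not in code.
key = AESGCM.generate_key(bit_length=256)
aead = AESGCM(key)

telemetry = b'{"cabin_temp_c": 24.5, "occupants": 2}'
header = b"vin:EXAMPLE|channel:telemetry"   # authenticated but not encrypted

nonce = os.urandom(12)                      # must never repeat for a given key
ciphertext = aead.encrypt(nonce, telemetry, header)

# Receiver side: decryption fails loudly if either field was tampered with.
assert aead.decrypt(nonce, ciphertext, header) == telemetry
```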

Within the vehicle itself, a device could be attached to the chassis to acquire data through side-channel or network snooping. This is possible where unsecured CAN buses are in use, but if Automotive Ethernet is used throughout, security should make that a more difficult proposition.

Meanwhile, vehicles will increasingly be provisioned with the hardware necessary to implement a wide array of optional features. The OEMs can then sell or rent those features, using software to enable or disable them.

The threat from the driver is that they will find a way to hack into the system to enable features for which they have not paid. Aftermarket tools may play a particularly pernicious role here, with experts learning where the security vulnerabilities lie so they can monetize them by unlocking features for vehicle owners.

“The aftermarket is notorious for figuring out how manufacturers have implemented their security and finding ways around that in order to provide a service that consumers are looking for,” said Clark.
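One hedged way to resist that kind of tampering is to gate each feature on a token signed by the OEM's backend rather than on a local flag. A minimal HMAC-based sketch (a real deployment would use asymmetric signatures and secure key storage):

```python
import hmac, hashlib

# Secret known to the OEM backend and, ideally, a secure element in the car.
OEM_KEY = b"demo-key-not-for-production"

def sign_entitlement(vin, feature):
    """OEM backend: issue a token tying a feature to one vehicle."""
    msg = f"{vin}|{feature}".encode()
    return hmac.new(OEM_KEY, msg, hashlib.sha256).hexdigest()

def feature_enabled(vin, feature, token):
    """Vehicle side: enable the feature only if the token verifies."""
    expected = sign_entitlement(vin, feature)
    return hmac.compare_digest(expected, token)

token = sign_entitlement("EXAMPLEVIN123", "heated_seats")
print(feature_enabled("EXAMPLEVIN123", "heated_seats", token))     # True
print(feature_enabled("EXAMPLEVIN123", "heated_seats", "forged"))  # False
```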

Conclusion
The fact that decisions are still being made on which sensors to use for what indicates that it will be some time before they hit the road. The designs for vehicles intended for sale in the next few years are already complete.

For systems like the EU-required DMS, the mandates require availability in the next few years. But implementations are expected to be minimal, meeting the needs of the regulations but not much more. “You’ll start seeing a lot of this stuff in 2025,” Fritz predicted.

At that point, the driving experience may change dramatically. Consumer response will be interesting to watch. Will they see this as making their travel safer and more comfortable? Or will they resist what could feel like micromanagement of a task that prior generations had performed with much greater freedom?

That will depend both on the details of the implementations and on how much say the driver has in how – or whether – they work.



