What Else Can You Do While Driving A Car?

New vehicle designs will include many features determined by the end customer, and the list grows longer as vehicles become more autonomous.


Increasing levels of autonomy in vehicles are driving increased demands for new technology. Consumers care about the electronics in their vehicles, for both safety and convenience, and those features are impacting both purchase decisions and new vehicle designs.

As vehicles grow in sophistication with advanced driver assistance, electrification, or alternative fuel sources, personalized vehicles will become the ultimate consumer devices. And when full autonomy eventually reaches the mainstream, users will care even more about the driving experience because they won’t have to focus on the road.

“When customers drive, they want to be safe, and they want the same driving experience as they would normally, but with a cell phone and computer,” said Susie Gao, senior director of global application marketing and management at Infineon Technologies. “This requires significant changes in the vehicle E/E architecture, from separated ECU systems to a centralized ‘service-oriented’ cockpit system, which adds more high-resolution displays, sensing functions, and premium audio systems. This means OEMs and tiered automotive suppliers must have access to reliable and highly efficient solutions for the cockpit, including human-machine interface sensing, secure wireless SoCs, power supplies, as well as high-performance controllers and memories.”

Driving is now a high-input environment, much like the cockpit of a plane. The difference is that roads have a lot more traffic than the sky, with many more possible and often unpredictable interactions.

“Humans are prone to error,” said Simon Rance, vice president of marketing at Cliosoft. “But with a high-input environment like today’s and tomorrow’s vehicles, safe driving is becoming a priority for automotive systems. This is one of the main reasons why the user experience is driving automotive systems requirements going forward, and it’s happening in two key ways — minimizing driver distraction, and supporting the driver with on-hand information and an enjoyable experience. User experience design is an important factor in automotive systems because it helps avoid undesirable effects, such as driver distraction or loss of control when operating a vehicle in autonomous mode. When designing an autonomous car, designers have to consider safety features that make sure drivers are not easily distracted, or forgetful about driving altogether. To prevent these situations from happening, designers must design systems that encourage focus on driving, even when outside factors like music might distract them from the task at hand.”

How this evolves remains unclear at this point. “In the far future, it will be really nice when everything is fully autonomous, everything is connected, and all the cars have the capability to talk to each other,” said Steven Woo, fellow and distinguished inventor at Rambus. “With that type of experience, there is a lot less to worry about, and the variability of having so many humans involved in every situation goes away. The nearer-term future is difficult because there will be some cars that are going to be autonomous, some cars that aren’t, and there’s a transition period. So things have to be pretty sophisticated.”

Idle time in a moving vehicle
Part of what’s fueling all the complexity around autonomous driving is figuring out what a vehicle’s occupants can do when they’re not actually driving the vehicle. How do consumers want to spend idle time in transit?

“These two halves are what we think about from the performance side,” Woo explained. “How should the autonomous vehicle requirements be dealt with? These demand a lot of bandwidth, and a lot of compute capability. The other part is what do the occupants do, and how do we entertain them? Video is a big one, and we’re already seeing video in cars, in the entertainment units themselves. They’re not small in terms of the requirements, especially memory and the interconnect performance, because you’ve got video feeds and the car is a kind of IoT device. It’s both an endpoint and a hub. There’s connectivity that could become very useful if you get just pervasive connectivity. Whatever movies you might be streaming, or games you might be playing, you might be interacting with people in other cars, and have spontaneously generated networks that form. Here, the question is what people are going to want to do. It’s pretty safe to say people are going to want to do the kinds of things they’re already doing on things like PCs, posting or playing games. Or we’re going to want to get news, and talk to people to pass the time. With some of the requirements we see in devices like phones or computers, we already have a really good idea of what may be required.”

The starting point for this shift is establishing key performance indicators. “For example, one of these is measuring the momentum of the passengers versus the momentum of the vehicle with an inertial measurement,” said David Fritz, senior director for autonomous and ADAS at Siemens Digital Industries Software. “If the delta of the inertial motion between the passengers and the vehicle is too great, that’s not very comfortable. You’ll hit your head on the steering wheel or on a window or the dashboard. Inertial monitoring of the occupants, as well as the vehicle, is an important new thing, and this is starting to be designed into seats whereby the seat is measuring the difference between the mass of you at rest and the mass of you as a vehicle makes a change in its momentum. That shows how much you’re pitching forward, or you’re pressed back or off to the side, and sensors are becoming available to do that. Then, the requirements from the OEMs themselves — which are then fed into the ADAS and the AV decision-making process as one more input as the vehicle is deciding to either ignore your actions as a driver, override them, or adhere to them — that’s part of what’s getting factored in.”
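The comfort metric Fritz describes can be pictured as a simple comparison of two accelerometer streams. The sketch below is purely illustrative — the sensor values, function names, and threshold are assumptions for this example, not part of any OEM specification:

```python
# Hypothetical sketch of a ride-comfort KPI: compare the vehicle's measured
# acceleration against the occupant's (e.g., from a seat sensor) and flag
# samples where the two diverge too much. Threshold is an assumed value.
import math

COMFORT_THRESHOLD_MS2 = 2.5  # assumed acceptable acceleration delta, m/s^2

def accel_delta(vehicle_accel, occupant_accel):
    """Magnitude of the difference between two 3-axis accelerations (m/s^2)."""
    return math.sqrt(sum((v - o) ** 2 for v, o in zip(vehicle_accel, occupant_accel)))

def comfort_flag(vehicle_accel, occupant_accel, threshold=COMFORT_THRESHOLD_MS2):
    """True when occupant motion diverges from the vehicle's beyond the threshold."""
    return accel_delta(vehicle_accel, occupant_accel) > threshold

# Vehicle brakes hard while the seat sensor shows the occupant still moving forward.
print(comfort_flag((0.0, -6.0, 0.0), (0.0, -2.0, 0.0)))  # large delta -> True
```

In a real system this delta would be one of many inputs feeding the ADAS/AV decision-making loop Fritz mentions, alongside the OEM’s own requirements.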

At the same time, sensitive and inexpensive cameras are being spread throughout the interior of vehicles, Fritz said. “The value of that is having a better understanding about what the occupants are doing, and we’ve seen that already with driver monitoring systems. If you start to nod off, it’ll know. That’s just simple stuff. But knowing the kids are in the backseat playing Monopoly or something like that, that’s something different. The decision-making process, and how you react to situations, can use those inputs.”

Adding new features isn’t always so straightforward, though. “When we start talking about the design of these intelligent vehicles, there are many different ways to ‘design and implement’ the intelligence,” he said. “Some of them are more appropriate for taking these types of inputs than others. For example, if an OEM has a mathematically rigorous methodology for trying to prove safety, but implementing those mathematics in silicon is incredibly difficult even though it has a predefined set of inputs that it’s working with, then how do they add ‘customer comfort’ into the mix? On the other hand, an AI- and machine learning-trained system can easily adapt to the additional inputs. But then, how do you actually prove they are safe, because they’re not mathematically rigorous? No one has found that balance. You’re going one way or the other. And in the end, we’re going to find out which one adapts the quickest.”

Further, Cliosoft’s Rance noted, to avoid the undesirable effects resulting from autonomous cars, designers also must ensure that their vehicles do not cause any loss of control when operating autonomously. “One way this can be achieved is by designing interfaces that allow drivers to switch back to manual mode without making them completely rely on technology for every small task along the way.”

But tracing those requirements is very complex, and safety is the first priority.

“When you’re doing functional safety, or when you’re doing quality in general, you’re going along that path with different requirements in the architecture,” noted Benoit de Lescure, CTO at Arteris IP. “In design, this is coding, development modeling, EDA unit tests, EDA all the way through integration to QA. You’re going back and forth along this line, such that when you make a change and say you’re not going to test something that you should have tested, that requirement now has to go all the way along the line. You must make sure the design people know about it. ‘The spec has changed, the requirements changed, and we’re no longer going to do that requirement.’ It also happens that the development team may decide on its own not to do something because of the schedule. How does that affect the architecture? Do you still meet the requirements? Do the people on the test team know not to test for it? And because you’re going back and forth, when the QA team is doing the QA testing, are they basing it off the requirements? ‘At the beginning, we said we were going to do this. Did we do that?’ The integration team is working off of the product architecture and unit model subsystem tests, looking at the design spec and asking whether they have accounted for everything.”

So how do you make sure that everything you discover along the way is reflected at every other point?

“That’s the big problem,” de Lescure said. “There are gaps between the different systems used, which may have APIs you can use to dump information into Microsoft Word or FrameMaker, if you make changes in that information. But it’s not necessarily automatically reflected back in your requirements system. As a result, these gaps are usually handled by manual inspection. What that means is meetings. What’s needed is a tool that reflects if a change is made in the architecture document, linking the requirements to what’s going on in that architecture document, which can say at a high level, ‘The requirements changed.’ Then, at the detail level, it can say what particular changes were made to the requirements.”
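The traceability gap de Lescure describes boils down to knowing which downstream artifacts reference a requirement that just changed. The toy model below sketches that idea — the artifact names, requirement IDs, and data layout are invented for illustration, not drawn from any real traceability tool:

```python
# Illustrative toy model of requirements traceability. Each artifact
# (architecture doc, design spec, QA test plan) records the requirement
# IDs it depends on; when a requirement changes or is dropped, every
# linked artifact must be notified. All names here are invented.
artifacts = {
    "architecture_doc": {"REQ-1", "REQ-2", "REQ-3"},
    "design_spec": {"REQ-1", "REQ-3"},
    "qa_test_plan": {"REQ-2", "REQ-3"},
}

def impacted_artifacts(changed_reqs, artifacts):
    """Return the artifacts that reference any changed or dropped requirement."""
    return sorted(name for name, reqs in artifacts.items() if reqs & changed_reqs)

# The team drops REQ-3 for schedule reasons; every linked artifact is flagged.
print(impacted_artifacts({"REQ-3"}, artifacts))
# -> ['architecture_doc', 'design_spec', 'qa_test_plan']
```

In practice the hard part is not this lookup but keeping the links themselves current across Word, FrameMaker, and the requirements system — which is exactly where the manual meetings come in.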

Understanding user requirements is becoming essential in the automotive space. Technology helps drive brand awareness and helps consumers make informed purchase decisions, according to a consumer study conducted by Cadence.

“One of the things we asked end users is how important these technologies are for their purchase decisions,” said Frank Schirrmeister, senior group director for Solutions & Ecosystem at Cadence. “Sometimes these things actually influenced the purchase decision. There are three elements of importance when it comes to user experience. First, in general, but in automotive specifically, there are proactive features the automotive manufacturer puts in, which are selling features. Second, there are the indirect features that are not as obvious to the user. They’re not deliberately hidden, but the user might not be aware of them because they’re not in your face. The third element, which probably revolves a lot around data, is the notion of what happens when a user realizes what is going on with their data and is unhappy about it.”

Fig. 1: Hierarchy of technology valued by consumers. Source: Cadence


The ultimate purchase decision comes down to confidence, cooperation and convenience, Schirrmeister said. “Does a feature like a heads-up display clearly give me more security because I’m not distracted looking at everything? Is that really worth the additional investment? Some of the features are in your face, others you realize over time. There are others that are hidden and you don’t realize they’re there. We use our car for one thing, and then get an email at the end of the month with a report on our trips, driving habits, and where our vehicle is parked. It is for good intentions, like being able to share this data with the dealer so I can be informed when something needs to be repaired. But when it comes to insurance, do I really want the insurance company to know that I drove through a stoplight? No. These are the things that potentially can have a post-realization experience, and if you don’t think about them carefully, such as what happens with the data, then you might have an adverse reaction to it afterwards.”

As cars become more connected, they are becoming more like sophisticated consumer electronics. “This means cars require a much more advanced user experience,” said Cliosoft’s Rance. “The way we use our cars is changing, and it’s changed by the way we use our smartphones. You can think of these developments as an evolution from task-focused to goal-focused design. User experience is driving the systems requirements to ensure that vehicles become easier and more intuitive to use, while the systems become more complex and advanced. This, in turn, will drive collaboration and systems requirements between automotive system designers and big tech. Automotive system designers know how to design automotive systems, while big tech knows how to design the user experience for technology visually (screen, HUD), by voice, and by touch.”

At the same time, all of these features require more security. “The more connectivity you have, the more streams you have coming in, and especially the more diverse sets of things you have to rely on coming in and out of the car, the greater the security requirements,” said Woo. “And the fact that it’s a software-defined vehicle, where you can do software over-the-air updates and things like that, further increases the need for security.”
