Knowledge Center

Autonomous Vehicles

Description

SAE International and the International Organization for Standardization (ISO) define six levels of driving automation, from 0 to 5. The key difference between L3 and L4 is that L3 is conditional driving automation: it provides many autonomous functions, but the driver is expected to take over when necessary. At L4, the automation handles all driving with no expectation of human intervention, so moving from L3 to L4 is a major jump. And while it is tempting to modify or extend existing L3 designs, designing L4 vehicles from the ground up is better. OEMs are pursuing convenience and comfort with L4, but above all they want to achieve safe autonomous driving.
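The level taxonomy above can be summarized in a small lookup table. The sketch below is an illustrative paraphrase of the six levels, not normative text from the SAE J3016 standard; the short descriptions and the helper function are the author's own simplification.

```python
# Illustrative summary of the six driving-automation levels (paraphrased,
# not normative SAE J3016 wording).
SAE_LEVELS = {
    0: ("No automation", "human drives at all times"),
    1: ("Driver assistance", "human drives; one assist feature active"),
    2: ("Partial automation", "human supervises; steering and speed automated"),
    3: ("Conditional automation", "system drives; human must take over on request"),
    4: ("High automation", "system drives in its design domain; no human fallback"),
    5: ("Full automation", "system drives everywhere, under all conditions"),
}

def human_fallback_expected(level: int) -> bool:
    """True if the human driver is still the fallback at this level (L0-L3)."""
    return level <= 3

print(human_fallback_expected(3))  # True: at L3 the driver must take over when asked
print(human_fallback_expected(4))  # False: L4 has no expectation of intervention
```

The `human_fallback_expected` boundary is exactly the L3-to-L4 jump described above: it is the last level at which a human is part of the safety case.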

As the industry gears up for Level 4 vehicles, autonomous driving will likely arrive in stages of L3+. More autonomous features are being added to high-end vehicles, but getting to full autonomy will likely take years of additional effort, a slew of new technologies (some not in use today, and some involving infrastructure outside the vehicle), and sufficient volume to bring the cost of these combined capabilities down to an affordable price point.

In the meantime, many of the features that will be included in autonomous vehicles are likely to be rolled out individually.

For L4/L5 vehicles, the advanced driver assistance system (ADAS) is the key autonomous driving technology, and as vehicles move from L3 to L4/L5, ADAS complexity will increase significantly. For example, ADAS designs will need to see what is in front of them and to understand what they are seeing in the context of their environments.

The ultimate objective of autonomous driving is to use the ADAS to perform all the necessary functions normally done by human drivers. Some of these functions include driving directions using GPS, object detection including pedestrians and other obstacles, safe lane changing, traffic sign and vehicle recognition, acceleration and deceleration (adaptive cruise control) with automatic emergency braking, and automatic turning with signals.

Other challenges and considerations in designing L4 include software, simulation, new technologies, and testing. Software remains a critical factor in L4 and L5, but as it grows to many millions of lines of code, it also requires robust processes for software development, test, verification, and updates. Simulation will continue to be an important area because it is not practical to road test every version of an L4 design. For the most part, simulation also relies on software, although test models, including AI and digital twins, still need to be developed.

While it is beneficial to add new technologies and improvements to an existing L3 design as a pathway to L4, achieving true L4 ultimately will require a completely new architecture. The ADAS needs to be several times more powerful than L3's, and new sensors must be installed to increase the vehicle's understanding of its environment. It takes more than detecting what is in front and applying emergency braking when necessary. L4 needs to know what to do if there is a roadblock, an accident, a detour, and the like.

Because the requirements for each level are determined by operational capabilities, there is no textbook definition or specification on the number of required sensors. Additionally, advanced testing will be required to ensure L4 is functional and safe.

Many factors impact day-to-day driving, but there are four main AV design challenges:

  • Driving conditions. Weather, road, and traffic conditions directly impact autonomous driving. Making the right decisions to respond requires a faultless integration of sensors, ADAS, and AI software.
  • Limitations of available autonomous technologies. Even though AV technologies have made great improvements over the years, they still contain flaws and vulnerabilities. The safety factor plays a very important role here, and OEMs are facing constant challenges to achieve higher reliability and safety. Additionally, AV technologies will need enormous compute power.
  • Regulatory and legal implications. Human drivers employ common sense. For example, when a human driver sees a police officer at the center of an intersection waving their hands to direct traffic, the automatic reaction is to slow down and follow instructions. Likewise, when driving on a narrow road under construction or repair, humans reduce the speed of the vehicle when observing a “slow” sign held by a worker. Will AVs be able to make a distinction between a police officer’s hand signals and a pedestrian crossing the road while waving to a friend?
  • Lack of smart infrastructure, including V2X. Perhaps the most difficult aspect of L4 driving is the lack of smart infrastructure. Operating safely and reliably requires a fully functional L4 design working harmoniously with the external environment. It is not enough to design a very smart L4 vehicle that functions faultlessly. To avoid crashes, L4 AVs need the support of smart infrastructure and vehicle-to-everything (V2X) communication.

New technologies, including ADAS, will continue to develop and improve, while new sensors and vehicle architectures evolve. More testing is needed to ultimately achieve autonomous vehicle safety and reliability, including achieving Automotive Safety Integrity Level (ASIL) D and ISO 26262 certification.

The ISO 26262 standard focuses on the functional safety of road vehicles under a variety of conditions, such as extreme temperatures, unexpected vibration, or an unavoidable collision. Similar concerns apply to everything from drones to aerospace and robotics, where increasing levels of autonomy can easily turn a moving object into a safety hazard. ISO 26262 acts as a blueprint for best practices, including assessing what can go wrong and how to either fix it or at least ensure that an autonomous machine fails gracefully, without injuring anyone or causing unexpected damage. As the automotive industry gradually moves toward full autonomy, silicon also plays an increasingly vital role, factoring into everything from infotainment to braking and guidance.

AI presents another challenge. Nearly every new vehicle sold uses AI to make some decisions, but so far there is no consistency in what is being developed, where it is being used, and whether it is compatible with other vehicles on the road. While carmakers typically adhere to standards such as ISO 26262, ASIL A-D, and AEC-Q100, there is a lot of technology that falls outside of those standards. And because AI is being used in many applications within the car, there will be different AI algorithms and AI graphs used depending on the specific application. To bring this new technology to market, algorithms need to be trained in a vehicle under real workloads, which can vary greatly depending on the type of inferencing chips or accelerators.

ADAS relies on sensors to collect data, on AI algorithms, and on electronic control units (ECUs) to process the data and to activate the emergency braking system (EBS) when necessary. If everything works as planned, ADAS should perform much better than human drivers, without accidents. Two things need to happen to prevent autonomous vehicles from crashing. The first is detection and identification of hard objects on a collision course with the AV. The second is real-time EBS activation.

The AI algorithm encompasses a number of variables, including the distance and speed of the vehicles or objects in front of the AV. If an object is moving, the algorithm must determine how much time will elapse before the AV collides with it if the AV does not slow down. It also needs to consider weather and road conditions, and whether the objects ahead are moving vehicles, pedestrians, animals, or other random moving objects.
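The core of that calculation is time-to-collision: the gap to the object divided by the speed at which the AV is closing it. The sketch below is a deliberately minimal illustration of that arithmetic and of a braking trigger; the 2-second threshold and the road-condition multiplier are assumed values for illustration, not parameters from any production ADAS.

```python
def time_to_collision(gap_m, av_speed_mps, obj_speed_mps):
    """Seconds until impact if neither vehicle changes speed.

    Returns None when the AV is not closing the gap."""
    closing_speed = av_speed_mps - obj_speed_mps
    if closing_speed <= 0:
        return None  # object is pulling away or matching speed
    return gap_m / closing_speed

def should_brake(gap_m, av_speed_mps, obj_speed_mps,
                 threshold_s=2.0, road_factor=1.0):
    """Trigger emergency braking when TTC falls below a threshold.

    road_factor > 1 models longer stopping distances on wet or icy roads.
    Both parameters are illustrative assumptions, not real calibration values."""
    ttc = time_to_collision(gap_m, av_speed_mps, obj_speed_mps)
    return ttc is not None and ttc < threshold_s * road_factor

# AV at 20 m/s closing on a vehicle at 10 m/s, 30 m ahead: TTC = 3 s
print(time_to_collision(30.0, 20.0, 10.0))             # 3.0
print(should_brake(30.0, 20.0, 10.0))                  # False on dry roads
print(should_brake(30.0, 20.0, 10.0, road_factor=2.0)) # True on slick roads
```

The `road_factor` argument captures the point above: the same geometry demands an earlier braking decision when weather or road conditions lengthen the stopping distance.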

Digital twins can be used to add confidence that the physical vehicle will perform as its digital twin does in situations that are too difficult or too dangerous to test with physical vehicles. Another way to increase safety is with AI-based cameras. Deploying AI used to be expensive and to consume a great deal of power. That is not necessarily the case for inference chips, particularly once the algorithms are trained.

Another major hurdle for fully autonomous driving is 5G network speed. 5G has been touted as a superfast network meant to enable almost instant information transfer between vehicles and networks, with data traveling at a theoretical peak of 10 Gbps. That would help enable autonomous driving. Today, however, the best typical 5G speed in North America is only a few hundred Mbps, a small fraction of the theoretical peak. There are many reasons for this, including the short range of the high-frequency towers needed for line-of-sight communication. And 5G infrastructure investment is not a trivial matter.
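The gap between the theoretical and the real-world rates is easy to quantify. The sketch below assumes a hypothetical 25 MB payload of fused sensor data per decision cycle (an invented figure for illustration) and compares transfer times at the two speeds cited above:

```python
# Hypothetical payload: 25 MB of fused camera/lidar data per decision cycle.
# The payload size is an assumption for illustration, not a measured figure.
payload_bits = 25 * 8 * 10**6  # 25 MB expressed in bits

def transfer_ms(link_bps):
    """Milliseconds to move the payload over a link of the given bit rate."""
    return payload_bits / link_bps * 1000

print(round(transfer_ms(10 * 10**9), 1))   # 10 Gbps theoretical peak: 20.0 ms
print(round(transfer_ms(300 * 10**6), 1))  # 300 Mbps real-world: 666.7 ms
```

At the theoretical peak the transfer fits comfortably inside a real-time control loop; at today's real-world rates it takes well over half a second, which is far too slow for the instant vehicle-to-network exchange 5G was supposed to provide.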

A final challenge is mapping accuracy. Autonomous cars find their way to a destination using a number of critical technologies, including some version of a global positioning system and a central brain to interpret that and other data. But many of those technologies are not reliable or accurate enough today, and may not be for years to come.

While most people are familiar with the global positioning system (GPS), it is one of several global navigation satellite systems (GNSS), which also support many other mapping and surveying applications. Many OEMs develop or acquire mapping technologies to support ADAS. Technologies used for localization in AVs include HD mapping, 3D mapping, multilayered digital maps, and real-time mapping solutions. Today, most OEMs are investing in autonomous driving and associated mapping technologies. This will be essential in the longer term, but it is also a competitive selling point in the near term.

But perhaps the biggest challenge for OEMs is integrating all of these technologies, both initially and over the lifetime of a vehicle. Software-defined vehicles, centralization of ECUs, mastering wireless technologies, and working with various sensors will remain enormously complex and challenging. And while all of these new features can improve safety and convenience, there is still a long way to go.

Traditional OEMs recognize the need to adapt quickly to remain competitive. Some are creating separate divisions for autonomous solutions to facilitate faster implementations of new technologies without being hindered by existing organizational structures.

 
