Faster Commoditization In Cars

As automakers rev up autonomous driving, cost cutting needs to happen at an unprecedented pace and scale—even for new technologies.

Sensors are at the heart of assisted and autonomous driving, but even before these devices hit the road, their average selling prices will have to fall far enough to make them affordable to a mass audience.

Achieving economies of scale is what has made the semiconductor industry successful over the past half century. It has enabled semiconductors to proliferate and electronics to advance. But typically this march toward commoditization happens after devices have been in the market for a while, when processes and flows are mature enough to sustain that kind of cost cutting. In the automotive industry, prices need to drop before some of this new technology hits the market.

Light detection and ranging, aka LiDAR, is a case in point, and it is emerging as something of a poster child for this effort to slash costs. This remote sensing technology measures distance by illuminating a target with a laser light source, and until now it has involved multiple chips and technologies. But as the use of LiDAR in the automotive industry gains momentum, there is a growing effort to reduce the cost while also increasing the range.

This effort is similar to what has happened in other technology areas, such as PCs and smart phones, where multi-chip solutions were integrated onto a single die in order to cut costs. But all of that occurred after supply chains, methodologies, flows and processes were reasonably mature. This is all being accelerated in automotive as big automakers race toward autonomous driving.

To achieve their goals, automakers are looking to cut costs everywhere, and the effort to turn LiDAR into an integrated, inexpensive solid-state solution is at the front of the pack.

“Solid-state LiDAR helps in this regard by providing an embeddable sensor at a very low cost,” said Ranjit Adhikary, vice president of marketing at ClioSoft. “But along with LiDAR there are a lot of associated sensors needed for self-driving cars, such as pedal pressure sensors, torque sensors, accelerometers, seat occupancy sensors, and rotational speed sensors, which aid in a number of applications such as pedestrian detection, air intake humidity measurement, night vision systems, and hands-free calling.”

This is like commoditization on steroids. It requires coordination of efforts across an ecosystem on a scale not seen before with any technology development.

“As with any new technology, there are always a number of fast-moving changes and innovations as each company strives to be ahead in the race,” Adhikary said. “From the standpoint of a company providing the various MEMS and sensors to the automotive industry, it becomes important to better manage the associated data for each component along with their numerous versions, the associated design data, PDKs and tracking the usage.”

The challenge is aligning aggressive rollout schedules with cost-cutting. According to multiple reports, both Ford and Toyota have said they may skip Level 3 of SAE’s six levels of automation (0 through 5) and move directly to high automation and full automation.


Fig. 1: Automation levels. Source: SAE

“Though the costs involved in deploying LiDAR in vehicles are still somewhat prohibitive, price points will continue to fall over the coming years,” said Donna Yasay, vice president of worldwide business development at Marvell. “As we move from ADAS technology through to semi-autonomous and then eventually fully-autonomous driving, an array of new, higher-performance, next-generation sensor mechanisms will need to be employed. Conventional optoelectronics (such as IR or visible light) won’t be able to deliver the high degree of detail needed to ensure that objects are always correctly identified even when accompanied by the most sophisticated of image recognition software. The rendering of 3D images using LiDAR will prove to be much more effective, providing a more comprehensive and constantly updated representation of the environment around the vehicle with greater accuracy and extended range.”

Accelerating schedules everywhere
This isn’t as simple as just playing one semiconductor vendor off against another or reducing the number of chips, though. Unlike in the past, when electronic control units were largely independent systems, an autonomous vehicle is a complicated network of integrated systems. That requires a high-level architecture, which must be modified continually to adjust to changes throughout the rest of the system, particularly around the flow of data collected by sensors and processed both locally and centrally.

“If you break it down, LiDAR is a sensing system,” said Jeff Hutton, senior director of the automotive business unit at Synopsys. “So you need intelligence for the sensing part and you need activation. But the question is whether you take in all the sensor data and process it for LiDAR, or whether you feed all of that data into a master fusion system and process it in some main processor in the car.”

So far, there is no consensus on the best approach for each type of sensor. “There are a lot of sensors in a car,” Hutton said. “There’s a lot of attention being paid to LiDAR, radar and other vision sensors, but there also are sensors for speed, temperature inside and out, and many other things. A main processor is a more efficient approach, but the question then is how you distribute everything. You don’t just need to sense it. You also need to send information back out about when to brake, steer, or turn down the volume. The integration of electronics is significant compared to cars today.”
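The architectural choice Hutton describes can be made concrete with a toy comparison of the two topologies. Everything below is an illustrative assumption, not a figure from the article: the frame size, the detection record size, and the `Detection` fields are invented for the sketch.

```python
# Toy comparison of sensor-local ("edge") processing versus forwarding
# raw data to a central fusion processor. All sizes are assumptions.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g. "pedestrian"
    range_m: float      # distance from the vehicle
    bearing_deg: float  # angle relative to the heading

RAW_FRAME_BYTES = 4_000_000   # assumed raw LiDAR frame size
DETECTION_BYTES = 32          # assumed compact object record

def detect_objects(raw_frame: bytes) -> list:
    """Stand-in for sensor-local intelligence; a real front end would
    run detection algorithms over the point cloud here."""
    return [Detection("pedestrian", 30.0, 20.0)]

detections = detect_objects(b"\x00" * RAW_FRAME_BYTES)

edge_bytes = len(detections) * DETECTION_BYTES   # ship object list only
central_bytes = RAW_FRAME_BYTES                  # ship the whole frame

print(f"edge topology sends {edge_bytes} bytes per frame")
print(f"central topology sends {central_bytes} bytes per frame")
```

The sketch captures the trade-off: local intelligence cuts network traffic by orders of magnitude but duplicates compute at every sensor, while the central approach preserves raw fidelity for fusion at the cost of multi-gigabit links.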

Further, where that processing gets done isn’t clear. “It is generally accepted that automated driving requires huge computing capacities. What remains subject to discussion, however, is which processor architecture is best suited to run the object identification and sensor fusion algorithms that enable computers to drive vehicles,” said Thomas Wong, design engineering director at Cadence.

From a system implementation standpoint, regardless of whether it is radar, laser, camera or ultrasonic sensors, there is a fair amount of data processing needed for the respective sensing, analyzing and interpretive systems, he said. “We are not talking about using server farms in the cloud or heavy-duty GPUs (a supercomputer in a car), but rather embedded, low-power DSP processors that will do this processing in real time in an integrated SoC.”

Along with that, cache coherency and multiple fast interconnects are mandatory. The reason is the need for memory space in combination with multiple multi-core clusters, which are essential to run the specific algorithms used to build a model of the physical environment in front of the LiDAR sensor, noted Juergen Jagst, senior segment manager, automotive at Arm. “Additional groups of dedicated accelerator units will help to cope with the enormous data stream, which represents the ever-changing objects in front of the moving vehicle.”

The technology challenge
How to best achieve that with LiDAR isn’t clear yet. There are several different approaches, each of which requires integration of varying levels of software and silicon performance, according to Amin Kashi, director of ADAS and autonomous driving at Mentor, a Siemens business.

“Scanning LiDAR technology must support the processing of about 1 gigabyte of data per second from the LiDAR front end, and therefore high-performance FPGAs have been the most common choice for back-end intelligence in these types of sensors,” Kashi said. “On the other hand, 3D Flash LiDAR technology receivers are usually CMOS sensors, which are capable of handling much of the processing load on the front end. Hence, with 3D Flash LiDAR, back-end processing can be handled using far simpler CPUs.”
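Kashi's 1 gigabyte-per-second figure can be sanity-checked with back-of-envelope arithmetic. The point rate, bytes per point, and oversampling factor below are illustrative assumptions chosen to land near that figure, not vendor specifications.

```python
# Back-of-envelope data rates for the two LiDAR styles Kashi contrasts.
# All constants are illustrative assumptions, not vendor specifications.
POINTS_PER_SECOND = 2_200_000   # assumed dense scan pattern
BYTES_PER_POINT = 16            # x, y, z, intensity as 32-bit values
RAW_SAMPLES_PER_POINT = 30      # assumed raw waveform samples retained

# Scanning LiDAR: the back end sees the raw sample stream.
raw_rate = POINTS_PER_SECOND * BYTES_PER_POINT * RAW_SAMPLES_PER_POINT
print(f"scanning LiDAR back end: {raw_rate / 1e9:.2f} GB/s")

# 3D Flash LiDAR: the CMOS front end reduces the stream to a point
# cloud, leaving the back end a far smaller processing load.
cloud_rate = POINTS_PER_SECOND * BYTES_PER_POINT
print(f"3D Flash back end: {cloud_rate / 1e6:.1f} MB/s")
```

Under these assumptions the raw stream lands at roughly 1 GB/s while the pre-processed point cloud is around 35 MB/s, which is why the former points toward FPGAs and the latter toward simpler CPUs.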

Moving forward, he expects to see more autonomous vehicle designs based on innovative, centralized processing approaches whereby LiDAR data is captured in raw data form. “These approaches minimize the need for integration of front-end processing. In centralized systems, the raw data is transported in real time to a centralized processing platform, thereby reducing the cost and size of front-end sensors while enabling optimal, real-time situational awareness and faster automated driving performance,” he said.

Achieving this at what automakers consider an acceptable price point won’t be easy, however, and it will require commoditization that stretches well beyond technologies such as LiDAR.

“ECUs within the vehicle will need to be able to deal with the capturing, processing and subsequent reacting to the received imaging data by initiating braking or carrying out an evasive maneuver without latency issues arising,” said Marvell’s Yasay. “And depending on the resolution involved, LiDAR systems could be generating over a million data points with each scan they make. Although in-vehicle networking has been established for a long time, it has never had to cope with such huge quantities of data before. Furthermore, existing protocols such as CAN, FlexRay, among others, don’t have provisions in place to scale up with the data volumes now being expected.”

That could make automotive Ethernet, which is considered too expensive for all but high-end luxury cars, an essential technology. “Deployment of Gigabit Ethernet in cars is already starting to get underway, and multi-Gigabit Ethernet will follow in the near future,” she said. “Only Ethernet PHYs and switches that support 100/1000Mbps or higher links will be used between LiDAR and ADAS modules to transfer the data. Since the data transmitted is mission-critical, these components will also need to meet Automotive Safety Integrity Level B or higher to ensure only ‘healthy data’ is getting through.”
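The scaling problem Yasay describes can be put in rough numbers. Taking the article's "over a million data points" per scan and adding an assumed point size and scan rate (both invented for illustration), one sensor's stream can be compared against the nominal capacity of each bus:

```python
# Rough comparison of one LiDAR's data stream against in-vehicle
# networks. Point size and scan rate are illustrative assumptions.
POINTS_PER_SCAN = 1_000_000     # "over a million data points" per scan
BYTES_PER_POINT = 16            # assumed per-point record size
SCANS_PER_SECOND = 10           # assumed scan rate

stream_bps = POINTS_PER_SCAN * BYTES_PER_POINT * 8 * SCANS_PER_SECOND

buses = {                        # nominal raw capacities in bits/s
    "CAN": 1e6,
    "FlexRay": 10e6,
    "Gigabit Ethernet": 1e9,
    "10G automotive Ethernet": 10e9,
}
for name, capacity in buses.items():
    verdict = "fits" if stream_bps <= capacity else "exceeds"
    print(f"{name}: {verdict} ({stream_bps / capacity:,.1f}x capacity)")
```

Under these assumptions a single sensor's raw stream is roughly 1.3 Gbps: three orders of magnitude beyond CAN, two beyond FlexRay, and enough to outrun even Gigabit Ethernet, which is why multi-gigabit links are expected to follow.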

The LiDAR market
LiDAR enhances a vehicle’s navigation capabilities by detecting and avoiding obstacles, providing clear-cut 3D snapshots of every object in the vehicle’s vicinity. Unlike other types of sensors or telematics currently available, LiDAR can detect objects even in the complete absence of light.

Markets & Markets forecasts the global automotive LiDAR sensor business will be worth more than $735 million by 2025, more than 10 times today's size, as automobiles migrate toward autonomous driving. Leading vendors in automotive LiDAR include Continental, Leddar Tech, Quanergy, Velodyne, Bosch, Novariant, Denso, Hella, First Sensor, Teledyne Optech and Valeo.

Still, where LiDAR ends and other technologies begin isn’t obvious, and the relationship between LiDAR and ECUs is hard to pin down because the two may appear far removed from each other.

“LiDARs are essentially sensors that gather data from the environment,” said Kumar Venkatramani, vice president of business development at Silexica. “ECUs, on the other hand, execute a specific function and respond to a specific command—for example, power-steering ECUs or brake ECUs. What should lie between the two is a control box of some sort, which reads the ‘data’ sent by the sensors, interprets the data, understands what is likely going on, and then sends specific actions/directives for the ECUs to execute. LiDARs are just one of the sensors that send data to the control box. As LiDAR technology improves, the ‘data’ sensed by the LiDAR becomes much more sophisticated. While a previous-generation LiDAR might specify that an object is at an angle of 20 degrees, 30 meters ahead, a more sophisticated LiDAR might specify that there is a pedestrian entering a crosswalk 31.257 meters from the left. However, the LiDAR itself will not dictate what action should be taken. All it can tell you, in the best case, is that there is a pedestrian in a crosswalk. Whether the car should slow down to let the pedestrian pass, or speed up to get out of the way, will require additional information, including but not restricted to weather conditions on the road, other vehicles in front or behind, the current speed of the vehicle, the condition of the brakes, other pedestrians, known speed limits on the road, how close the car is to the crosswalk, and the existence of flashing yellow lights, construction zones, or even traffic lights themselves.”

He said that expecting a single-function system to determine whether to apply the brakes hard, soft, immediately, or whether to accelerate or steer right or left is not reasonable. That will require a higher-order ECU.

In addition, the control box has to be able to correlate information from multiple sensors to make sense of the complete picture. For example, the LiDAR might provide input that a pedestrian is in a crosswalk. A camera might pick up a bouncing ball and a child about to retrieve the ball and head back to the curb in the same crosswalk. Correlating the two pieces of information to determine that the pedestrian in the crosswalk is the same child shown by the camera is left up to the control box.
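A minimal sketch of that correlation step, assuming both sensors report positions in a common vehicle frame; the gating threshold and all coordinates are invented for illustration:

```python
# Nearest-neighbor gating: treat a LiDAR detection and a camera
# detection as the same physical object if their estimated positions
# agree within a gate distance. All coordinates are invented examples.
import math

def same_object(lidar_xy, camera_xy, gate_m=1.5):
    """Return True when the two position estimates fall within
    gate_m meters of each other in the vehicle frame."""
    dx = lidar_xy[0] - camera_xy[0]
    dy = lidar_xy[1] - camera_xy[1]
    return math.hypot(dx, dy) <= gate_m

lidar_ped = (31.3, -4.0)     # LiDAR: pedestrian entering the crosswalk
camera_child = (30.9, -3.6)  # camera: child chasing the ball

print(same_object(lidar_ped, camera_child))  # the detections coincide
```

Production fusion systems use probabilistic association (for example, covariance-weighted distances and track histories) rather than a fixed gate, but the principle of matching detections across sensors is the same.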

“The smarts that have to be incorporated into a control box to take all of this information and formulate a cohesive set of actions requires a lot of computing horsepower,” Venkatramani said. “These computations have to understand the data, make sense of multiple data, and still decide the action in less than a second. The answer usually is not ‘throw as much horsepower as you can onto the control box,’ because that might mean the autonomous car will only go for five miles before it runs out of battery.”

Balancing all these parameters, including battery life, compute horsepower and the amount of time available to make a decision, is a multi-parameter optimization problem, and tools that can simulate possible scenarios and anticipate what needs to be done in how much time could help design these sophisticated systems better. “System OEMs and Tier-1 operators are building sophisticated algorithms to compute these results. Semiconductor companies are building hardware systems that can compute results fast. But the exercise of figuring out how well a specific algorithm will work with a specific set of hardware under specific conditions must be done by the system OEMs and Tier-1s, and the iterative loop time to ensure this all works is very time-consuming,” he added.
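That trade-off can be sketched as a simple feasibility filter over candidate compute platforms. The workload, deadline, power budget, and platform figures below are all invented for illustration, not measurements of real hardware.

```python
# Toy feasibility check for the latency/power/horsepower trade-off.
# Every number here is an invented, illustrative assumption.
WORK_PER_DECISION = 2e12   # assumed operations per fused decision
DEADLINE_S = 0.1           # decision budget, well under a second
POWER_BUDGET_W = 75        # assumed share of the vehicle's power

platforms = [              # (name, operations per second, watts)
    ("embedded DSP",  5e12,  15),
    ("mid-range SoC", 2e13,  40),
    ("GPU module",    1e14, 250),
]

for name, ops_per_s, watts in platforms:
    latency_s = WORK_PER_DECISION / ops_per_s
    feasible = latency_s <= DEADLINE_S and watts <= POWER_BUDGET_W
    print(f"{name}: {latency_s * 1e3:.0f} ms at {watts} W -> "
          f"{'feasible' if feasible else 'rejected'}")
```

The point of the sketch is Venkatramani's caution: the fastest platform is rejected on power, the most frugal on latency, and the answer is the smallest system that meets both constraints rather than maximum horsepower.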

It also makes it very difficult to figure out how to accelerate cost-cutting, which is an overriding theme behind making autonomous vehicles affordable as well as safe.

Conclusion
Accelerating commoditization will be required to bring autonomous vehicles to market in a way that is safe, affordable and reliable enough to make this technology shift viable.

This may prove easier said than done, however, particularly with an aggressive rollout schedule. Key pieces of technology are still in development, the supply chain is in flux, and the complexity of integrating all of these pieces is mind-boggling. And while the semiconductor industry is exemplary at squeezing every last fraction of a penny out of the ecosystem, it remains to be seen if it can accelerate and extend that whole process to include technology that doesn’t yet exist.

— Ed Sperling contributed to this report.

Related Stories
Radar Versus LiDAR
Racing to drive down sensor costs for the automotive market.
LiDAR Completes Sensing Triumvirate
Technology will complement cameras and radar in autonomous vehicles.
Foundries Accelerate Auto Efforts
Push toward more electronics in cars turns what used to be a marginal business into a profitable one.
What Can Go Wrong In Automotive
Experts at the Table, part 3: Why power has become so important in car electronics; the challenges in making autonomous vehicles reliable enough; adding margin for safe modes of operation.