Tools utilize reinforcement learning for a variety of applications, including developing AI chips.
Machine learning is increasingly being deployed across a wide swath of chips and electronics in automobiles, both for improving reliability of standard parts and for the creation of extremely complex AI chips used in increasingly autonomous applications.
On the design side, many of today's EDA tools rely on reinforcement learning, a subset of machine learning in which a tool learns to perform a specific task by trial and error, guided by a reward signal. Unlike the image-recognition models in AI chips, which are trained on massive data sets, this approach can produce accurate results quickly using much smaller volumes of data. Synopsys, Cadence, Siemens, and others all have embraced reinforcement learning in their tools, and their automotive customers point to improved time to market for chips that offer better performance and meet stringent safety goals.
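The reward-feedback loop at the heart of these flows can be illustrated with a toy placement problem. The sketch below is not any vendor's algorithm; it treats wirelength reduction as the reward, keeps rewarding moves, and occasionally accepts a worsening one to keep exploring (production tools use far richer state and learned policies):

```python
import random

# Toy stand-in for reward-driven placement optimization: four blocks
# ("a".."d") on a 4x4 grid, connected in a ring of nets. The reward for
# a move is the wirelength reduction it produces.
GRID = [(x, y) for x in range(4) for y in range(4)]
NETS = [("a", "b"), ("b", "c"), ("c", "d"), ("a", "d")]

def wirelength(p):
    # Total Manhattan wirelength over all nets.
    return sum(abs(p[u][0] - p[v][0]) + abs(p[u][1] - p[v][1])
               for u, v in NETS)

def optimize(steps=2000, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    p = dict(zip("abcd", rng.sample(GRID, 4)))   # distinct start cells
    cur = wirelength(p)
    best_p, best_len = dict(p), cur
    for _ in range(steps):
        block = rng.choice("abcd")
        free = [c for c in GRID if c not in p.values()]
        old = p[block]
        p[block] = rng.choice(free)              # propose a move
        new = wirelength(p)
        if new < cur or rng.random() < epsilon:  # positive reward, or explore
            cur = new
        else:
            p[block] = old                       # undo the unrewarding move
        if cur < best_len:
            best_p, best_len = dict(p), cur
    return best_p, best_len
```

Because distinct cells force every net to at least unit length, the best achievable total here is 4, and the loop converges toward it within a couple of thousand proposals.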
“Verification of these complex chips is also critical for ensuring safety and functionality, so today’s most advanced tools utilize AI/ML to automate the discovery of coverage holes in testing, assuring that elusive gaps (and, by proxy, elusive bugs) are detected that, if undetected, could be catastrophic in the field,” said Thomas Andersen, vice president for AI and machine learning at Synopsys. He noted that in addition to meeting the power, performance, and area requirements, machine learning can help determine partitioning and spacing requirements for redundant on-chip functionality.
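Coverage-hole discovery reduces, at its simplest, to set arithmetic over coverage bins plus a scorer that steers new stimulus toward the holes. A minimal sketch, assuming a hypothetical cross-coverage model of opcode x operand width (commercial tools replace the greedy score with learned models):

```python
from itertools import product

# Illustrative cross-coverage model: opcode x operand width.
# Each regression test reports the bins it exercised; a coverage
# hole is any bin that no test hit.
OPCODES = ["add", "sub", "mul", "div"]
WIDTHS = [8, 16, 32, 64]
ALL_BINS = set(product(OPCODES, WIDTHS))

def coverage_holes(test_hits):
    hit = set()
    for bins in test_hits.values():
        hit |= set(bins)
    return ALL_BINS - hit

def rank_candidates(candidates, holes):
    # Greedy baseline: prefer the candidate stimulus predicted to close
    # the most open bins. ML-driven flows learn this score instead.
    return sorted(candidates, key=lambda bins: len(set(bins) & holes),
                  reverse=True)

hits = {"t1": [("add", 8), ("add", 16)], "t2": [("mul", 32)]}
holes = coverage_holes(hits)          # 13 of 16 bins remain open
ranked = rank_candidates([[("sub", 8)], [("div", 8), ("div", 16)]], holes)
```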
Today, both the amount of machine learning used in making chips for vehicles and the use of AI within those vehicles are growing. How quickly that will change is a matter of conjecture, but the trends are clear.
“There’s an answer for today, and there’s an answer for in a couple of years, and they are very different,” said David Fritz, vice president of hybrid-physical and virtual systems automotive and mil-aero at Siemens Digital Industries Software. “We start with system exploration, and there are tools to do just that. They allow for exploration of this complex space, and what the system looks like. For example, what do the ECUs look like? Where should the software be running? In the past it was pretty easy to just say, ‘Here’s a MATLAB model that models the functionality,’ push the button, generate some C code, and you’re done. The problem is that whole automated process is gone. It just does not work because the compute requirements are so off the charts because of the complexity of the car.”
The quintessential model, now quickly growing outdated, is the V diagram. “It used to be that you could just decompose this down to the minimum units, generate that little bit of C code from the model, put it all together, and lo and behold, it works,” Fritz said. “The problem is when a certain level of complexity is reached, it doesn’t work anymore, because when you put all the puzzle pieces back together you come to find out you’re missing two of the four corners and several pieces in the middle are missing. What’s happening today is the process of taking requirements and essentially identifying what a potential system architecture would look like, then simulating it and measuring it. Then, taking those measurements, comparing them against the requirements, and iterating ad nauseam.”
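The propose-simulate-measure-compare loop Fritz describes can be sketched in a few lines. Everything below is hypothetical: the cost model stands in for a real system simulator, and the requirement budgets are invented numbers:

```python
# Iterative architecture exploration: propose a candidate, "simulate"
# it (a toy cost model stands in for the real simulator), measure,
# compare against requirements, and refine until every budget is met.
REQUIREMENTS = {"latency_ms": 10.0, "power_w": 5.0}   # invented budgets

def simulate(arch):
    # Hypothetical cost model: more cores cut latency but raise power.
    return {"latency_ms": 40.0 / arch["cores"],
            "power_w": 1.2 * arch["cores"]}

def meets(metrics):
    return all(metrics[k] <= budget for k, budget in REQUIREMENTS.items())

def explore(max_iters=8):
    arch = {"cores": 1}
    for _ in range(max_iters):
        m = simulate(arch)
        if meets(m):
            return arch, m
        if m["latency_ms"] > REQUIREMENTS["latency_ms"]:
            arch["cores"] += 1        # iterate on the dimension that missed
        else:
            break                     # latency met but power blown: rethink
    return arch, simulate(arch)
```

With these numbers the loop settles on a four-core configuration, which just satisfies both budgets; in practice each "iterate" step is a full re-simulation of the candidate system.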
On the tool side, the focus may be on machine learning, but in the vehicles the full AI ultimately will be required to handle increasingly autonomous functionality. “EDA vendors have found new applications for [ML] to optimize, automate, and speed up SoC development flows,” said Thierry Kouthon, technical product manager for Security IP at Rambus. “Chip design is becoming increasingly complex due to increased density at lower geometries, low latency requirements, hyperscale design with billions of gates, and time-to-market pressure. As a result, it requires sophisticated expertise that may become increasingly scarce, and substantial investments will be needed to satisfy aggressive schedules.”
EDA vendors have deployed reinforcement learning to improve how SoCs are designed and manufactured. “It is used today in several areas of the SoC design flow, such as logic synthesis, verification, placement, routing, 3D integration, and design for test,” Kouthon said, noting that EDA companies are promoting reinforcement learning to accelerate and improve the quality of ASIC design flows, as well as to reduce the amount of resources involved in a semiconductor project.
Over the past several months, nearly all of the big EDA vendors have embraced some level of machine learning, and they are pushing further into the AI world as more relevant data is collected. “While there is still a large opportunity in front of us, and a large amount of innovation that’s possible, it’s just starting to get locked in,” said Rob Knoth, product management group director in the Digital & Signoff Group at Cadence. “You just have to open up the newspaper or read through some reports. If you look at what Tesla does with its Dojo supercomputer, you see how that overall system is influenced by, catered to, and designed not just by AI, but for AI. The chip is incredibly important, but the chip is only important as far as the context of the car or the context of the data center. And the massive amounts of data that are required to perform automotive functions to help improve quality and reliability, to address functional safety concerns — all these things are dancing together. It’s not one piece in isolation, and that’s the true beauty and potential that is only starting to get unlocked.”
AI/ML can play a critical role in numerous parts of the automotive design process.
“On one side, you can talk about automotive-specific things, but on another, you can step back and say, ‘These incredibly intelligent edge AI-enabled devices for a car are very similar to many other types of advanced semiconductors that are out there,'” Knoth said. “AI is no longer a future topic at our user conference, [it’s a current topic]. It is being used aggressively by all industries to do their daily work because it’s allowing the engineer to spend more time doing what engineers are uniquely and beautifully suited for — to look at the intention, look at the guiding things, exploring things, worrying more about the functions, as opposed to the day-to-day implementation. And AI is allowing a bigger proliferation of more complex, more differentiated bespoke silicon.”
Others agree. “From a chip architecture perspective, modern automotive chips are heavily targeted toward automation of driving aspects and safety features,” said Synopsys’ Andersen. “As such, they have essentially become in-car AI chips which implement complex CNN functions to analyze images, video and scenes and take the proper action for accident avoidance or towards full self-driving automation.”
In automotive specifically, the work is centered around quality, reliability, and safety.
“Quality is all about lowering your defective parts coming out of manufacturing (DPPM),” he said. “Reliability is all about making sure that you can stay functional over the lifespan of the product. Functional safety is basically saying, ‘If something breaks, let’s make sure that we cause no harm.’ Those all intently relate to automotive, and in each one of those AI is either actively being used to help meet those goals, or there is the potential based upon how they’ve been met in the past to have AI applied to them to either make test smarter, do better aging analysis, or be more clever, efficient, and effective about the safety mechanisms you put in.”
AI/ML fits into a number of applications and tools for automotive, and they often support each other. “Our techniques for the development play a big role because we have been wondering how to verify this AI,” said Frank Schirrmeister, vice president of solutions and business development at Arteris IP. “Is there a structural verification for the CNN, DNN, or whatever is used? Does it functionally do multiplications correctly? Still, once you have trained it, there are really very few ways to verify that you won’t have any outliers. This is why guardrails are needed, and that’s where systemic development challenges come in. You have your AI helping with vision in the car and recognizing things, but then you still have to guardrail it, and from a development perspective figure out the proper ways of graceful degradation. If it’s something that doesn’t make sense, how do you then gracefully fail or gracefully stop the car without killing anybody? That’s part of the development process considerations at a systemic level — how to guardrail the AI.”
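A guardrail in this sense is deterministic logic wrapped around a learned component. The sketch below is illustrative only (the detection fields and thresholds are invented): it sanity-checks a hypothetical perception output before the planner may act on it, and routes anything implausible to a graceful-degradation path instead:

```python
# Guardrailing an AI perception output: deterministic plausibility
# checks gate the learned detector's result. The detection format and
# limits here are invented for illustration.
def guardrail(detection, max_speed_mps=70.0):
    """Return ('act', detection) or ('degrade', reason)."""
    if detection is None:
        return ("degrade", "no detection: slow down, request handover")
    if not (0.0 <= detection["confidence"] <= 1.0):
        return ("degrade", "malformed confidence value")
    if abs(detection["speed_mps"]) > max_speed_mps:
        return ("degrade", "physically implausible object speed")
    if detection["confidence"] < 0.5:
        return ("degrade", "low confidence: reduce speed, hold lane")
    return ("act", detection)
```

The key design point is that every degrade branch maps to a safe behavior (slow, hold lane, hand over) rather than an abrupt stop, which is the "gracefully fail" requirement Schirrmeister describes.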
Increasing AI considerations
All of these considerations are increasingly important as more AI/ML is added into vehicles, and used to create the chips in those vehicles.
Ron DiGiuseppe, senior marketing manager, automotive IP at Synopsys, sees the use of AI on a continuum. “There’s the mobility spectrum, ADAS Level 2, Level 2+, Level 3, Level 4, Level 5 to full self-driving. In that ADAS category of Level 2 ADAS automation — which are the applications that are deployed in cars now, including adaptive cruise control — that’s an AI application just for ADAS. It’s not self-driving. Another example is automatic emergency braking. When you’re driving down the street and a dog runs across, object detection would initiate an automatic emergency braking application. Those are deployed in cars now, many of which use vision-based AI. The best example is Mobileye, probably the most widely used in these AI applications in ADAS. Independent of self-driving, AI is being adopted in other applications within the car, but they’re still in development now. Not much is deployed in production.”
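The decision layer behind an automatic emergency braking feature like the one DiGiuseppe describes can be reduced to a textbook kinematics check: brake if a detected object in the vehicle's path is closer than the stopping distance. The reaction time and deceleration figures below are illustrative placeholders:

```python
# AEB decision sketch: the perception stack (not shown) supplies the
# object's distance and whether it is in the vehicle's path; this layer
# decides whether to trigger braking. Parameters are illustrative.
def stopping_distance_m(speed_mps, reaction_s=1.0, decel_mps2=7.0):
    # Reaction-time travel plus braking distance v^2 / (2a).
    return speed_mps * reaction_s + speed_mps ** 2 / (2.0 * decel_mps2)

def should_brake(speed_mps, object_distance_m, object_in_path):
    return object_in_path and object_distance_m <= stopping_distance_m(speed_mps)
```

At 20 m/s (about 45 mph) the stopping distance works out to roughly 49 m, so a dog detected 30 m ahead triggers braking while one 60 m ahead does not.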
Going forward, DiGiuseppe said AI will be used for other applications in the car, such as infotainment and driver monitoring systems, like gaze detection, to see where the driver is looking. “Also, in the powertrain for electric vehicles, the DC-to-DC converter is a good example of how AI could optimize that application for the battery management system. Usually, it’s an algorithm to optimize the charging/discharging of the battery pack. That’s another unusual adoption of AI, but the point is that it’s going to many different applications within the car.”
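The baseline that such an AI optimizer would tune is a conventional charge-rate schedule. A minimal sketch, assuming a simple constant-current/taper (CC-CV-style) profile with invented numbers; a learned controller would adjust the knee and taper against temperature and degradation data:

```python
# Baseline charge-rate schedule for a battery pack: full current up to
# a state-of-charge knee, then a linear taper to zero. The knee point
# and current limit are illustrative values.
def charge_current_a(soc, max_current_a=100.0, knee_soc=0.8):
    if soc >= 1.0:
        return 0.0                       # fully charged
    if soc < knee_soc:
        return max_current_a             # constant-current phase
    # taper linearly to zero across the final phase
    return max_current_a * (1.0 - soc) / (1.0 - knee_soc)
```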
Electric motor sensor reduction is yet another application for AI. “Instead of actually using hardware sensors in the powertrain with an electric motor, you can use AI predictive analytics to perform that function,” DiGiuseppe added.
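A virtual sensor of this kind is a predictive model fitted to calibration data from a physical sensor, which can then replace it. The sketch below uses a least-squares line over synthetic current/temperature pairs purely for illustration; a production virtual sensor would use a richer, nonlinear learned model:

```python
# Virtual sensor sketch: predict motor winding temperature from phase
# current using a least-squares fit to (synthetic) calibration data,
# instead of reading a dedicated temperature sensor.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

currents = [10.0, 20.0, 30.0, 40.0]   # A (illustrative calibration data)
temps = [40.0, 60.0, 80.0, 100.0]     # degC

slope, intercept = fit_line(currents, temps)

def predict_temp(current_a):
    return slope * current_a + intercept
```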
How these AI algorithms ultimately will be deployed will change over time. The two key elements there are flexibility and size. “If they are relatively small models, eFPGA is a feasible approach,” noted Geoff Tate, CEO of Flex Logix. This may be particularly useful for designers looking to maximize PPA. eFPGAs are highly efficient for large vision models with megapixel images and hundreds of layers, like YOLOv5L6, Tate explained. They also can be reprogrammed in the field to take advantage of algorithm changes.
Current AI implementation
Building all of this into the design flow requires very early conceptual planning. “We call this ‘Left of Left,'” said Fritz. “It’s what needs to happen before you Shift Left, and is where complications come in. Now I’m making decisions, so I have to have a methodology that not only can look forward into the implementation and collect these metrics so I can see if I am going to meet my requirements. I also have to be able to take that design and pass it off to the actual implementers, whether they’re internal, at a supplier, or wherever they are. How do I do that across 100 different companies, most of whom have no idea what we’re talking about or how their piece fits into the bigger picture? How does IP protection come into all that? It’s a big challenge.”
Fritz believes this is one of the major reasons why Level 4 and Level 5 autonomy have been so delayed. “It isn’t necessarily because of the AI inside the vehicle, although certainly that’s a challenge. And it’s not about functional safety. It’s not about all those things anymore. We’re getting a good handle on that. But how do you make that work in a real world with all these different suppliers doing their own pieces? They’re all used to doing things their own way, and only focusing on their piece of this very large puzzle. Now they have to show that their piece of the puzzle is going to work with all the other pieces around it, none of which can you actually see. In any one of the big automotive groups, they may have several brands, and each of those brands tends to have their own way of doing things. So even within companies you control it’s extremely difficult to get some sort of consensus about how things need to be moving forward.”
Once that development process is in place, and done manually, the next generation is applying AI to that process. “The AI gets trained, and it’s going to have hundreds of different inputs,” Fritz said. “An input could be something as simple as, ‘Bandwidth never exceeds 60%’ or, ‘The inputs are going to be indirectly related to the types of requirements that go into the system.’ Once you have a high-level model that can run, create a digital twin, and then provide those metrics back, then the AI that sits over that looks at it and says, ‘Did I make things better or worse?’ It is trained over time to say, I realize that if I organize my structure in this way, or if I use, let’s say, the latest greatest Arm CPU and I can run that at 1 gigahertz, I can do more processing on this node of the system, which lowers my bandwidth. And now I can get away with 5 gigabit Automotive Ethernet, and can meet a cost requirement or weight requirement or a range requirement. The AI then will eventually take over for these system architects because there are too many dimensions, too many variables for any human to actually be able to figure this out.”
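The loop Fritz describes (propose a configuration, let the digital twin report metrics, ask "did I make things better or worse?") can be sketched as a search over a discrete configuration space. All component names, costs, and loads below are invented for illustration; a trained optimizer would replace the random proposals with a learned policy:

```python
import random

# Configuration search against a digital-twin cost model. All figures
# are illustrative: (clock GHz, cost) for CPUs, (Gb/s, cost) for links.
CPUS = {"cpu_1ghz": (1.0, 10), "cpu_2ghz": (2.0, 12)}
LINKS = {"eth_1g": (1.0, 5), "eth_5g": (5.0, 15)}

def twin_metrics(cfg):
    ghz, cpu_cost = CPUS[cfg["cpu"]]
    gbps, link_cost = LINKS[cfg["link"]]
    # Faster CPU does more processing on the node, lowering link load.
    link_load = 2.0 / ghz
    return {"feasible": link_load <= gbps, "cost": cpu_cost + link_cost}

def score(m):
    # Lower is better; an infeasible system is worst of all.
    return m["cost"] if m["feasible"] else float("inf")

def search(iters=50, seed=1):
    rng = random.Random(seed)
    best_cfg = {"cpu": "cpu_1ghz", "link": "eth_5g"}   # feasible start
    best = score(twin_metrics(best_cfg))
    for _ in range(iters):
        trial = {"cpu": rng.choice(list(CPUS)),
                 "link": rng.choice(list(LINKS))}
        s = score(twin_metrics(trial))
        if s < best:       # "did I make things better or worse?"
            best_cfg, best = trial, s
    return best_cfg, best
```

Here the search discovers that spending more on the faster CPU pays for itself by allowing the cheaper 1Gb/s link, exactly the kind of cross-dimensional trade-off Fritz argues eventually exceeds what human architects can enumerate.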
Conclusion
Over time, as system understanding improves to the point where there is a standard, off-the-shelf machine-learning problem that can identify the inputs and get results — and as design teams understand how to compare those results and adjust the design — then it will become more pervasive. “Image recognition is the same as every other problem,” Fritz said. “Once we have that, it will be the training of AI that will be unique for each brand and each OEM, and that will be the golden jewels. Nobody ever is going to touch that. Nobody is going to mess with it unless it’s run through incredibly complicated regression testing to make sure they haven’t somehow broken something. That’s the future.”
Still, the fundamentals should not be forgotten, Cadence’s Knoth said. “AI is here, it’s not in the future. Important, too, is the fact that you can’t forget the fundamentals. You’ve got to practice your lay-up shots. You’ve got to practice your free throws. You’ve got to be solid in all of your fundamentals, or it doesn’t matter how cool the new toy is. This is doubly important in the automotive industry. AI can help you do a lot of things, but if you’re not paying attention to the quality, reliability, and safety of your part, if you’re not paying attention to your fundamental design and signoff methodologies, it’s not going to be a success.”
When it comes down to it, AI/ML offers many techniques to use for optimization purposes, such as reinforcement learning, k-means clustering, convolutional neural networks, generative adversarial networks, and more, Rambus’ Kouthon noted. “The application of all these techniques to various stages of the IC design and manufacturing flow is an active area of research that promises benefit to topics such as yield optimization or design for test and verification.”