Artificial Intelligence: Let Us Get The Math Right First!

Data path verification with formal methods for AI designs.


Artificial intelligence is a hot topic these days, so there is no need to repeat its current and future potential uses here. For most people, AI means technology advancements on the software side. But ask people who are close to this technology domain and they will tell you that a significant part of the competitive edge lies in building your own optimized hardware chips.

A few days back, I was driving home and listening to NPR (National Public Radio), where a technology expert was explaining AI to her audience. As she put it, AI at a very high level is a collection of highly optimized algorithms working on a large volume of data. Translated to the hardware side, a good part of what is needed is hardware-implemented mathematical operations of the kind we typically see in traditional CPU designs, such as addition, subtraction, multiplication, and square root. Graphics chips have also been shown to be very powerful aids in accelerating AI algorithms (especially training), and these designs have well-known FPUs. Specialized machine learning processors, such as Google’s Tensor Processing Unit (TPU), excel at machine learning tasks and employ large arrays of arithmetic processing units, including matrix multiplication and fused multiply-add structures.

In the verification community it is a well-known fact that traditional simulation-based verification methods do not scale for these mathematical operations. For example, exhaustively verifying a 64-bit adder requires 2^128 test vectors (two 64-bit operands give 128 input bits), which is impossible to simulate even with the fastest logic simulator or with all the compute resources available on the planet.
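To put that number in perspective, here is a rough back-of-the-envelope sketch in C++. The 10^12 vectors-per-second throughput is an assumed, deliberately optimistic figure used only for the estimate, not a measured simulator number.

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // Two 64-bit adder operands -> 128 input bits -> 2^128 distinct input vectors.
    const long double vectors = std::ldexp(1.0L, 128);   // exactly 2^128

    // Assumed (very optimistic) aggregate simulation throughput: 1e12 vectors/second.
    const long double vectors_per_second = 1.0e12L;

    const long double seconds = vectors / vectors_per_second;
    const long double years   = seconds / (365.25L * 24.0L * 3600.0L);

    std::printf("input vectors: %.3Le\n", vectors);  // ~3.4e38
    std::printf("years to run : %.3Le\n", years);    // ~1e19 years
    return 0;
}
```

Even under that generous assumption, the run time comes out on the order of 10^19 years, which is why exhaustive simulation of data path arithmetic is a non-starter.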

Formal verification is known to be exhaustive, and from a formal verification methodology standpoint, mathematical logic falls under data path design. Traditional formal engines, however, are mostly fine-tuned for control logic rather than data path logic. So how do we address this verification problem?

For data path designs we usually have C/C++ reference models available for these mathematical functions, and the RTL logic can be compared against them. The C/C++ models are untimed, while the RTL carries cycle-accurate timing information, so functional equivalence has to be established at the transaction level.
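As an illustration of what such a reference model and a transaction-level comparison look like, here is a minimal C++ sketch. The model (a 32-bit multiply-accumulate) and the names fma_ref and dut_fma are hypothetical examples, not the API of any particular tool; in a real flow the design team's existing C/C++ golden model is used, and the formal tool proves the equivalence for all inputs rather than sampling a few.

```cpp
#include <cstdint>
#include <iostream>

// Untimed C++ reference model for a 32-bit integer multiply-accumulate:
// result = (a * b) + c, truncated to 32 bits. Hypothetical example model.
uint32_t fma_ref(uint32_t a, uint32_t b, uint32_t c) {
    return static_cast<uint32_t>(static_cast<uint64_t>(a) * b + c);
}

// Stand-in for the result the cycle-accurate RTL delivers once its pipeline
// completes the transaction (sampled at the design's output interface).
uint32_t dut_fma(uint32_t a, uint32_t b, uint32_t c) {
    return fma_ref(a, b, c);  // placeholder; the real value comes from the RTL
}

int main() {
    // Transaction-level equivalence: for every input transaction, the RTL's
    // eventual result must equal the untimed reference result, regardless of
    // how many cycles the RTL takes. A formal data path verification tool
    // proves this for all inputs; this loop only illustrates the comparison.
    const uint32_t samples[][3] = {{3, 5, 7}, {0xFFFFFFFFu, 2, 1}, {0, 0, 0}};
    for (const auto& s : samples) {
        const uint32_t expected = fma_ref(s[0], s[1], s[2]);
        const uint32_t actual   = dut_fma(s[0], s[1], s[2]);
        std::cout << "a=" << s[0] << " b=" << s[1] << " c=" << s[2]
                  << " ref=" << expected << " dut=" << actual
                  << (expected == actual ? "  MATCH" : "  MISMATCH") << '\n';
    }
    return 0;
}
```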

Today, formal solutions are available for data path verification that can prove transaction-level equivalence between C/C++ models and RTL designs. This is the most efficient way to address this specific verification problem.

To ensure we do not get ahead of ourselves, it is critical that we get the math right first before we start talking about AI.

To read more about data path verification, check out the press release for Synopsys VC Formal DPV, and learn how this new solution benefits complex processor chips in AI, automotive, or security applications that require arithmetic operations.


