The Multiple Faces And Phases Of AI

Confusion grows as AI finds its way into more applications.


AI is being used in more devices, and in more ways within those devices, raising the level of confusion about exactly what people mean when they refer to AI and AI-enabled systems.

AI is both a tool and a process. It also is a thing, although nothing even remotely close to HAL 9000, the sentient computer in Arthur C. Clarke's 2001: A Space Odyssey. And as it proliferates, it's becoming harder to distinguish one from the other. There also are subsets of AI, notably machine learning and deep learning, and literally dozens of neural network variants that enable AI to work.

On the tools side, AI is being added to everything from verification to manufacturing. It provides a way of establishing, from huge amounts of input data, what counts as an acceptable Gaussian distribution of results. This is a huge efficiency improvement in a foundry, where the rule deck may be 1,000 pages or more, depending upon the process node. Being able to access data from billions of chips, and to determine which outliers in that data resulted in faulty chips or yield issues, can provide massive cost and time benefits to both the foundry and the chip designers. And being able to spot design errors prior to signoff can have a big impact on time and NRE costs, improving the likelihood of achieving first-time silicon with fewer post-manufacturing problems.
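The core idea, stripped of everything a real rule deck adds, is simple: learn a baseline distribution from historical test data, then flag new measurements that fall outside an acceptable band. A minimal sketch, where the baseline mean, standard deviation, and 3-sigma limit are all illustrative assumptions:

```python
def flag_outliers(values, mean, stdev, sigma_limit=3.0):
    """Return the measurements outside an acceptable band around the mean.

    The band is defined by a hypothetical sigma_limit; real foundry
    pass/fail criteria are far more nuanced than a single threshold.
    """
    band = sigma_limit * stdev
    return [v for v in values if abs(v - mean) > band]

# Baseline distribution learned from historical chip-test data (assumed values).
baseline_mean, baseline_stdev = 1.00, 0.02

# New test readings: one is far outside the expected distribution.
readings = [1.02, 0.98, 1.01, 0.99, 2.50]
print(flag_outliers(readings, baseline_mean, baseline_stdev))  # [2.5]
```

In practice the "baseline" is not a single mean and sigma but a model trained across billions of die, which is precisely why this kind of analysis benefits from machine learning rather than fixed thresholds.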

AI also is a process. It requires weighting of training data, and using that "trained" or clean data to draw inferences within an acceptable distribution of behavior. Within that process there are sub-processes, as well, to improve the value of that data. So there is pruning to reduce the size of the data set, quantization to get more out of each bit, and acceleration to improve the speed at which the pruned and quantized data can be utilized. As more data is collected, it is added back into that training data to increase the accuracy of the inferencing.
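Pruning and quantization can be sketched in a few lines. The thresholds and the 8-bit scale below are illustrative assumptions, not values from any particular framework:

```python
def prune(weights, threshold=0.05):
    """Zero out small-magnitude weights to shrink the effective model."""
    return [w if abs(w) >= threshold else 0.0 for w in weights]

def quantize(weights, scale=127):
    """Map float weights to signed 8-bit integers (symmetric quantization)."""
    max_w = max(abs(w) for w in weights) or 1.0
    return [round(w / max_w * scale) for w in weights]

weights = [0.80, -0.03, 0.40, 0.01, -0.55]
pruned = prune(weights)   # [0.80, 0.0, 0.40, 0.0, -0.55]
print(quantize(pruned))   # [127, 0, 64, 0, -87]
```

Real pipelines prune whole channels or layers and calibrate quantization against activation statistics, but the goal is the same: a smaller, faster representation that stays within an acceptable accuracy envelope.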

In addition, AI is a thing. IBM has Watson. Google has DeepMind. Microsoft, Facebook, Alibaba and Amazon have their own versions, which are primarily used behind the scenes. But in light of concerns about AI replacing people in a variety of jobs, including highly paid medical specialists, they have revised their marketing pitch. They are now calling AI a tool, which is significantly less threatening to people who spent 8 to 12 years in college. In this case, it’s probably more like a thing that can serve as a tool.

As AI infiltrates our lives in more ways, from natural language speech recognition to assisted and autonomous driving, and in little things we never see, such as improved battery performance and faster time to market, this will become even more of a blur. For any electronic device, AI may be part of the development process, part of the use model, and a significant part of the architecture. And it will likely communicate with other AI systems with increasing autonomy, morphing in ways that no one can even imagine at this point.

But at least we can be sure of one thing. AI is not a verb—yet.

