Knowledge Center

Artificial Intelligence (AI)

Using machines to make decisions based upon stored knowledge and sensory input.

Description

Artificial intelligence is a concept in computer science in which machines “think” for themselves. No one knows exactly where or when the concept originated, but formal research into the idea generally is attributed to a 1956 project at Dartmouth College.

Artificial intelligence gained wider public attention following Arthur C. Clarke’s 1968 science fiction novel 2001: A Space Odyssey, in which a computer named HAL (each letter one off from the IBM acronym) attempts to take over a spaceship.

By the mid-1980s, artificial intelligence was considered the next big thing in computing. Every major computer maker had an AI research project, most of which were abandoned in the early 1990s due to insufficient processor speed, the high cost of memory, and slow networking speeds.

Moore’s Law and vast improvements in data movement have eliminated those bottlenecks, and AI has returned, both as a general category for development and in various subsets such as deep learning and machine learning. An artificially intelligent machine utilizes machine-learning algorithms to make choices based upon previous experience and data. The terms are often confusing, in part because they are blanket terms that cover a lot of ground, and in part because the terminology is evolving with the technology. But however the definitions shift, machine learning remains central to both AI and deep learning.
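The idea of a machine choosing based on stored experience and new input can be sketched with a toy example. Below is a minimal nearest-neighbor classifier in Python; the weather data and labels are invented purely for illustration, not drawn from any real system.

```python
# Toy machine-learning decision: a 1-nearest-neighbor classifier picks a
# label for new input by comparing it against stored past examples.

def nearest_neighbor(examples, query):
    """Return the label of the stored example closest to `query`.

    examples: list of (feature_vector, label) pairs -- "previous experience"
    query:    a feature vector -- the new "sensory input"
    """
    def squared_distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    closest = min(examples, key=lambda ex: squared_distance(ex[0], query))
    return closest[1]

# Stored experience: (temperature, humidity) readings labeled by outcome.
experience = [
    ((30.0, 0.80), "rain"),
    ((25.0, 0.30), "clear"),
    ((28.0, 0.75), "rain"),
    ((22.0, 0.20), "clear"),
]

# The new reading sits nearest the (28.0, 0.75) example, so "rain" is chosen.
print(nearest_neighbor(experience, (27.0, 0.70)))
```

The point of the sketch is that the decision is never hand-coded: it falls out of the stored data, which is what distinguishes a learning system from a fixed rule set.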


Multimedia

Improving AI Productivity With AI
Application-Optimized Processors
Configuring AI Chips
Monitoring Heat On AI Chips
Why training data is so susceptible to hacking
AI Training Chips
Energy-Efficient AI
AI, ML Chip Choices
Building AI SoCs
Using ML
Connected Intelligence