Challenges For New AI Processor Architectures


Investment money is flooding into the development of new AI processors for the data center, but the problems here are unique, the results are unpredictable, and the competition has deep pockets and very sticky products. The biggest issue may be insufficient data about the end market. When designing a new AI processor, every design team has to answer one fundamental question — how much flex... » read more

Challenges Of Edge AI Inference


Bringing convolutional neural networks (CNNs) to your industry—whether it be medical imaging, robotics, or some other vision application entirely—has the potential to enable new functionalities and reduce the compute requirements for existing workloads. This is because a single CNN can replace more computationally expensive image processing, denoising, and object detection algorithms. Howev... » read more
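The "single CNN replaces several hand-tuned algorithms" claim rests on one operation: the 2D convolution, whose kernel weights are learned rather than designed by hand. A minimal plain-Python sketch (illustrative only, not any vendor's implementation; `BLUR` and `EDGE` are classic hand-tuned kernels, shown here as the kind of filters a learned layer can subsume):

```python
def conv2d(image, kernel):
    # Valid-mode 2D convolution over nested lists:
    # the core operation every CNN layer is built from.
    kh, kw = len(kernel), len(kernel[0])
    ih, iw = len(image), len(image[0])
    out = []
    for y in range(ih - kh + 1):
        row = []
        for x in range(iw - kw + 1):
            row.append(sum(image[y + dy][x + dx] * kernel[dy][dx]
                           for dy in range(kh) for dx in range(kw)))
        out.append(row)
    return out

# Classic hand-designed kernels a trained CNN layer can stand in for:
BLUR = [[1 / 9] * 3 for _ in range(3)]            # denoising (box blur)
EDGE = [[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]]  # edge/feature detection
```

In a CNN the kernel values come from training, so one network can learn denoising-like and detection-like filters in its early layers instead of running separate dedicated algorithms.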

There’s More To Machine Learning Than CNNs


Neural networks – and convolutional neural networks (CNNs) in particular – have received an abundance of attention over the last few years, but they're not the only useful machine-learning structures. There are numerous other ways for machines to learn how to solve problems, leaving room for alternatives. “Neural networks can do all this really comple... » read more

Making Lidar More Useful


Lidar, one of a trio of “vision” technologies slated for cars of the future, is improving in both form and function. Willard Tu, director of automotive at Xilinx, talks with Semiconductor Engineering about different approaches and tradeoffs between cost, compute intensity and resolution, various range and field of view options, and why convolutional neural networks are so important... » read more

Edge-Inference Architectures Proliferate


Part one of two. The second part will dive into basic architectural characteristics. The last year has seen a vast array of announcements of new machine-learning (ML) architectures for edge inference. Unburdened by the need to support training, but tasked with low latency, the devices exhibit extremely varied approaches to ML inference. “Architecture is changing both in the comp... » read more

Fast, Low-Power Inferencing


Power and performance are often thought of as opposing goals, two sides of the same coin if you will. A system can be run really fast, but it will burn a lot of power. Ease up on the accelerator and power consumption goes down, but so does performance. Optimizing for both power and performance is challenging. Inferencing algorithms for convolutional neural networks (CNNs) are compute int... » read more
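The tradeoff described above has a well-known first-order form: dynamic CMOS power scales roughly with C·V²·f, while runtime scales with 1/f, so energy per inference depends on voltage, not frequency alone. A rough sketch with purely illustrative numbers (no specific chip is modeled; `cap_f` and the cycle count are made-up parameters):

```python
def energy_per_inference(freq_hz, voltage_v, cap_f, cycles_per_inference):
    # First-order CMOS model: dynamic power ~ C * V^2 * f, and the time
    # for a fixed workload ~ cycles / f. Energy = power * time, so the
    # frequency terms cancel and energy per inference tracks V^2.
    power_w = cap_f * voltage_v ** 2 * freq_hz
    time_s = cycles_per_inference / freq_hz
    return power_w * time_s  # joules per inference

# Slowing the clock alone saves power but not energy per inference;
# lowering the voltage along with it (DVFS-style) saves both:
e_fast = energy_per_inference(1e9, 1.0, 1e-9, 1e6)   # 1 GHz at 1.0 V
e_slow = energy_per_inference(5e8, 0.8, 1e-9, 1e6)   # 500 MHz at 0.8 V
```

This is why "ease up on the accelerator" cuts power but not necessarily energy, and why real power/performance optimization involves voltage, architecture, and workload together.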

The Growing Market For Specialized Artificial Intelligence IP In SoCs


Over the past decade, designers have developed silicon technologies that run advanced deep learning mathematics fast enough to explore and implement artificial intelligence (AI) applications such as object identification, voice and facial recognition, and more. Machine vision, which is now often more accurate than a human, is one of the key functions driving new system-on-chip (S... » read more

Spiking Neural Networks: Research Projects or Commercial Products?


Spiking neural networks (SNNs) often are touted as a way to get close to the power efficiency of the brain, but there is widespread confusion about what exactly that means. In fact, there is disagreement about how the brain actually works. Some SNN implementations are less brain-like than others. Depending on whom you talk to, SNNs are either a long way away or close to commercialization. Th... » read more
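Whatever one believes about how brain-like SNNs are, most implementations share a common building block: a neuron that integrates input over time, leaks, and emits a discrete spike on crossing a threshold. A minimal leaky integrate-and-fire sketch (a textbook model, not any particular research project's or vendor's neuron; all constants are illustrative):

```python
def lif_step(v, input_current, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    # One Euler step of a leaky integrate-and-fire neuron.
    # Membrane voltage leaks toward zero and integrates input current;
    # crossing the threshold emits a spike and resets the voltage.
    v = v + dt * (-v / tau + input_current)
    if v >= v_thresh:
        return v_reset, True
    return v, False

# Drive the neuron with a constant suprathreshold current
# and record the spike times.
v, spike_times = 0.0, []
for t in range(100):
    v, fired = lif_step(v, 0.08)
    if fired:
        spike_times.append(t)
```

The appeal for power efficiency is visible even here: between spikes the neuron is just a local state update, and communication (the spike) is a single event rather than a continuous activation value.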

The MCU Dilemma


The humble microcontroller is getting squeezed on all sides. While most of the semiconductor industry has been able to take advantage of Moore's Law, the MCU market has faltered because flash memory does not scale beyond 40nm. At the same time, new capabilities such as voice activation and richer sensor networks are requiring inference engines to be integrated for some markets. In others, re... » read more

Defining And Improving AI Performance


Many companies are developing AI chips, both for training and for inference. Although getting the required functionality is important, many solutions will be judged by their performance characteristics. Performance can be measured in different ways, such as number of inferences per second or per watt. These figures depend on many factors, not just the hardware architecture. The optim... » read more
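The two figures of merit named above are simple ratios, which is exactly why they are easy to quote and easy to game. A sketch of how they are derived from raw measurements (the numbers in the example are hypothetical; real results shift with model, batch size, precision, and compiler, not just silicon):

```python
def inference_metrics(num_inferences, elapsed_s, avg_power_w):
    # Two common accelerator figures of merit:
    # throughput (inferences per second) and efficiency (IPS per watt).
    ips = num_inferences / elapsed_s
    ips_per_watt = ips / avg_power_w
    return ips, ips_per_watt

# e.g. 12,000 inferences in 2 s at an average draw of 15 W:
ips, eff = inference_metrics(12_000, 2.0, 15.0)  # 6000 IPS, 400 IPS/W
```

Note that batch size, model choice, and whether power is measured at the chip or the wall can each move these numbers substantially, which is why headline figures from different vendors are rarely directly comparable.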
