
Easier And Faster Ways To Train AI


Training an AI model takes an extraordinary amount of effort and data. Leveraging existing training can save time and money, accelerating the release of new products that use the model. But there are a few ways this can be done, most notably through transfer learning and incremental learning, and each has its own applications and tradeoffs. Transfer learning and incremental learning both take pre... » read more
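The excerpt only names the two techniques, so as a rough illustration of the transfer-learning half: the sketch below reuses a backbone pretrained on one task and trains only a new classification head for another. It is a minimal PyTorch example, assuming a recent torchvision (0.13+) weights API; the class count, batch, and hyperparameters are placeholders, not details from the article.

```python
# Minimal transfer-learning sketch (illustrative; dataset loading omitted).
import torch
import torch.nn as nn
from torchvision import models

# Start from a network pretrained on ImageNet.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained feature extractor so only the new head trains.
for param in backbone.parameters():
    param.requires_grad = False

# Replace the final classifier with one sized for the new task
# (assumption: a hypothetical 10-class target task).
num_classes = 10
backbone.fc = nn.Linear(backbone.fc.in_features, num_classes)

optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, num_classes, (8,))
optimizer.zero_grad()
loss = loss_fn(backbone(images), labels)
loss.backward()
optimizer.step()
```

Incremental learning, by contrast, would keep updating the existing weights as new data arrives rather than freezing them; the tradeoff the article points to is between reuse and the risk of disturbing what the model already knows.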

There’s More To Machine Learning Than CNNs


Neural networks – and convolutional neural networks (CNNs) in particular – have received an abundance of attention over the last few years, but they're not the only useful machine-learning structures. There are numerous other ways for machines to learn how to solve problems, and there is room for alternative machine-learning structures. “Neural networks can do all this really comple... » read more
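As one concrete example of the kind of non-neural-network learner the article alludes to, here is a minimal scikit-learn sketch using a random forest, an ensemble of decision trees, on a toy dataset. The dataset and model choice are illustrative assumptions, not examples drawn from the article.

```python
# Machine learning without a neural network: a random forest classifier.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An ensemble of 100 decision trees, each voting on the class label.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```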

Developers Turn To Analog For Neural Nets


Machine-learning (ML) solutions are proliferating across a wide variety of industries, but the overwhelming majority of commercial implementations still rely on digital logic. With the exception of in-memory computing, analog approaches have mostly been restricted to universities and attempts at neuromorphic computing. However, that’s starting to change. “Everyon... » read more

Making Sense Of New Edge-Inference Architectures


New edge-inference machine-learning architectures have been arriving at an astounding rate over the last year. Making sense of them all is a challenge. To begin with, not all ML architectures are alike. One of the complicating factors in understanding the different machine-learning architectures is the nomenclature used to describe them. You’ll see terms like “sea-of-MACs,” “systolic... » read more
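To ground one of those terms: a systolic array streams operands through a grid of multiply-accumulate processing elements, with each row and column skewed by one cycle so that matching operands meet at the right cell. Below is a minimal, cycle-by-cycle NumPy simulation of an output-stationary systolic matrix multiply; it is a sketch of the general dataflow, not any particular vendor's architecture.

```python
import numpy as np

def systolic_matmul(A, B):
    """Cycle-by-cycle simulation of an output-stationary systolic array.

    Each processing element (PE) holds one output C[i, j]. Rows of A
    stream in from the left and columns of B from the top, each skewed
    by one cycle per row/column, so PE (i, j) sees the operand pair
    (A[i, s], B[s, j]) at cycle t = s + i + j.
    """
    n, k = A.shape
    k2, m = B.shape
    assert k == k2
    C = np.zeros((n, m))
    # Enough cycles for the last skewed operand to cross the array.
    for t in range(n + m + k):
        for i in range(n):
            for j in range(m):
                s = t - i - j
                if 0 <= s < k:
                    C[i, j] += A[i, s] * B[s, j]
    return C

A = np.arange(6).reshape(2, 3).astype(float)
B = np.arange(12).reshape(3, 4).astype(float)
assert np.allclose(systolic_matmul(A, B), A @ B)
```

A "sea of MACs," by contrast, is a less rigidly choreographed pool of multiply-accumulate units; much of the nomenclature confusion the article describes comes from how operands are scheduled onto such grids.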

Are Better Machine Training Approaches Ahead?


We live in a time of unparalleled use of machine learning (ML), but it relies on one approach to training the models that are implemented in artificial neural networks (ANNs) — so named because they’re not neuromorphic. But other training approaches, some of which are more biomimetic than others, are being developed. The big question remains whether any of them will become commercially viab... » read more
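The "one approach" in question is gradient-descent training via backpropagation. As a reference point for the alternatives the article surveys, here is a minimal NumPy sketch of that approach on a toy XOR problem; the network size, learning rate, and iteration count are arbitrary choices for illustration, not values from the article.

```python
import numpy as np

# Backpropagation in miniature: a one-hidden-layer network learning XOR.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the error gradient layer by layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(out.round(3))  # typically converges toward [0, 1, 1, 0]
```

Biological neurons do nothing like this global, layer-by-layer gradient flow, which is why the more biomimetic training schemes the article mentions depart from it.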

System Bits: Sept. 24


Quantum states
Many companies and academic researchers are working on quantum computing technology, including the University at Buffalo. New research on two-dimensional tungsten disulfide (WS2) could open the door to advances in quantum computing, UB reports. In a paper published Sept. 13 in Nature Communications, scientists report that they can manipulate the electronic properties of th... » read more

Optimizing Deep-Learning Inference For Embedded Devices


Deep artificial neural networks (ANNs) have emerged as universal feature extractors in various tasks as they approach (and in many cases surpass) human-level performance. They have become fundamental building blocks of almost every modern artificial intelligence (AI) application, from online shop recommendations to self-driving cars. This whitepaper highlights how different challenges relat... » read more
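One challenge such optimization work typically covers is fitting models into embedded memory and compute budgets, often via quantization. As a hedged illustration (not the whitepaper's specific method), here is a minimal NumPy sketch of symmetric post-training int8 weight quantization; real toolchains also calibrate activations and handle per-channel scales.

```python
import numpy as np

def quantize_int8(w):
    """Symmetric int8 quantization: map the largest-magnitude weight to 127."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

w = np.random.default_rng(0).normal(size=(64, 64)).astype(np.float32)
q, scale = quantize_int8(w)
err = np.abs(w - dequantize(q, scale)).max()
print(f"int8 storage: {q.nbytes} bytes vs float32: {w.nbytes} bytes, "
      f"max round-trip error: {err:.4f}")
```

The 4x storage reduction, and the ability to run the inner loops in integer arithmetic, is what makes this a staple of embedded inference.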

Integrating Memristors For Neuromorphic Computing


Much of the current research on neuromorphic computing focuses on the use of non-volatile memory arrays as a compute-in-memory component for artificial neural networks (ANNs). By using Ohm’s Law to apply stored weights to incoming signals, and Kirchhoff’s Laws to sum up the results, memristor arrays can accelerate the many multiply-accumulate steps in ANN algorithms. ANNs are being dep... » read more
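The physics described above maps directly onto a matrix-vector product. Here is a minimal NumPy sketch of an idealized crossbar (ignoring device noise, wire resistance, and the need to encode negative weights): each cell's conductance acts as a weight via Ohm's Law, and Kirchhoff's Current Law sums the per-cell currents on each column wire. The array size and value ranges are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
G = rng.uniform(0.0, 1.0, size=(4, 3))  # stored conductances (weights), in siemens
V = rng.uniform(0.0, 0.5, size=4)       # input voltages on the row lines

# Ohm's Law per cell: I_cell = V[i] * G[i, j].
# Kirchhoff's Current Law per column: I[j] = sum over i of those cell currents.
I = V @ G  # one analog "step" performs a whole multiply-accumulate per column
print(I)
```

Every column computes its entire dot product in a single analog step, which is why such arrays are attractive for the multiply-accumulate-heavy inner loops of ANN inference.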