Is Transformer Fever Fading?


The hottest, buzziest thing bursts onto the scene and captures the attention of the business press and even the general public. Scads of articles and videos are published about The Hot Thing. And then, in the blink of an eye, the world’s attention shifts to the Next New Thing! Are we talking about the latest pop song that leads the Spotify streaming charts? Perhaps a new fashion trend that... » read more

Neural Network Model Quantization On Mobile


In general, quantization is the process of mapping continuous, infinite values to a smaller set of discrete, finite values. In this blog, we will talk about quantization in the context of neural network (NN) models, where it means reducing the precision of the weights, biases, and activations. Moving from floating-point representations to low-precision fixed intege... » read more
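As a rough illustration of the idea (my own sketch, not code from the article), the following quantizes a float32 weight tensor to int8 with a simple symmetric, per-tensor scheme; the function names and the per-tensor scale are illustrative assumptions, and production toolchains add zero-points, per-channel scales, and calibration over real activation statistics.

import numpy as np

def quantize_int8(x):
    # Symmetric per-tensor quantization: map the largest magnitude to 127.
    scale = float(np.abs(x).max()) / 127.0
    scale = scale if scale > 0 else 1.0  # avoid dividing by zero on an all-zero tensor
    q = np.clip(np.round(x / scale), -128, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    # Recover approximate float values; the gap is the quantization error.
    return q.astype(np.float32) * scale

weights = np.random.randn(4, 4).astype(np.float32)
q, scale = quantize_int8(weights)
print("max abs error:", np.abs(weights - dequantize(q, scale)).max())

Each int8 weight now occupies a quarter of the storage of a float32 one, at the cost of the small rounding error printed above.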

Generative AI: Transforming Inference At The Edge


The world is witnessing a revolutionary advancement in artificial intelligence with the emergence of generative AI. Generative AI generates text, images, or other media in response to prompts. We are in the early stages of this new technology; still, the depth and accuracy of its results are impressive, and its potential is mind-blowing. Generative AI uses transformers, a class of neural network... » read more

(Vision) Transformers: Rise Of The Chimera


It’s 2023 and transformers are having a moment. No, I’m not talking about the latest installment of the Transformers movie franchise, "Transformers: Rise of the Beasts"; I’m talking about the deep learning model architecture class, transformers, that is fueling anticipation, excitement, fear, and investment in AI. Transformers are not so new in the world of AI anymore; they were first ... » read more

Achieving Greater Accuracy In Real-Time Vision Processing With Transformers


Transformers, first proposed in a Google research paper in 2017, were initially designed for natural language processing (NLP) tasks. Recently, researchers have applied transformers to vision applications with promising results. While vision tasks had previously been dominated by convolutional neural networks (CNNs), transformers have proven surprisingly adaptable to vision tasks like image cl... » read more
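To make concrete how a transformer ingests an image at all, here is a minimal sketch (my own illustration of the Vision Transformer recipe, not code from the article) that slices an image into non-overlapping patches and flattens each one into a token, producing the sequence a standard transformer can then attend over; the function name and the 16x16 patch size are assumptions for illustration.

import numpy as np

def image_to_patch_tokens(img, patch=16):
    # Split an (H, W, C) image into flattened, non-overlapping patches.
    # Each row of the result is one "token", ready for a learned linear
    # projection and positional embeddings in a Vision Transformer.
    H, W, C = img.shape
    assert H % patch == 0 and W % patch == 0, "image must tile evenly into patches"
    tokens = (img.reshape(H // patch, patch, W // patch, patch, C)
                 .transpose(0, 2, 1, 3, 4)       # group the two patch-grid axes
                 .reshape(-1, patch * patch * C))
    return tokens

tokens = image_to_patch_tokens(np.zeros((224, 224, 3), dtype=np.float32))
print(tokens.shape)  # (196, 768): a 14x14 grid of 16x16x3 patches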

Toward Software-Equivalent Accuracy on Transformer-Based Deep Neural Networks With Analog Memory Devices


Abstract: "Recent advances in deep learning have been driven by ever-increasing model sizes, with networks growing to millions or even billions of parameters. Such enormous models call for fast and energy-efficient hardware accelerators. We study the potential of Analog AI accelerators based on Non-Volatile Memory, in particular Phase Change Memory (PCM), for software-equivalent accurate i... » read more

There’s More To Machine Learning Than CNNs


Neural networks – and convolutional neural networks (CNNs) in particular – have received an abundance of attention over the last few years, but they're not the only useful machine-learning structures. There are numerous other ways for machines to learn how to solve problems, leaving room for alternatives. "Neural networks can do all this really comple... » read more