
Analog Accelerator For AI/ML Training Workloads Using Stochastic Gradient Descent (Imperial College London)


A new technical paper titled “Learning in Log-Domain: Subthreshold Analog AI Accelerator Based on Stochastic Gradient Descent” was published by researchers at Imperial College London.

Abstract
“The rapid proliferation of AI models, coupled with growing demand for edge deployment, necessitates the development of AI hardware that is both high-performance and energy-efficient. In this paper, we propose a novel analog accelerator architecture designed for AI/ML training workloads using stochastic gradient descent with L2 regularization (SGDr). The architecture leverages log-domain circuits in subthreshold MOS and incorporates volatile memory. We establish a mathematical framework for solving SGDr in the continuous time domain and detail the mapping of SGDr learning equations to log-domain circuits. By operating in the analog domain and utilizing weak inversion, the proposed design achieves significant reductions in transistor area and power consumption compared to digital implementations. Experimental results demonstrate that the architecture closely approximates ideal behavior, with a mean square error below 0.87% and precision as low as 8 bits. Furthermore, the architecture supports a wide range of hyperparameters. This work paves the way for energy-efficient analog AI hardware with on-chip training capabilities.”
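For context, the abstract refers to stochastic gradient descent with L2 regularization (SGDr) and to solving it in the continuous time domain. The sketch below shows the generic textbook forms of that update and its continuous-time (gradient-flow) limit, using assumed notation (learning rate η, regularization weight λ, time constant τ) that is not taken from the paper; the authors' exact learning equations and their mapping to log-domain circuits are detailed in the preprint.

```latex
% Minimal sketch: SGD with L2 regularization (SGDr) in discrete and
% continuous time. Notation (eta, lambda, tau) is generic, not the paper's.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\begin{align}
  % Discrete SGDr update for weights w_k with loss L:
  w_{k+1} &= w_k - \eta\,\bigl(\nabla_{w} L(w_k) + \lambda\, w_k\bigr) \\
  % Continuous-time (gradient-flow) limit, the kind of first-order dynamics
  % an analog integrator can realize directly:
  \tau\,\frac{\mathrm{d}w(t)}{\mathrm{d}t} &= -\bigl(\nabla_{w} L\bigl(w(t)\bigr) + \lambda\, w(t)\bigr)
\end{align}
\end{document}
```

In the continuous-time form, the weight evolves as a first-order dynamical system, which is the kind of behavior analog circuitry can integrate directly; the paper's contribution is realizing these dynamics with log-domain circuits operating in subthreshold (weak inversion) MOS.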

Find the technical paper here. January 2025.

https://doi.org/10.48550/arXiv.2501.13181
Momen K Tageldeen, Yacine Belgaid, Vivek Mohan, Zhou Wang, Emmanuel M Drakakis


