Analog Accelerator For AI/ML Training Workloads Using Stochastic Gradient Descent (Imperial College London)


A new technical paper titled "Learning in Log-Domain: Subthreshold Analog AI Accelerator Based on Stochastic Gradient Descent" was published by researchers at Imperial College London.

Abstract: "The rapid proliferation of AI models, coupled with growing demand for edge deployment, necessitates the development of AI hardware that is both high-performance and energy-efficient. In this paper, w..."
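
For context, the stochastic gradient descent named in the title is the standard iterative update rule w ← w − η·∇L(w). The sketch below is a minimal, conventional digital illustration of that rule on a toy least-squares problem; the variable names, loss, and learning rate are illustrative assumptions, and this is not the analog log-domain circuit described in the paper.

```python
import numpy as np

# Minimal illustration of vanilla stochastic gradient descent (SGD) on a
# linear least-squares problem. This shows the textbook update rule only,
# not the paper's subthreshold analog implementation.

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 4))             # toy input features (assumed)
true_w = np.array([0.5, -1.0, 2.0, 0.1])  # ground-truth weights (assumed)
y = X @ true_w + 0.01 * rng.normal(size=256)

w = np.zeros(4)   # model weights, initialized to zero
eta = 0.05        # learning rate (illustrative choice)

for step in range(1000):
    i = rng.integers(len(X))      # pick one sample per step (the "stochastic" part)
    err = X[i] @ w - y[i]         # prediction error on that sample
    grad = err * X[i]             # gradient of 0.5 * err**2 with respect to w
    w -= eta * grad               # SGD update: w <- w - eta * grad

print("learned weights:", w)
```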