Neuromorphic Chips & Power Demands


A technical paper titled “A Long Short-Term Memory for AI Applications in Spike-based Neuromorphic Hardware” was published by researchers at Graz University of Technology and Intel Labs.


“Spike-based neuromorphic hardware holds the promise to provide more energy efficient implementations of Deep Neural Networks (DNNs) than standard hardware such as GPUs. But this requires to understand how DNNs can be emulated in an event-based sparse firing regime, since otherwise the energy-advantage gets lost. In particular, DNNs that solve sequence processing tasks typically employ Long Short-Term Memory (LSTM) units that are hard to emulate with few spikes. We show that a facet of many biological neurons, slow after-hyperpolarizing (AHP) currents after each spike, provides an efficient solution. AHP-currents can easily be implemented in neuromorphic hardware that supports multi-compartment neuron models, such as Intel’s Loihi chip. Filter approximation theory explains why AHP-neurons can emulate the function of LSTM units. This yields a highly energy-efficient approach to time series classification. Furthermore it provides the basis for implementing with very sparse firing an important class of large DNNs that extract relations between words and sentences in a text in order to answer questions about the text.”
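The sparsifying mechanism the abstract describes can be illustrated with a toy simulation: a leaky integrate-and-fire neuron whose spikes each add to a slowly decaying after-hyperpolarizing (AHP) current that opposes the input, so firing becomes sparser over time. This is a minimal, illustrative sketch with made-up parameter values, not the paper’s Loihi implementation:

```python
import numpy as np

def simulate_ahp_neuron(inputs, tau_mem=20.0, tau_ahp=200.0,
                        threshold=1.0, ahp_jump=0.5, dt=1.0):
    """Leaky integrate-and-fire neuron with a slow AHP current.

    After each spike the AHP current jumps by `ahp_jump` and then
    decays with the slow time constant `tau_ahp`, suppressing further
    spikes. All parameter values are illustrative assumptions.
    """
    alpha = np.exp(-dt / tau_mem)   # membrane leak factor per step
    rho = np.exp(-dt / tau_ahp)     # slow AHP decay factor per step
    v, ahp = 0.0, 0.0
    spikes = []
    for x in inputs:
        v = alpha * v + x - ahp     # AHP current opposes the input drive
        if v >= threshold:
            spikes.append(1)
            v = 0.0                 # reset membrane potential after a spike
            ahp += ahp_jump         # strengthen AHP after each spike
        else:
            spikes.append(0)
        ahp *= rho
    return spikes

# Constant drive: without AHP the neuron fires regularly; with AHP,
# the accumulating current makes firing much sparser.
drive = [0.3] * 200
sparse = simulate_ahp_neuron(drive)
dense = simulate_ahp_neuron(drive, ahp_jump=0.0)  # AHP disabled
```

Comparing the two runs shows why this matters for energy: the AHP neuron conveys its slowly varying state with far fewer spike events, which is what keeps event-based hardware in its efficient regime.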

Find the technical paper here. Published in 2021.

arXiv:2107.03992v2. Authors: Philipp Plank, Arjun Rao, Andreas Wild, and Wolfgang Maass.

Visit Semiconductor Engineering’s Technical Paper library here and discover many more chip industry academic papers.
