Neuromorphic Computing: Graphene-Based Memristors For Future AI Hardware, From Fabrication To SNNs


A technical paper titled “A Review of Graphene-Based Memristive Neuromorphic Devices and Circuits” was published by researchers at James Cook University (Australia) and York University (Canada). Abstract: “As data processing volume increases, the limitations of traditional computers and the need for more efficient computing methods become evident. Neuromorphic computing mimics the brain's... » read more
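In devices like those the review covers, a memristor's conductance is typically used as a programmable synaptic weight. The snippet below is a minimal sketch of that idea, not taken from the paper: conductance bounds, the update rate, and the crossbar sizes are all assumed values for illustration.

```python
import numpy as np

# Minimal sketch (not from the paper): a memristive synapse modeled as a
# conductance G bounded between G_MIN and G_MAX, nudged by voltage pulses.
# All parameter values here are illustrative assumptions.
G_MIN, G_MAX = 1e-6, 1e-4   # siemens, assumed device limits
ETA = 5e-6                  # assumed conductance change per programming pulse

def apply_pulse(g, v_pulse):
    """Potentiate (v_pulse > 0) or depress (v_pulse < 0) the conductance."""
    g = g + ETA * np.sign(v_pulse)
    return float(np.clip(g, G_MIN, G_MAX))

# A small crossbar of such devices performs a vector-matrix multiply in place:
# column currents I = G.T @ V follow from Ohm's law and Kirchhoff's current law.
G = np.full((2, 2), 5e-5)    # assumed 2x2 array of identical conductances
V = np.array([0.2, 0.1])     # input voltages on the rows
I = G.T @ V                  # currents read out on the columns
```

The crossbar read step is what makes such devices attractive for compute-in-memory: the multiply-accumulate happens where the weights are stored, rather than after shuttling them to a separate processor.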

Spiking Neural Networks Place Data In Time


Artificial neural networks have found a variety of commercial applications, from facial recognition to recommendation engines. Compute-in-memory accelerators seek to improve the computational efficiency of these networks by helping to overcome the von Neumann bottleneck. But the success of artificial neural networks also highlights their inadequacies. They replicate only a small subset of th... » read more
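As a hedged illustration of what "placing data in time" means (not drawn from the article), a leaky integrate-and-fire neuron fires earlier for stronger inputs, so information is carried by the timing of spikes rather than by a single numeric activation. The constants below are assumptions chosen only to make the effect visible.

```python
# Minimal sketch, not from the article: a leaky integrate-and-fire (LIF)
# neuron. Stronger inputs cross the threshold sooner, so the *time* of the
# first spike encodes the input's magnitude. All constants are assumed.
DT, TAU, V_TH = 1e-3, 20e-3, 1.0   # time step (s), membrane time constant (s), threshold

def first_spike_time(input_current, t_max=0.2):
    v = 0.0
    for step in range(int(t_max / DT)):
        v += (DT / TAU) * (input_current - v)   # leaky integration toward the input
        if v >= V_TH:
            return step * DT                    # spike time encodes the input
    return None                                 # no spike within the window

for i in (1.2, 2.0, 4.0):
    print(f"input {i:.1f} -> first spike at {first_spike_time(i)} s")
```

Running the loop shows the larger inputs spiking first, which is the temporal coding idea the headline refers to.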

Energy-Efficient AI


Carlos Maciàn, senior director of innovation for eSilicon EMEA, talks about how to improve the efficiency of AI by focusing on individual operations, including data transport, computation, and memory. https://youtu.be/A3p_w7ENefs » read more

How The Brain Saves Energy By Doing Less


One of the arguments for neuromorphic computing is the efficiency of the human brain relative to conventional computers. By looking at how the brain works, this argument contends, we can design systems that accomplish more with less power. However, as Mireille Conrad and others at the University of Geneva pointed out in work presented at December's IEEE International Electron Devices Meeting (IEDM), the brain... » read more

Neuromorphic Computing: Modeling The Brain


Can you tell the difference between a pedestrian and a bicycle? How about between a skunk and a black and white cat? Or between your neighbor’s dog and a colt or fawn? Of course you can, and you probably can do that without much conscious thought. Humans are very good at interpreting the world around them, both visually and through other sensory input. Computers are not. Though their sheer... » read more

Power/Performance Bits: Nov. 5


Even the world's best supercomputers are staggeringly inefficient and energy-intensive machines. Our brains have upwards of 86 billion neurons, connected by synapses that not only complete myriad logic circuits, but continuously adapt to stimuli, strengthening some connections while weakening others. We call that process learning, and it enables the kind of rapid, highly efficient computational... » read more
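The strengthening and weakening of connections described above is often modeled with spike-timing-dependent plasticity. The sketch below is a minimal, pair-based STDP rule, not taken from the article; the amplitudes and time constants are assumed values.

```python
import math

# Minimal sketch (not from the article): pair-based spike-timing-dependent
# plasticity (STDP). A synapse is strengthened when the presynaptic spike
# arrives just before the postsynaptic spike, and weakened when it arrives
# just after. Amplitudes and time constants are assumed values.
A_PLUS, A_MINUS = 0.01, 0.012        # potentiation / depression amplitudes
TAU_PLUS, TAU_MINUS = 20e-3, 20e-3   # decay time constants in seconds

def stdp_dw(t_pre, t_post):
    """Weight change for one pre/post spike pair."""
    dt = t_post - t_pre
    if dt >= 0:   # pre before post: strengthen the connection
        return A_PLUS * math.exp(-dt / TAU_PLUS)
    else:         # post before pre: weaken it
        return -A_MINUS * math.exp(dt / TAU_MINUS)

w = 0.5
for t_pre, t_post in [(0.010, 0.015), (0.050, 0.045)]:
    w += stdp_dw(t_pre, t_post)
print(f"weight after two spike pairs: {w:.4f}")
```

The first spike pair (pre leading post) nudges the weight up; the second (post leading pre) nudges it down, which is the continuous adaptation the teaser calls learning.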