How The Brain Saves Energy By Doing Less

No matter how efficient they become, neuromorphic computers are fundamentally different from human brains.


One of the arguments for neuromorphic computing is the efficiency of the human brain relative to conventional computers. By looking at how the brain works, this argument contends, we can design systems that accomplish more with less power.

However, as Mireille Conrad and others at the University of Geneva pointed out in work presented at December’s IEEE International Electron Devices Meeting (IEDM), the brain is not efficient at all relative to other bodily systems. In adult humans, it accounts for 20% of the body’s total energy use, and even more in children. Humans devote a much larger fraction of available energy to the brain than other primates do, a potentially risky strategy in the face of food scarcity. Some evolutionary biologists suggest that human intelligence is only possible because proto-humans’ omnivorous diet and control of fire increased the availability of energy for the brain.

Yet for all that investment of energy, the brain’s switching mechanisms fall far below the reliability standards we set for electronic computing devices. The probability that a spike generated by an upstream neuron is transmitted across a synapse to downstream neurons can be as low as 20%. Sixty percent of the brain’s energy is consumed, in the form of adenosine triphosphate (ATP), by the ion pumps that control synaptic conductance, yet conductance remains low.

Julia Harris and colleagues at University College London varied the conductance of post-synaptic neurons in prepared sections of rodent brain tissue. They found that increasing or decreasing the conductance relative to the physiologically “normal” value caused the synapse to transmit correspondingly more or less information, as expected. However, when they defined efficiency as the amount of information transmitted per molecule of ATP consumed, the physiological conductance value turned out to be optimal.
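The shape of this trade-off can be sketched with a toy model. Assume (these functional forms and all parameter values are illustrative, not Harris et al.’s measurements) that information transmitted saturates as conductance grows, while ATP cost rises linearly on top of a fixed baseline. Then bits-per-ATP peaks at an intermediate conductance, well below the conductance that maximizes information alone:

```python
import numpy as np

# Illustrative model of the bits-per-ATP trade-off (assumed, not measured):
# information saturates with postsynaptic conductance g, ATP cost grows
# linearly with g on top of a fixed, conductance-independent baseline.
I_MAX = 2.0  # bits per spike at saturation (assumed)
K = 1.0      # conductance at half-maximal information (assumed, arbitrary units)
E0 = 0.5     # baseline ATP cost per spike (assumed)
C = 1.0      # ATP cost per unit conductance (assumed)

def info_bits(g):
    """Saturating information transfer vs. conductance (assumed form)."""
    return I_MAX * g / (g + K)

def atp_cost(g):
    """Linear ATP cost with a fixed baseline (assumed form)."""
    return E0 + C * g

def efficiency(g):
    """Information transmitted per unit ATP consumed."""
    return info_bits(g) / atp_cost(g)

# Scan conductances and locate the efficiency peak numerically.
g = np.linspace(0.01, 10, 10_000)
g_opt_numeric = g[np.argmax(efficiency(g))]
g_opt_closed = np.sqrt(K * E0 / C)  # analytic optimum for this model

print(f"optimal conductance ~ {g_opt_numeric:.3f} (closed form {g_opt_closed:.3f})")
print(f"information at optimum: {info_bits(g_opt_numeric):.2f} of {I_MAX} bits")
```

In this sketch the efficiency-optimal synapse transmits well under half of the information it could at high conductance, mirroring the finding that a lossy, “inefficient-looking” synapse can be the most economical one.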

This result seems paradoxical at first glance. Why is lossy transmission better? It’s important to remember that every signal consumes energy, and that most stimuli are repetitive or unimportant. Many features of a landscape change slowly, if at all. Hunters and prey species alike care more about what might have changed. Reducing the number of signals propagating through the brain reduces the overall energy consumption of this highly inefficient system.

Neuromorphic computers are not brains, and it is possible to take the biological analogy too far. If the goal is to match the efficiency of brain computation, though, it’s worth remembering that brains are efficient in part because of what they don’t do.

