Artificial synapse; quantum heat flow; clockless time measurement.
Recreating the brain
Stanford University and Sandia National Laboratories researchers have created an organic, high-performance, low-energy artificial synapse for neural network computing that aims to better recreate the way the human brain processes information, and could also lead to improvements in brain-machine technologies.
Alberto Salleo, associate professor of materials science and engineering at Stanford, said, “It works like a real synapse but it’s an organic electronic device that can be engineered. It’s an entirely new family of devices because this type of architecture has not been shown before. For many key metrics, it also performs better than anything that’s been done before with inorganics.”
The artificial synapse mimics the way synapses in the brain learn through the signals that cross them — a significant energy savings over traditional computing, which involves separately processing information and then storing it in memory. Here, the processing creates the memory, the researchers explained.
They expect this synapse could eventually be part of a more brain-like computer, which could be especially beneficial for computing that works with visual and auditory signals such as in voice-controlled interfaces and driverless cars. Past efforts in this field have produced high-performance neural networks supported by artificially intelligent algorithms but these are still distant imitators of the brain that depend on energy-consuming traditional computer hardware, the team asserted.
The artificial synapse is based on a battery design and consists of two thin, flexible films with three terminals, connected by an electrolyte of salty water. The device works as a transistor, with one of the terminals controlling the flow of electricity between the other two.
Like a neural pathway in the brain being reinforced through learning, the researchers program the artificial synapse by discharging and recharging it repeatedly. They have been able to predict, to within 1% uncertainty, what voltage will be required to bring the synapse to a specific electrical state; once there, it remains in that state.
Every part of the device is made of inexpensive organic materials. These aren’t found in nature but they are largely composed of hydrogen and carbon and are compatible with the brain’s chemistry. Cells have been grown on these materials and they have even been used to make artificial pumps for neural transmitters. The voltages applied to train the artificial synapse are also the same as those that move through human neurons.
The researchers added that this all adds up to the possibility that the artificial synapse could communicate with live neurons, leading to improved brain-machine interfaces. The softness and flexibility of the device also lends itself to being used in biological environments. Before any applications to biology, however, the team plans to build an actual array of artificial synapses for further research and testing.
Heat flow quantum limits
According to University of Michigan researchers, as gold is stretched into a strand one atom thick, an expressway for heat opens up, with a capacity set by the quantum of thermal conductance. They have now observed this phenomenon at room temperature for the first time.
A quantum of thermal conductance represents the largest possible heat flow through a channel in a material, and as such, the channel is a highway for the flow of heat. The researchers proved that gold atomic chains have such a channel at room temperature.
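The upper limit the researchers describe has a standard closed-form expression in mesoscopic physics, g₀ = π²k_B²T/(3h) per channel (this formula is a well-known textbook result, not stated in the article). A minimal Python sketch using CODATA constants shows why observing it at room temperature demands the picowatt-scale sensitivity mentioned later:

```python
import math

# CODATA 2018 values of the physical constants (SI units)
k_B = 1.380649e-23   # Boltzmann constant, J/K
h = 6.62607015e-34   # Planck constant, J*s

def thermal_conductance_quantum(T):
    """Quantum of thermal conductance, g0 = pi^2 * k_B^2 * T / (3h), in W/K."""
    return math.pi**2 * k_B**2 * T / (3 * h)

g0 = thermal_conductance_quantum(300.0)   # room temperature, ~300 K
print(f"g0 at 300 K: {g0:.2e} W/K")       # roughly 2.8e-10 W/K, i.e. ~0.28 nW/K
```

At 300 K a single channel carries only a few hundred picowatts per kelvin of temperature difference, which is consistent with the article’s note that picowatt-resolution calorimeters were needed.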
This finding provides insights into how heat flows at the atomic limit, and could inform the development of next-generation integrated circuits and other nanotechnologies.
For more than two centuries, scientists and engineers have studied how heat flows. Joseph Fourier developed a law to describe heat transfer in matter in the early 1800s. Fourier’s law plays a central role in heat transfer and classical physics and is useful in designing devices, the researchers reminded.
But heat flow often needs to be enhanced, as in electronics, or suppressed, as in thermoelectric devices that convert heat to electricity efficiently. Fourier’s law works well for macroscale devices but breaks down in single-atom-wide wires.
So instead of looking at bulk materials as Fourier did, the team looked at a nanoscale wire that is just one atom thick, and asked how heat flows at the smallest possible dimensions.
Engineers have been working to answer this question for more than two decades. Theoretical predictions have suggested that at the atomic scale heat transport is governed by quantum mechanics and there is an upper limit, or maximum, to the heat flow possible through any single-atom-wide wire.
While this quantum-governed heat flow behavior had been theorized, it had only been observed at ultra-cold temperatures; to create useful nanoscale systems, the effects need to be observed at room temperature.
To this end, the U-M team developed picowatt-resolution heat flow sensors called “calorimeters” that were able to measure heat flows in single-atom strands of gold and platinum. The picowatt resolution achieved was 100 times finer than previous devices, enabling them to observe quantized heat flow properties in gold, they said.
The work involved international collaboration as well. Theorists Juan Carlos Cuevas, a professor at Universidad Autónoma de Madrid, as well as Fabian Pauly and Peter Nielaba, professors at the University of Konstanz in Germany, performed critical calculations using state-of-the-art computational tools to model the flow of heat in single-atom wires and provide a direct comparison to the experimental data, the U-M researchers said.
This work paves the way for probing the ultimate limits of energy and heat flow in a host of different materials, including atomic and molecular sized devices and organic molecules.
Measuring time without a clock
With implications for fundamental research and cutting-edge technology, EPFL scientists have been able to measure the ultrashort time delay in electron photoemission without using a clock. The work is a proof of principle that can trigger further fundamental and applied research: it deals with the fundamental nature of time itself and helps clarify the details of the photoemission process, but it can also be applied in photoemission spectroscopy on materials of interest such as graphene and high-temperature superconductors.
The researchers reminded that when light shines on certain materials, it causes them to emit electrons, a phenomenon called “photoemission.” Albert Einstein explained the effect in 1905, work that won him the Nobel Prize, but only in the last few years, with advances in laser technology, have scientists been able to approach the incredibly short timescales of photoemission. Now the EPFL team has determined a delay of one billionth of one billionth of a second in photoemission by measuring the spin of photoemitted electrons, without the need for ultrashort laser pulses.
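As a quick sanity check on the magnitude quoted above (a back-of-the-envelope conversion, not from the article): one billionth of one billionth of a second works out to 10⁻¹⁸ s, which is exactly the attosecond scale discussed next:

```python
# "one billionth of one billionth of a second"
delay_s = 1e-9 * 1e-9        # = 1e-18 seconds
ATTOSECOND = 1e-18           # one attosecond, in seconds

# The quoted delay is on the order of a single attosecond
print(delay_s / ATTOSECOND)
```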
Although there has been great progress in using photoemission and the spin polarization of photoemitted electrons, the timescale on which this entire process takes place has not been explored in great detail. The common assumption is that, once light reaches the material, electrons are instantaneously excited and emitted. But more recent studies using advanced laser technology have challenged this, showing that there is actually a time delay on the scale of attoseconds.
However, the lab of Hugo Dil at EPFL, with colleagues in Germany, showed that during photoemission the spin polarization of the emitted electrons can be related to the attosecond time delays of photoemission. Interestingly, they showed this without the need for any experimental time resolution or measurement — essentially, without a clock — by using spin- and angle-resolved photoemission spectroscopy (SARPES) to measure the spin of electrons photoemitted from a copper crystal.