Memristors: Nanofluidic neural networks; tunable relaxation time; analog in-memory computing.
Engineers from EPFL developed a functional nanofluidic memristive device that relies on ions, rather than electrons and holes, to compute and store data.
“Memristors have already been used to build electronic neural networks, but our goal is to build a nanofluidic neural network that takes advantage of changes in ion concentrations, similar to living organisms,” said Aleksandra Radenovic of the Laboratory of Nanoscale Biology (LBEN) in EPFL’s School of Engineering, in a statement.
“We have fabricated a new nanofluidic device for memory applications that is significantly more scalable and much more performant than previous attempts,” said Théo Emmerich, an LBEN postdoctoral researcher, in a statement. “This has enabled us, for the very first time, to connect two such ‘artificial synapses’, paving the way for the design of brain-inspired liquid hardware.”
To make the devices, dubbed highly asymmetric channels (HACs), researchers created a nanopore at the center of a silicon nitride membrane, then added palladium and graphite layers to create nano-channels for ions. In a press release, they explained how it works: “As a current flows through the chip, the ions percolate through the channels and converge at the pore, where their pressure creates a blister between the chip surface and the graphite. As the graphite layer is forced up by the blister, the device becomes more conductive, switching its memory state to ‘on’. Since the graphite layer stays lifted, even without a current, the device ‘remembers’ its previous state. A negative voltage puts the layers back into contact, resetting the memory to the ‘off’ state.”
The HACs were then immersed in an aqueous electrolyte solution containing potassium ions, though other ions could be used to tune the device's memory. Connecting two HACs with an electrode formed a logic circuit based on ion flow (a simple model of the switching behavior is sketched below). The team next plans to connect a network of HACs with water channels to create fully liquid circuits. [1]
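The description above amounts to a non-volatile, voltage-controlled switch: a positive voltage inflates the blister and turns the device on, the state persists at zero bias, and a negative voltage resets it. The following minimal Python sketch captures only that qualitative behavior; the threshold voltages V_SET and V_RESET and the conductance values are illustrative placeholders, not figures from the paper.

```python
# Minimal sketch (not the authors' model) of the mechano-ionic switch described above.
V_SET = 0.5      # hypothetical positive voltage that inflates the blister ("on")
V_RESET = -0.5   # hypothetical negative voltage that flattens it ("off")
G_ON, G_OFF = 1.0, 0.01  # arbitrary relative conductances

class MechanoIonicSwitch:
    def __init__(self):
        self.on = False  # graphite layer initially in contact ("off")

    def apply(self, voltage):
        """Update the memory state; the state persists when voltage returns to zero."""
        if voltage >= V_SET:
            self.on = True    # blister lifts the graphite layer
        elif voltage <= V_RESET:
            self.on = False   # layers pushed back into contact
        return (G_ON if self.on else G_OFF) * voltage  # ionic current ~ G * V

hac = MechanoIonicSwitch()
print([hac.apply(v) for v in (0.6, 0.0, 0.1, -0.6, 0.1)])
# High conductance persists after the set pulse, until the negative reset pulse.
```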
Researchers from the University of Michigan, University of Oklahoma, Cornell University, and Pennsylvania State University created a memristor with a ‘relaxation time’ that can be tuned, enabling AI systems to process time-dependent information.
“We anticipate that our brand-new material system could improve the energy efficiency of AI chips six times over the state-of-the-art material without varying time constants,” said Sieun Chae, a recent U-M Ph.D. graduate in materials science and engineering and now assistant professor of electrical engineering and computer science at Oregon State University, in a release.
The team built the materials on the superconductor YBCO, made of yttrium, barium, copper, and oxygen. Although YBCO has no electrical resistance at temperatures below -292 degrees Fahrenheit, it was used here for its crystal structure, which guided the organization of the magnesium, cobalt, nickel, copper, and zinc oxides in the memristor material.
By changing the ratios of these oxides, the team achieved time constants ranging from 159 to 278 nanoseconds. The simple memristor network they built learned to recognize spoken digits, zero through nine; once trained, it could identify each digit before the audio input was complete.
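One way to see why a spread of relaxation times helps with time-dependent data is to treat each device as a leaky integrator that decays at its own rate, so the same waveform leaves a different trace in each one. The sketch below is illustrative only: the 159 to 278 nanosecond range comes from the article, but the leaky-integrator model, the input signal, and the readout are assumptions, not the authors' network.

```python
import numpy as np

dt = 10e-9                               # 10 ns simulation step (assumed)
taus = np.linspace(159e-9, 278e-9, 8)    # tunable relaxation times from the article
decay = np.exp(-dt / taus)               # per-step decay factor for each device

t = np.arange(0, 5e-6, dt)
signal = np.sin(2 * np.pi * 1e6 * t)     # stand-in for an audio waveform

states = np.zeros(len(taus))
features = []
for x in signal:
    states = decay * states + (1 - decay) * x  # each device integrates and "forgets" at its own rate
    features.append(states.copy())

# Devices with different taus respond differently to the same input, yielding a
# temporal feature vector that a simple readout could classify (e.g., spoken digits).
print(np.array(features)[-1])
```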
“I think there are pathways to making these materials scalable and affordable,” said John Heron, U-M associate professor of materials science and engineering, in a release. “These materials are earth-abundant, nontoxic, cheap and you can almost spray them on.” [2]
Researchers from the University of Massachusetts Amherst, University of Southern California, and TetraMem implemented in-memory computing with analog memristor technology that is capable of solving complex scientific problems while using much less energy than traditional approaches.
“In this work, we propose and demonstrate a new circuit architecture and programming protocol that can efficiently represent high-precision numbers using a weighted sum of multiple, relatively low-precision analog devices, such as memristors, with a greatly reduced overhead in circuitry, energy and latency compared with existing quantization approaches,” said Qiangfei Xia, professor of electrical and computer engineering at UMass Amherst, in a release. “This technology is not only good for low-precision, neural network computing, but it can also be good for high-precision, scientific computing.”
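The core idea in that quote, representing one high-precision number as a weighted sum of several low-precision devices, can be illustrated with a short sketch. The assumptions here are mine: each device holds only `levels` distinguishable conductance states, and the weights are powers of `levels`; the paper's actual circuit architecture and programming protocol are not reproduced.

```python
def encode(value, levels=8, n_devices=4):
    """Split an integer in [0, levels**n_devices) across low-precision devices."""
    digits = []
    for _ in range(n_devices):
        digits.append(value % levels)   # one low-precision conductance level
        value //= levels
    return digits                        # least-significant device first

def decode(digits, levels=8):
    """A weighted sum of the device readouts recovers the high-precision value."""
    return sum(d * levels**i for i, d in enumerate(digits))

x = 3791
assert decode(encode(x)) == x
print(encode(x), decode(encode(x)))  # e.g., [7, 1, 3, 7] -> 3791
```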
In a proof-of-principle demonstration, the memristor-based system solved static and time-evolving partial differential equations, the Navier-Stokes equations, and magnetohydrodynamics problems. [3]
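Such partial-differential-equation solvers typically reduce to repeated matrix-vector products, which is the operation an analog memristor crossbar performs in memory. The sketch below is only a conventional illustration of that reduction, a one-dimensional Poisson problem solved by Jacobi iteration, not the method or problems used in the paper; grid size, source term, and iteration count are arbitrary.

```python
import numpy as np

n = 64
h = 1.0 / (n + 1)
f = np.ones(n)   # source term of -u'' = f on (0, 1), with u(0) = u(1) = 0

# Off-diagonal part of the finite-difference Laplacian (would map to crossbar weights).
A_off = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)

u = np.zeros(n)
for _ in range(5000):
    u = 0.5 * (A_off @ u + h**2 * f)  # Jacobi update: one matrix-vector product per iteration

print(u[n // 2])  # ~0.125, the peak of the analytic solution u(x) = x(1 - x)/2
```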
[1] Emmerich, T., Teng, Y., Ronceray, N. et al. Nanofluidic logic with mechano–ionic memristive switches. Nat Electron 7, 271–278 (2024). https://doi.org/10.1038/s41928-024-01137-9
[2] Yoo, S., Chae, S., Chiang, T. et al. Efficient data processing using tunable entropy-stabilized oxide memristors. Nat Electron (2024). https://doi.org/10.1038/s41928-024-01169-1
[3] Song, W. et al. Programming memristor arrays with arbitrarily high precision for analog computing. Science 383, 903–910 (2024). https://doi.org/10.1126/science.adi9405