Manufacturing Bits: Sept. 14

Probabilistic computers; quantum AI computing.


Probabilistic computers
Sandia National Laboratories and others are developing what researchers call a probabilistic computer.

Unlike a traditional computer, the system Sandia is developing has built-in randomness, so it computes information differently every time.

As part of the research program, the Department of Energy awarded the project $6 million over the next three years to develop the idea. Sandia is working with Oak Ridge National Laboratory, New York University, the University of Texas at Austin and Temple University. The Sandia-led project is called COINFLIPS, or CO-designed Improved Neural Foundations Leveraging Inherent Physics Stochasticity.

COINFLIPS (CO-designed Improved Neural Foundations Leveraging Inherent Physics Stochasticity). Source: Sandia National Labs with image by Laura Hatfield.

Probabilistic computing is designed to solve several classes of problems. For example, a car has thousands of parts, and any one of them could break or malfunction from the wear and tear of driving. It’s difficult to know how long a given part will last.

That’s where probabilistic computers fit in. They could solve complex probability problems like this one. They could also help scientists analyze sub-atomic particles, simulate nuclear physics experiments and process images faster.

“Probabilistic computing is assigning a likelihood to a range of solutions through sampling a model many times (for instance, by repeating a calculation using different random numbers) rather than using precise numerical calculations that are exactly the same every time. Our system will do this at the hardware level rather than the software level,” explained Brad Aimone, a scientist at Sandia, in an e-mail exchange.
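Aimone's description, sampling a model many times with different random numbers, maps directly onto a Monte Carlo simulation. The car-parts example above can be sketched in ordinary software as follows; the hardware approach he describes would draw its randomness from physical devices instead. The function names and failure probabilities here are hypothetical:

```python
import random

def part_survives(failure_prob: float) -> bool:
    """Sample whether one part survives a period of wear."""
    return random.random() >= failure_prob

def estimate_reliability(failure_probs, trials=100_000):
    """Estimate the probability that every part survives by
    repeating the calculation with different random numbers."""
    survived = sum(
        all(part_survives(p) for p in failure_probs)
        for _ in range(trials)
    )
    return survived / trials

# Hypothetical per-part failure probabilities for three parts.
print(estimate_reliability([0.01, 0.02, 0.005]))  # ≈ 0.965
```

Each call returns a slightly different answer, which is the behavior a probabilistic computer would produce at the hardware level rather than in software.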

What are the benefits or advantages over conventional computing? “There are two main advantages,” Aimone said. “Modern computers are exact in their calculations and are designed to get rid of noise. Designing a probabilistic computer that matches the uncertainty and noise that naturally exists in complex problems will allow us to build more predictive scientific models and more informative analyses.”

Probabilistic computing is also energy efficient. “The elimination of noise in today’s computers costs a lot of energy. By using an effective probabilistic computer, we can directly use the noise of the physical devices and microelectronics as a computational tool. We expect that this will result in dramatic energy savings in future systems,” Aimone said.

Work is underway to develop a computer architecture based on the technology. “We envision a new architecture that will be more like today’s neuromorphic (bio-inspired) computers than a conventional computer, and which adds randomness at a very low level,” said Sandia scientist Shashank Misra. “To be useful, the randomness can’t be either noise or random numbers for cryptography, and so we’re looking at magnetic tunnel junctions and tunnel diodes. Exactly how the logic, memory and random number generators will work together is a big focus of our research, and so we’re trying to be open minded about what different technologies, including different memories, could bring to the table.”
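Misra's point that the randomness must be tunable, rather than raw noise or cryptographic random numbers, can be illustrated with a biased bit source. In hardware that role might be played by a magnetic tunnel junction or tunnel diode; in this sketch, software random numbers merely stand in for device physics, and the function name is made up:

```python
import random

def stochastic_bit(p: float) -> int:
    """Emulate a tunable random-bit source that outputs 1 with
    probability p. In hardware, device noise from a magnetic
    tunnel junction or tunnel diode would supply the randomness."""
    return 1 if random.random() < p else 0

# A stream of bits biased toward 1 with probability 0.7.
stream = [stochastic_bit(0.7) for _ in range(10_000)]
print(sum(stream) / len(stream))  # close to 0.7
```

The key property is the tunable bias p: logic and memory built around such a source can sample from a chosen distribution directly, instead of computing with exact numbers.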

Quantum AI computing
The University of California at San Diego and Purdue University have developed quantum-like devices and materials to enable a neuromorphic neural network.

Researchers have combined superconducting materials with specialized oxides, enabling an AI system that mimics the brain. The work is being conducted in UC San Diego’s Quantum Materials for Energy Efficient Neuromorphic Computing (Q-MEEN-C) center. Supported by the Department of Energy, the goal is to develop quantum materials for an energy-efficient, fault-tolerant computer that is inspired by and works like the brain. The program was launched in 2019.

This represents one of many efforts worldwide to develop artificial intelligence (AI). Today, machine learning is the most common form of AI. A subset of AI, machine learning uses a neural network, which crunches vast amounts of data and identifies patterns. The network then matches certain patterns and learns which attributes are important. Many of today’s machine learning systems run on traditional chip architectures like GPUs and CPUs.

Neural networks consist of multiple neurons and synapses. These are not biological structures. A neuron could consist of a memory cell with logic gates. The neurons are daisy-chained together by links called synapses. Basically, an artificial neural network (ANN) has three layers: input, hidden, and output. Each layer consists of neurons, which are connected by synapses.

“An ANN is based on a collection of connected units or nodes called artificial neurons, which loosely model the neurons in a biological brain,” according to Wikipedia. “Each connection, like the synapses in a biological brain, can transmit a signal to other neurons. An artificial neuron that receives a signal then processes it and can signal neurons connected to it. The ‘signal’ at a connection is a real number, and the output of each neuron is computed by some non-linear function of the sum of its inputs. The connections are called edges. Neurons and edges typically have a weight that adjusts as learning proceeds. The weight increases or decreases the strength of the signal at a connection.”
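The weighted-sum-plus-nonlinearity description in the quote above can be written out directly. This is a minimal sketch of a three-layer forward pass with made-up weights, not a trained network; all names are illustrative:

```python
import math

def sigmoid(x: float) -> float:
    """A common non-linear activation function."""
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    """One artificial neuron: a non-linear function of the
    sum of its weighted inputs, as described above."""
    return sigmoid(sum(i * w for i, w in zip(inputs, weights)) + bias)

def forward(x, hidden_weights, hidden_biases, out_weights, out_bias):
    """Three-layer network: input -> hidden -> output."""
    hidden = [neuron(x, w, b)
              for w, b in zip(hidden_weights, hidden_biases)]
    return neuron(hidden, out_weights, out_bias)

# Illustrative weights only; a real network learns these by training.
x = [0.5, -0.2]
hw = [[0.1, 0.8], [-0.4, 0.3]]
hb = [0.0, 0.1]
ow = [0.7, -0.5]
print(forward(x, hw, hb, ow, 0.2))
```

During learning, the weights on each connection are adjusted up or down, strengthening or weakening the signal at that edge.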

Meanwhile, the industry also has been working on a non-traditional approach called neuromorphic computing, which is still several years away from being realized. Neuromorphic computing also uses a neural network. The difference is the industry is attempting to replicate the brain in silicon. The goal is to mimic the way that information is processed using precisely-timed pulses.

The University of California at San Diego and Purdue have put a new spin on neuromorphic computing. Instead of silicon, researchers have combined superconducting devices with tunable resistor devices based on the Mott metal-insulator transition. More specifically, the superconducting devices are Josephson junctions.

A Josephson junction consists of a thin insulating layer sandwiched between two superconducting metals. In operation, electrons pair up and tunnel through the junction. Resonators are components that excite spin waves, which convert signals into a DC voltage.
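For reference, the behavior of an ideal junction is captured by the standard Josephson relations, where \(I_c\) is the junction's critical current and \(\varphi\) is the phase difference between the two superconductors:

```latex
I = I_c \sin \varphi, \qquad V = \frac{\hbar}{2e} \frac{d\varphi}{dt}
```

Below its critical current the junction carries a dissipationless supercurrent, which is what makes arrays of such junctions attractive for low-energy, neuron-like switching elements.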

Josephson junctions are also being used in the field of quantum computing. In quantum computing, the information is stored in quantum bits, or qubits, which can exist as a “0” or “1” or a combination of both. The superposition state enables a quantum computer to perform multiple calculations at once. Quantum computing is still in its infancy, however.
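In standard notation, a qubit's state is a superposition of the two basis states, weighted by complex amplitudes whose squared magnitudes give the measurement probabilities:

```latex
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1
```

Measuring the qubit yields 0 with probability \(|\alpha|^2\) and 1 with probability \(|\beta|^2\).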

Meanwhile, the University of California at San Diego and Purdue combined two quantum technologies: superconducting materials based on copper oxide, and metal-insulator-transition materials based on nickel oxide. They then created loop devices that can be controlled with helium and hydrogen ions. The devices are connected to each other.

“We present simulations of networks of circuits and devices based on superconducting and Mott-insulating oxides that display a multiplicity of emergent states that depend on the spatial configuration of the network,” said Alex Frañó, a professor at UC San Diego, in the Proceedings of the National Academy of Sciences (PNAS).

“Our proposed network designs are based on experimentally known ways of tuning the properties of these oxides using light ions. We show how neuronal and synaptic behavior can be achieved with arrays of superconducting Josephson junction loops, all within the same device. We also show how a multiplicity of synaptic states could be achieved by designing arrays of devices based on hydrogenated rare earth nickelates. Together, our results demonstrate a research platform that utilizes the collective macroscopic properties of quantum materials to mimic the emergent behavior found in biological systems,” Frañó said. Others contributed to the work.


