Progress On General-Purpose Quantum Computers

Repeatable results and behavior, but still plenty of issues to solve.


The race is on to scale up quantum computing, transforming it from an esoteric research tool into a commercially viable, general-purpose machine.

Special-purpose quantum computers have been available for several years now. Systems like D-Wave’s Advantage focus on specific classes of problems that are amenable to modeling as quantum systems. Still, the ultimate goal of a general-purpose quantum computer remains elusive.

In 2000, a seminal paper by physicist David DiVincenzo, then at IBM, listed criteria for a practical quantum computer. Among other requirements:

  • The quantum state of interest must be stable. That is, it must be possible to preserve it long enough to actually do the calculation.
  • The qubit technology being used must be scalable. It must be possible to create large numbers of identical qubits, and to propagate information between them. In particular, qubits must be able to interact over distances while preserving their superposition of quantum states.
  • A set of operations must be available that manipulate the quantum state without collapsing it to a single measurement.
  • It must be possible to define the initial state of the qubits at the beginning of the computation, and measure the result at the end.

More than 20 years later, though, systems fulfilling what have come to be known as the DiVincenzo Criteria have yet to emerge. The current state of the art is what John Preskill, professor of theoretical physics at Caltech, described as “noisy intermediate-scale quantum (NISQ) technology — systems with between 50 and 100 qubits and limited error correction capabilities.”

Preskill lauded NISQ systems as important research tools. They can simulate more complex quantum systems than conventional digital computers, but the commercial applicability of such devices is unclear. Rather, the quantum computing future depends on development of a general-purpose, gate-based, Turing-complete quantum computer. Such a system needs, first and foremost, the ability to entangle large numbers of identical qubits, and to maintain that coherence for long enough to perform calculations.

Theoretically, a quantum superposition is a construct in which one or more quantum particles simultaneously occupy all potential states, with a probability distribution defined by the Schrödinger equation for the system. Measuring the system forces it to take a single value, “collapsing” the superposition. In quantum computation, this result is the desired answer.
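As an illustration, the behavior described above can be sketched with an idealized single-qubit model in a few lines of Python. The amplitudes below are arbitrary example values, not tied to any hardware discussed here; the point is that repeated measurements reproduce the probability distribution, never the amplitudes themselves.

```python
import random

# An idealized qubit: amplitudes for the |0> and |1> basis states.
# Measurement probabilities are the squared magnitudes (Born rule).
alpha, beta = 0.6, 0.8          # |alpha|^2 + |beta|^2 = 1

def measure(alpha, beta):
    """Collapse the superposition to a single classical bit."""
    p0 = abs(alpha) ** 2        # probability of reading 0
    return 0 if random.random() < p0 else 1

# Each identically prepared qubit collapses to a single value;
# only the statistics over many runs reflect the superposition.
counts = [0, 0]
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
```

With these amplitudes, roughly 36% of measurements return 0 and 64% return 1 — each individual run yields only a single bit, which is why a quantum algorithm must be designed so that the collapsed value is the desired answer.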

Fig. 1: Quantum computer with arrow pointing to optical fiber used to get signals to qubits. Source: NIST

Managing noise and error rates
Unfortunately, many different interactions between the qubits and the surrounding system act as “measurements” from the point of view of the quantum state. The interaction between carefully controlled qubit states and the relative disorder of the surrounding material is not well understood. It’s clear, though, that electrical noise and lattice vibrations, among other things, can reduce coherence time.

The link between noise and coherence time imposes a tradeoff. Iuliana Radu, director of quantum and exploratory computing at Imec, explained that the noise tolerance of a system depends on the energy separation between the qubit’s two states. A low separation reduces the amount of energy needed to initialize the qubits, but also reduces noise tolerance. For this reason, Josephson junctions, silicon spin qubits, and similar designs operate at millikelvin temperatures, just above absolute zero.

Fig. 2: Imec’s all-silicon qubit. Source: Imec

Furthermore, Yutaka Tabuchi, group leader for the superconducting quantum circuit group at the RIKEN Center for Quantum Computing, noted at this summer’s VLSI Technology Symposium that qubits have no voltage threshold. Digital transistors are somewhat protected from noise by the need to exceed a transistor’s threshold voltage in order to turn it on. Noise in quantum systems, however, does not need to actually flip an electron spin in order to change the result of a calculation.

Any system that offers appropriate quantum states can theoretically serve as the basis for a quantum computer. In practice, though, the integrated circuit industry’s ability to manufacture large numbers of nearly identical, nanometer-scale elements has made semiconductor quantum dots a leading contender. Silicon quantum dots in particular were the focus of several papers at this year’s VLSI Technology Symposium. Silicon is of interest because, unlike III-V semiconductors, its most common isotope, 28Si, has no nuclear spin. In the MRS Bulletin, Mark Gyure, adjunct professor of electrical and computer engineering at UCLA, and his colleagues explained that nuclear spins can interact with the spin of a confined electron, degrading coherence time. 29Si accounts for about 4.7% of natural silicon, and has a 1/2 nuclear spin. As previously reported, Intel has demonstrated improved coherence times in qubits made from isotopically pure silicon.

At the VLSI Technology Symposium, researcher N. I. Dumoulin Stuyck and colleagues at Imec and KU Leuven demonstrated the use of single electron transistors for qubit readout via charge sensing, combined with single and double quantum dot qubits. Their approach, using electron spin resonance to initialize qubits, achieved repeatable tunnel coupling between adjacent dots with a lifetime of up to 580 ms using a 0.4 Tesla field. Their work was also notable for the use of polysilicon gates to minimize thermal expansion mismatch, a serious issue at cryogenic temperatures.

I-Hsiang Wang, associate professor at Taiwan’s National Yang Ming Chiao Tung University, and his colleagues explained that there are two ways to define a quantum dot. A substrate with good isolation characteristics, like fully depleted silicon-on-insulator wafers, can use lithographic patterning to make a physical barrier between quantum wells. This strong confinement helps to reduce noise, but aligning electrodes for individual qubits is challenging. Alternatively, a self-aligned gate electrode can raise and lower the tunneling barrier around each quantum dot. This approach requires a lot of electrodes, and the relatively weak electron confinement increases noise sensitivity. This group made Ge quantum dots by selective oxidation and Ostwald ripening of SiGe pillars. Because growth of SiO2 is energetically favorable relative to GeO2, the germanium fraction in the original SiGe pillar can be tuned to adjust the size of the quantum dots.

When silicon qubits use magnetic fields to initialize and manipulate electron spins, reducing the separation between the qubit and the initializing magnet increases the effective field, researcher Shota Iizuka and colleagues at Japan’s AIST explained. In conventional integration, the magnet is above the qubit, which increases separation between the two. Moreover, variability in magnet dimensions leads to variation in the qubit behavior. They used a self-aligned process derived from the IC industry to bury nanomagnets in the substrate, greatly reducing lithography variation. First, a fin formation process established the magnet width. Then, an SiO2 trench etch set the magnet bottom. Finally, cobalt etching defined the magnet top.

Error correction and the bandwidth problem
Regardless of the specific qubit design, error correction is likely to be a necessity for a general-purpose quantum computer. If — or when — the quantum superposition is lost for any reason, the system needs to be able to recover it. Most error correction schemes proposed so far depend on redundancy, essentially creating multiple copies of the desired quantum state. Each logical qubit is assembled from hundreds or even thousands of physical qubits. Generally speaking, Radu said, the higher the error rate, the more physical qubits are needed to ensure accurate results. Low error rates are essential for large algorithms.
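The simplest redundancy scheme is a repetition code. The classical three-bit version below conveys the idea; it is only an analogy, since real quantum codes must detect errors without directly measuring (and thus collapsing) the encoded state.

```python
from collections import Counter

def encode(bit):
    """One logical bit -> three physical copies."""
    return [bit] * 3

def decode(copies):
    """Majority vote recovers the logical bit despite a single flip."""
    return Counter(copies).most_common(1)[0][0]

noisy = encode(1)
noisy[0] ^= 1               # a single bit-flip error on one copy
recovered = decode(noisy)   # the logical value survives: 1
```

Two simultaneous flips would defeat the vote, which mirrors Radu’s point: the higher the physical error rate, the more copies each logical qubit needs before the encoded result can be trusted.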

The sheer volume of devices required, combined with the need for cryogenic operation, introduces significant implementation challenges. For example, under the DiVincenzo Criteria, it must be possible to initialize and read each qubit independently. If the control electronics for the system are located outside the cooler, then potentially thousands of wires need to run between the control electronics and the qubits. Obstacles to this approach include signal strength degradation, crosstalk, and the physical difficulty of routing thousands of shielded wires into a millikelvin refrigerator.

RIKEN’s Tabuchi said that in digital circuits, signals “cascade”: the output of one gate can be input for the next. As long as drive current is adequate, signal integrity is relatively independent of wire length. Quantum circuits, in contrast, have analog inputs and outputs, and analog control voltages, resulting in signal loss over long distances. Similarly, the existence of a voltage threshold mitigates the effects of crosstalk between control lines in digital circuits. In quantum circuits, even 1% crosstalk matters.

One potential solution is optical fiber. It supports enormous bandwidth, it’s temperature-independent, and it can accommodate high-frequency signals. Florent Lecocq, a research scientist at the National Institute of Standards and Technology (NIST), and his colleagues demonstrated optical control of superconducting qubits. Alternatively, the RIKEN group sought to reduce bandwidth requirements by implementing error correction at the qubit level before transmitting the signal out to the ambient-temperature world. Their approach, based on symmetry, applied identical signals to symmetrically related superconducting qubits.

Overall, the outlook for quantum computing is mixed. Twenty years after DiVincenzo’s paper, fundamental questions about the architecture of potential quantum computers remain unanswered. As Imec’s Radu noted, there is not yet a quantum equivalent to silicon’s standard cell. Device and material properties at cryogenic temperatures are still poorly understood.

On the other hand, researchers are using silicon-like processes to make devices with repeatable, clearly defined behavior. We don’t have quantum microprocessors yet, but we may be starting to assemble the building blocks.


Related Stories
The Race To Make Better Qubits
Quantum Computing Knowledge Center
The Great Quantum Computing Race
The Long Road To Quantum Computing
