The Race Toward Quantum Advantage

Enormous amounts of money have been invested in quantum computing, but so far it has not surpassed conventional computers. When will that change?


Quantum computing has yet to show an advantage over conventional computing, but huge sums of money are betting that it eventually will.

The first quantum computers emerged in the mid-1990s, after mathematicians demonstrated that quantum approaches could be effective for certain problems. At that stage they were simulated on conventional computers, but those demonstrations started the race to build a working quantum computer that could solve the problems directly.

In 2019, some companies claimed that quantum computers had, for the first time, demonstrated superiority over conventional computers. Most experts disagree, arguing we are still several years away from that crossover. While quantum computers may have reached that point in theory, they are not there in practice.

“Quantum computers exist, but it’s in a very early phase,” said Heike Riel, fellow and head scientist for quantum and information technologies at IBM, in a keynote talk at the recent Design Automation Conference. “We are now at the stage where we can develop a quantum computing roadmap, and this will lead us toward quantum advantage. We are not yet there. Quantum advantage is when a quantum computer can compute a task, which classical computers cannot do, or can demonstrate this quantum advantage where they calculate things faster, more precisely, or with less energy.”

Quantum advantage is still a ways off. “Governments worldwide (U.S., Europe, China, etc.) are pouring funding into the quantum computing direction,” says Mohamed Hassan, quantum solutions planning lead for Keysight. “They have joined the race to enable this technology, due to its huge potential and impact on communications and security. Major companies (IBM, Google, Intel) are committed to large investments in building a practical quantum computer with a quantum advantage by 2030. Many quantum computing startups (IonQ, Rigetti, D-Wave, among others) are now public in the stock market.”

The problems quantum computers are suited to solving are not conventional, however. One example of a quantum application is an optimization problem. “We need to find the minimal solution and, in the worst case, we have to go through all the possibilities numerically,” said Robert Wille, professor at the Technical University of Munich, during a DAC TechTalk. “In the design automation domain, nobody solves any kind of complex global optimization problem by enumeration. We have dedicated design automation methods, SAT solving, decision diagrams, all those cool technologies. But eventually we still have a complexity problem where we reach limits. And this is where quantum computing comes in.

“Quantum computing allows you to have inputs that are initialized to be a zero and one at the same time. Then you process your corresponding problem, and conceptually you’re getting something like all possible results at the same time. The quantum computer allows you to solve a problem with all possible inputs. Then you’ll get an output state, which somehow encodes all possible solutions, including the solution you’re interested in. But there’s a catch. When I try to measure this output state, I only get one of the possible results with a certain probability. I just get one arbitrary solution out of that. But you can use tricks, which allow you to increase the probability of your desired result.”
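To make that concrete, here is a minimal NumPy sketch of one such trick, amplitude amplification (the core of Grover's algorithm). Everything here is illustrative: the three-qubit size and the marked index are arbitrary choices, not drawn from any system discussed in this article.

```python
# Amplitude amplification, simulated with plain NumPy. The 3-qubit system
# starts in a uniform superposition of all 8 inputs; repeated oracle +
# diffusion steps boost the probability of the one "marked" solution.
import numpy as np

n = 3                     # number of qubits
N = 2 ** n                # number of basis states
marked = 5                # index of the solution we want to amplify (arbitrary)

state = np.full(N, 1 / np.sqrt(N))     # uniform superposition: all inputs at once

oracle = np.eye(N)
oracle[marked, marked] = -1            # flip the phase of the marked state

mean_proj = np.full((N, N), 1 / N)
diffusion = 2 * mean_proj - np.eye(N)  # inversion about the mean

for step in range(2):                  # ~(pi/4)*sqrt(N) iterations is optimal
    state = diffusion @ (oracle @ state)
    print(f"step {step + 1}: P(marked) = {abs(state[marked]) ** 2:.3f}")
# Measuring now returns the marked solution with probability ~0.95,
# far above the 1-in-8 chance of random guessing.
```

Each iteration rotates more amplitude onto the marked state, which is why the measured probability climbs well above random chance.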

Many challenges must be solved before quantum advantage becomes a reality. While the public focus is on the hardware, there is still much debate about how to measure the power of a quantum computer. A large software toolchain is also required, both to create better quantum computers and to take applications and map them effectively onto those machines. This is the quantum equivalent of classical development environments, compilers, and debuggers. Early examples of all of these are being developed.

The EDA industry traditionally starts with an electronic system described in a high-level language, then compiles and optimizes it down to transistors, and eventually to shapes that are transferred to silicon. A similar process is required for quantum algorithms, which must be mapped onto quantum structures. Simple applications are being mapped by hand today, but that approach does not scale.

“Not many people are working on design automation for quantum computing,” says TUM’s Wille. “And this is a shame, because I really believe that our community, with its background and expertise, the methods, the tools our community develops, could be a huge help for quantum computing experts — for the people who are developing quantum computing technologies and quantum computing applications.”

The basics
In order to understand the task at hand, it is necessary to get a basic understanding of the problem. “The primary inputs to a quantum computer are complex numbers, unlike the binary inputs to a classical computer,” says IBM’s Riel. “A qubit describes the probability of whether you find the state one or zero after your measurement. And then you also have quantum circuits (see figure 1). These are a set of quantum gate operations on qubits. We call this a unit of computation for quantum computing. The scaling behavior for the performance of quantum computing is determined by 2^n states. If you go beyond 100 qubits, then you come to numbers that are no longer possible to be simulated by a classical machine — not today, and not in the future.”

Fig. 1: Quantum bits and circuits. Source: IBM

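As a rough illustration of that scaling claim, the snippet below (a sketch of ours, not IBM's analysis) counts the memory needed just to store the amplitudes of an n-qubit state on a classical machine:

```python
# Memory needed to hold a full statevector of n qubits as complex doubles
# (16 bytes per amplitude). Around 50 qubits this already exceeds the
# largest classical machines.
for n in (10, 30, 50, 100):
    states = 2 ** n
    bytes_needed = states * 16            # one complex128 amplitude per state
    print(f"{n:3d} qubits: 2^{n} = {states:.3e} states, "
          f"{bytes_needed / 2**30:.3e} GiB")
```

At 50 qubits the statevector alone runs to petabytes; at 100 qubits the number of amplitudes outstrips any conceivable classical memory.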

Measuring performance
According to IBM, there are three key metrics for performance. One is scale, the number of qubits. Another is quality, which currently is defined by circuit fidelity. The third is speed, the circuit execution speed. When talking about quantum advantage, the total power needed to perform the operation also should be considered. While the quantum computer itself may consume only a small amount of power, cooling costs must be included.

Several different technologies are in use for quantum computers, including trapped ions, spin qubits and quantum dots, superconducting circuits, and many others. IBM uses superconducting circuits based on a Josephson junction, which acts as a non-linear inductor. “They provide an anharmonic energy spectrum, which allows you to control your information in the states zero and one,” says Riel. “The coherence times are in the range of 100 microseconds up to milliseconds. Importantly, the gate time, or how long it takes for the gate operation to happen, is about 10 to 50 nanoseconds. For the two-qubit gates, it takes a bit longer at about 500 nanoseconds. We control these qubits by microwave pulses that can shape or control the qubit state. Microwave resonators talk to the qubits to initialize the state, control the gate operations, and also read out the state.”

The number of qubits has been rising at a steady pace. IBM had a 27-qubit machine in 2019. One year later it had 65 qubits, and a year after that, 127. Today, it is at 433. This is, at least in theory, beyond the range that classical computers can simulate. Each generation of machine has required new innovations, and the electronics necessary to control the quantum computer also have gone through significant changes.

Quality is an issue, and it has several aspects. One is coherence time, the length of time a qubit can reliably hold information, which has improved by 100,000X over two decades.

Designing the qubits requires a new breed of EDA tools. “Each type of quantum computer uses different physical effects to enable the entanglement,” says Marc Swinnen, director of product marketing at Ansys. “One of the most common ones is using superconducting elements. They need to be able to analyze the electrical and magnetic interactions of those elements. We have worked with teams creating superconducting elements and created a special tool, RaptorQu, which is an electromagnetic modeling tool. They want to be able to simulate the electrical parameters and the behavior of their superconducting circuits and the interaction with electromagnetic fields. As an example, superconductors cannot tolerate magnetic fields within themselves. It changes the behavior of the material, so you need to have a tool that can understand that.”

Bridging the gap between physicists and EDA is not always easy. “The quantum EDA field will have to cross barriers between physics and engineering,” says Keysight’s Hassan. “This is a daunting task. The two fields typically use different terminologies and nomenclatures. It is still to be seen how this will go with both sides of the problem.”

The second factor is speed. “We have defined circuit layer operations per second (CLOPS),” says Riel. “This is a holistic benchmark, where we can compare different chips and different implementations for the performance of the system. The key ingredients are the circuit execution, the circuit delay, the compilation, the runtime compilation, and also data transfer. The interaction between quantum computing and classical computing and electronics is key.”
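IBM has described CLOPS as, roughly, the number of benchmark circuit layers executed per second across a batch of parameterized template circuits. The sketch below is a toy timing harness built on that idea; the constants and the run_circuit() stub are placeholders of ours, not IBM's benchmark code:

```python
# A hedged, toy sketch of a CLOPS-style measurement: M parameterized circuit
# templates, each updated K times, executed with S shots at depth D layers,
# divided by the wall-clock time. All numbers are illustrative only.
import time

def run_circuit(params, shots):
    """Placeholder for submitting one parameterized circuit to a backend."""
    time.sleep(0.001)   # stand-in for queueing, compilation, and execution

M, K, S, D = 100, 10, 100, 5   # templates, updates, shots, layers (illustrative)

start = time.perf_counter()
for template in range(M):
    for update in range(K):
        run_circuit(params=update, shots=S)   # new parameters each pass
elapsed = time.perf_counter() - start

clops = (M * K * S * D) / elapsed
print(f"CLOPS-style figure: {clops:.0f} layer-shots/second (toy timing)")
```

The point of the benchmark is that the classical overheads, compilation, parameter updates, and data transfer all land inside the timed loop, which is why Riel stresses the interaction between quantum and classical electronics.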

The third factor is fidelity, which is a measure of the error rate. “Errors can happen at all stages, such as initialization, gate operation, and during measurement,” adds Riel. “In classical computing, this is often dealt with by using redundancy. This, however, doesn’t work for quantum because quantum information cannot be copied. We have to utilize a different approach such as parity checks for error correction, but this comes with a large overhead.”
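The parity-check idea can be illustrated with the simplest quantum error-correcting code, the 3-qubit bit-flip code. The sketch below (ours, simulating only classical bit flips for clarity) shows how two parity measurements locate an error without ever reading out the encoded value:

```python
# Instead of copying the quantum state (forbidden by the no-cloning theorem),
# one logical bit is encoded across three physical qubits, and two parity
# checks reveal *where* a flip happened, never the encoded value itself.
import random

def encode(bit):
    return [bit, bit, bit]              # |0> -> |000>, |1> -> |111>

def apply_random_flip(qubits):
    qubits[random.randrange(3)] ^= 1    # flip one physical qubit

def syndrome(qubits):
    # Parity of pairs (0,1) and (1,2): locates the error.
    return (qubits[0] ^ qubits[1], qubits[1] ^ qubits[2])

def correct(qubits, s):
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(s)
    if flip is not None:
        qubits[flip] ^= 1

q = encode(1)
apply_random_flip(q)
correct(q, syndrome(q))
assert q == [1, 1, 1]                   # error found and fixed via parity only
print("recovered logical bit:", q[0])
```

The overhead Riel mentions is visible even here: three physical qubits and two checks protect a single logical bit against only one type of error.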

IBM also is working on error mitigation, using techniques such as probabilistic error cancellation and zero-noise extrapolation. The latter was demonstrated back in 2019, and the company has now made it work on a 127-qubit chip. Improvement is a slow process, as is often seen in semiconductors. IBM has defined a continuous path toward quantum advantage that uses error mitigation in a smart way, along with other algorithmic tricks.
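Zero-noise extrapolation is easy to sketch: run the same circuit at several deliberately amplified noise levels, then extrapolate the measured expectation value back to zero noise. The noise model below is invented purely for illustration:

```python
# A toy zero-noise extrapolation (ZNE). The noisy_expectation() model is a
# made-up exponential damping, standing in for a real noisy backend.
import numpy as np

def noisy_expectation(scale, ideal=1.0, decay=0.15):
    """Toy noise model: exponential damping of the ideal expectation value."""
    return ideal * np.exp(-decay * scale)

scales = np.array([1.0, 1.5, 2.0, 3.0])        # noise amplification factors
values = noisy_expectation(scales)

# Richardson-style extrapolation: fit a low-order polynomial in the noise
# scale and evaluate it at scale = 0.
coeffs = np.polyfit(scales, values, deg=2)
mitigated = np.polyval(coeffs, 0.0)

print(f"raw value at scale 1: {values[0]:.4f}")
print(f"extrapolated to zero noise: {mitigated:.4f} (ideal = 1.0)")
```

The extrapolated value lands much closer to the ideal than any raw measurement, at the cost of running the circuit several times over.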

IBM and others have produced a roadmap for the future that includes advances in the hardware, as well as some of the supporting capabilities that will be required, such as circuit knitting, which allows quantum and classical computing to be joined. It also recognizes the need for abstractions so that quantum computers can be used by people other than physicists (see figure 2).

Fig. 2: Quantum computing roadmap within IBM. Source: IBM


Quantum software
Many pieces of hardware have been let down by their software. “It has to be integrated into middleware and into software, so that you as a user don’t have to worry about the quantum computation below,” says Riel. “You just want to have a quantum engine, which will enable you to quickly utilize computing resources that were not available to you with classical computing.”

We are not there yet. “If you want to realize a quantum computing application today, you have to become a quantum computing expert,” says Wille. “You will have to be half an expert in quantum computers and half a quantum physicist in order to realize an application. We need to use design automation expertise to address this problem. One of the big paradigm differences is that every operation in quantum is inherently reversible. That means things like multiplication do not exist. Reversible means you can execute the operation from the inputs to the outputs, and from the outputs to the inputs.”
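Reversibility is easy to see at the matrix level. The sketch below, a minimal illustration of ours, shows that a CNOT gate run twice returns the original state, and that unitarity (U times its conjugate transpose giving the identity) is what guarantees every gate can be inverted:

```python
# Every quantum gate is a unitary matrix U, so running it "backwards" is just
# applying its conjugate transpose; a CNOT is even its own inverse.
import numpy as np

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard

state = np.array([0, 0, 1, 0], dtype=complex)   # 2-qubit basis state |10>
forward = CNOT @ state                          # |10> -> |11>
backward = CNOT @ forward                       # CNOT undoes itself: back to |10>
assert np.allclose(backward, state)

# Unitarity is what makes this work: U @ U.conj().T == identity.
assert np.allclose(H @ H.conj().T, np.eye(2))
print("every gate can be run from outputs back to inputs")
```

This is why an irreversible operation like ordinary multiplication has no direct quantum counterpart; it must be recast in a form whose outputs determine its inputs.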

Then you have to deal with the concepts of superposition and entanglement. Wille believes that everything in quantum can be reduced to operations on matrices and vectors.

“A quantum state can be represented in terms of a vector representing corresponding amplitudes for basic states,” he said. “A quantum operation is a unitary operation, which describes a function such as a Hadamard rotation or phase shift. The real challenge for design automation experts is not understanding what the operations are. It is developing tools. A simulator takes a vector and a matrix, and if you multiply them together, you get an output vector that represents your output state.”

The problem is that these vectors and matrices are exponential in size.
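A full statevector simulator really is just matrix-vector multiplication, as the minimal sketch below shows, along with where the exponential blow-up comes from:

```python
# A state is a vector of amplitudes, a gate is a unitary matrix, and
# simulation is matrix-vector multiplication.
import numpy as np

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)   # Hadamard gate

ket0 = np.array([1, 0])                # |0>
out = H @ ket0                         # [1/sqrt(2), 1/sqrt(2)]
print("amplitudes:", out)
print("measurement probabilities:", np.abs(out) ** 2)   # 50/50

# The exponential blow-up: an n-qubit state needs 2**n amplitudes, and a
# full n-qubit gate matrix has (2**n)**2 entries.
n = 4
full_gate = np.eye(2 ** n)             # even the identity is 16 x 16 here
print("matrix entries for", n, "qubits:", full_gate.size)
```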

Application synthesis
Another problem that needs to be addressed is mapping an application onto a quantum computer. “This is compilation and synthesis, which is literally the same thing we do today when you have an operation in a high-level description (see figure 3),” said Wille. “And then you’re going through a process with several steps, and you realize that in terms of a low-level description — something which can be executed on a machine. It’s very similar to a software compilation flow or to a synthesis problem, where we synthesize a high-level description into elementary operations for a particular technology. Many of the problems are very similar. So suddenly, we can also do that.”

Fig. 3: Quantum application synthesis. Source: TUM

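A classic, easily verified instance of this synthesis step is decomposing a SWAP operation into three CNOTs, the kind of elementary gate a machine actually provides. The sketch below checks the identity numerically:

```python
# Synthesis in miniature: a higher-level operation (SWAP) is decomposed into
# elementary gates (three CNOTs with alternating control/target).
import numpy as np

SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]])

CNOT_01 = np.array([[1, 0, 0, 0],      # control qubit 0, target qubit 1
                    [0, 1, 0, 0],
                    [0, 0, 0, 1],
                    [0, 0, 1, 0]])

CNOT_10 = np.array([[1, 0, 0, 0],      # control qubit 1, target qubit 0
                    [0, 0, 0, 1],
                    [0, 0, 1, 0],
                    [0, 1, 0, 0]])

# The "compiled" circuit: three elementary gates equal the high-level SWAP.
compiled = CNOT_01 @ CNOT_10 @ CNOT_01
assert np.array_equal(compiled, SWAP)
print("SWAP == CNOT(0,1) . CNOT(1,0) . CNOT(0,1)")
```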

What is not shown in figure 3 is the architecture of the quantum computer that the application will be mapped onto. This can impose limitations, because not all qubits can interact with all others.
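A toy version of the resulting routing problem is sketched below. The linear four-qubit coupling map and the greedy shortest-path strategy are invented for illustration; production compilers use far more sophisticated heuristics:

```python
# When two qubits cannot interact directly, a compiler moves logical qubits
# together by inserting SWAPs along a path in the coupling map.
from collections import deque

coupling = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}   # 4 qubits in a line

def shortest_path(a, b):
    """BFS over the coupling map."""
    prev, queue = {a: None}, deque([a])
    while queue:
        q = queue.popleft()
        for nxt in coupling[q]:
            if nxt not in prev:
                prev[nxt] = q
                queue.append(nxt)
    path = [b]
    while path[-1] != a:
        path.append(prev[path[-1]])
    return path[::-1]

def route_cnot(control, target):
    """Emit SWAPs until the control sits next to the target, then the CNOT."""
    path = shortest_path(control, target)
    ops = [("SWAP", path[i], path[i + 1]) for i in range(len(path) - 2)]
    ops.append(("CNOT", path[-2], path[-1]))
    return ops

for op in route_cnot(0, 3):     # qubits 0 and 3 are not adjacent
    print(op)
# -> SWAP(0,1), SWAP(1,2), then CNOT(2,3)
```

Every inserted SWAP adds gates, and therefore error, which is why mapping quality directly affects how much useful computation a machine can perform.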

At TUM, Wille has been working on the creation of an open-source toolkit based on decision diagrams, a technique introduced in the 1990s that can be applied to quantum computing.

EDA companies are starting to develop the necessary tools and flows. “Currently, the quantum hardware design cycle spans multiple tools, in multiple domains, in a discordant fashion with multiple gaps in between,” says Hassan. “These are typically filled by extra effort that is highly dependent on the knowledge and experience of the designer. It presents a steep knowledge barrier and keeps many engineers out of this emerging field. It is very different from how the current mature EDA design cycle is devised, for instance, for designing integrated circuits. Quantum EDA is envisioned to close these gaps, streamline the design cycle, reduce the knowledge barrier, and enable many engineers to jump into this new field.”

Conclusion
Quantum computers are progressing at a rapid rate, and they may have reached the physical size at which, in theory, they can deliver a quantum advantage. But the reality is they are not quite there yet. It is almost the classic case of software being an afterthought. If it were not for the number of companies involved, and the amount of money pushing further development, quantum probably would have been consigned to the pile of ‘nice idea hardware,’ along with many other great computer architectures.


