The race is on to find and implement a public-key cryptographic algorithm that will stand up to the challenges posed by quantum computers.
The US Government just stepped up the push for quantum computing with an award of $625 million in funding to create five quantum information research centers. Industry and academic institutions will contribute $300 million toward this effort, with the remainder drawn from the $1.2 billion earmarked under the 2018 National Quantum Initiative Act. The race to quantum computing is a global one, and this is just one sign among many of the seriousness with which the many participants are pursuing the goal. The upshot is that we may see fully realized quantum computers within the next decade.
But what is quantum computing and what is its promise? Today’s computers run on bits of data: either a 1 or a 0. Quantum computers use qubits, which can exist in a quantum superposition of both states, meaning a qubit is, in a sense, both a 1 and a 0 at once. Quantum computers link many qubits through entanglement, a quantum mechanical phenomenon in which the states of two particles are correlated even when separated by a large distance: measuring one qubit instantaneously determines the state of its entangled partner in a predictable way. Harnessing these phenomena can yield massive, even exponential, leaps in processing power for certain problems, depending on how many qubits are in the computer.
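The superposition idea can be sketched in a few lines of plain Python. This is a toy state-vector simulation on a classical machine, not a real quantum computation; it simply shows how a Hadamard gate turns the definite state 0 into an equal superposition of 0 and 1:

```python
import math

# A single qubit is a 2-component complex state vector [amp_0, amp_1].
# Measuring it yields 0 or 1 with probability |amplitude|^2.

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix by a 2-component state vector."""
    return [
        gate[0][0] * state[0] + gate[0][1] * state[1],
        gate[1][0] * state[0] + gate[1][1] * state[1],
    ]

# The Hadamard gate puts a basis state into an equal superposition.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

zero = [1 + 0j, 0 + 0j]           # the classical bit 0, as a qubit
superposed = apply_gate(H, zero)  # now "both 0 and 1 at once"

probs = [abs(amp) ** 2 for amp in superposed]
print(probs)  # ≈ [0.5, 0.5]: a 50/50 chance of measuring 0 or 1
```

Simulating n qubits this way needs a state vector of 2^n complex amplitudes, which is exactly why classical machines cannot keep up: the quantum hardware holds that exponentially large state natively.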
For cryptography, we depend on the fact that classical computers would take hundreds or thousands of years to solve the “hard mathematical problems” at the foundation of the cryptographic algorithms that protect secret or personal data. For symmetric key cryptography such as AES, where both endpoints share a key ahead of time, the advent of quantum computing poses a much smaller threat: it weakens, but does not break, the underlying security.
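The reason symmetric ciphers fare better is Grover’s algorithm, the best known quantum attack on them: it searches an unstructured space of 2^n keys in roughly 2^(n/2) steps, so it only halves a key’s effective bit strength. A back-of-the-envelope sketch:

```python
# Grover's algorithm gives a quadratic speedup on brute-force key
# search, so a symmetric key's effective strength is roughly halved.

def grover_effective_bits(key_bits: int) -> int:
    """Approximate post-quantum security level of an n-bit symmetric key."""
    return key_bits // 2

for key_bits in (128, 192, 256):
    print(f"AES-{key_bits}: ~{grover_effective_bits(key_bits)}-bit "
          "effective security against a quantum adversary")
```

By this rough model AES-256 retains about 128 bits of security, which is why simply using larger symmetric keys is generally considered an adequate quantum mitigation.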
However, for public key cryptography, such as RSA and ECC (Elliptic-Curve Cryptography), quantum computing represents an existential threat. A fully developed quantum computer running Shor’s algorithm, a polynomial-time quantum algorithm for integer factorization, could crack a 2048-bit RSA implementation in perhaps as little as a few days. Since so many secure applications depend on the scalability of public key cryptography, this is an extremely serious issue.
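To see why Shor’s algorithm is so devastating, compare its polynomial cost with the sub-exponential cost of the best known classical factoring method, the General Number Field Sieve (GNFS). The cost models below are rough heuristics for intuition, not precise estimates:

```python
import math

def gnfs_ops(bits: int) -> float:
    """Heuristic operation count for the General Number Field Sieve,
    the best known classical algorithm for factoring a b-bit modulus."""
    ln_n = bits * math.log(2)  # natural log of the modulus
    return math.exp((64 / 9) ** (1 / 3)
                    * ln_n ** (1 / 3)
                    * math.log(ln_n) ** (2 / 3))

def shor_ops(bits: int) -> float:
    """Rough polynomial cost model for Shor's algorithm:
    on the order of b^3 quantum operations for a b-bit modulus."""
    return bits ** 3

for bits in (1024, 2048, 4096):
    print(f"{bits}-bit RSA: classical ~2^{math.log2(gnfs_ops(bits)):.0f} ops, "
          f"quantum ~2^{math.log2(shor_ops(bits)):.0f} ops")
```

For RSA-2048 the classical estimate lands near 2^117 operations (consistent with its commonly cited ~112-bit security level), while Shor’s cost is only about 2^33 quantum operations. Doubling the key size barely slows a quantum attacker, which is why RSA cannot simply be scaled up to survive.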
Work is well under way to define Post-Quantum Cryptography (PQC). The National Institute of Standards and Technology (NIST) is sponsoring a competition to find, evaluate and standardize a public-key cryptographic algorithm (or algorithms) that will stand up to the challenges posed by quantum computers. The competition is now in its third round of finalists and alternates, and the final portfolio of PQC algorithms is expected to be announced sometime in 2022. Common among the finalist algorithms are greater computational intensity, larger key sizes, larger ciphertext sizes, or all of the above, compared with today’s public key algorithms.
Designers will need time to implement the chosen algorithm standard(s) in their products, and that lead time can be as much as a couple of years for new chips and devices, and up to ten years for networking infrastructures and protocols. It will also take many years to upgrade and deploy existing computing and network hardware on a broad scale. Secure endpoints (everything with a network connection) will require upgrading, which in some cases may mean new hardware, as software alone will not be fast or secure enough to process the new PQC algorithms. The impact on network architecture and infrastructure, which is highly tuned for efficiency, will be significant due to the larger keys and ciphertexts. This too may entail significant upgrades or replacements.
As an illustration, consider key sizes. Current encryption and signature algorithms have keys that are a few hundred or thousand bits long. Some of the proposed post-quantum algorithms have keys ranging from several tens of kilobytes up to a megabyte. This means these keys must be stored and transmitted efficiently, particularly when public keys are embedded in public key infrastructure (PKI) certificates that must be communicated or stored locally on the end device. This will require more bandwidth and memory, and bandwidth requirements will likely increase even more with schemes that also have larger ciphertexts.
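To make the bandwidth and storage impact concrete, the sketch below compares approximate public-key sizes for today’s algorithms against two NIST round-3 PQC candidates, and the resulting overhead for a hypothetical three-certificate chain. The figures are indicative only; exact sizes vary by parameter set:

```python
# Approximate public-key sizes in bytes (indicative, not exact).
KEY_SIZES = {
    "ECC P-256":                         64,       # uncompressed point
    "RSA-2048":                          256,
    "Kyber-768 (lattice KEM)":           1_184,
    "Classic McEliece (code-based KEM)": 261_120,
}

CHAIN_DEPTH = 3  # e.g., root -> intermediate -> leaf certificate

for name, size in KEY_SIZES.items():
    total = size * CHAIN_DEPTH
    print(f"{name:36s} {size:>9,} B/key  ~{total:>9,} B per cert chain")
```

A code-based scheme like Classic McEliece would push a single certificate chain from under a kilobyte today to the better part of a megabyte, which is the kind of shift that forces protocol and infrastructure redesign, not just a drop-in algorithm swap.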
You’ll note that the time to deploy PQC overlaps with the timeframe in which we anticipate quantum computers will be fully realized. So, while we have time to prepare, we may have little or no buffer. PQC is an area of intense study at Rambus, and we can help customers today strengthen their designs against future attacks. Solutions such as our CryptoManager Root of Trust anchor security in hardware and offer programmability to incorporate new functionality as a means to futureproof designs.
Quantum computing is a goal being pursued across government, academia and industry with tremendous energy. To ensure that we can keep data safe, we’ll need to pursue PQC with equal vigor.
Additional Resources:
White Paper: The Road to Post Quantum Cryptography
Blog: Post-Quantum Crypto and Rambus