The Battle For Post-Quantum Security Will Be Won By Agility

Don’t wait for quantum resistant cryptography standards to be developed.


By Thomas Poeppelmann and Martin Schlaeffer

Due to their special capabilities, quantum computers have the disruptive potential to replace existing conventional computers in many applications. They could, for example, simulate complex molecules for the chemical and pharmaceutical industries, perform complicated optimizations for the automotive and aviation industries, or extract new findings from the analysis of complex financial data. At the same time, quantum computers raise serious security concerns. While they have no real-world applications today, their capabilities are expected to grow significantly over the next 10 years. According to Michele Mosca, there is only a 14% chance that RSA-2048 will be broken by 2026, but that chance grows to 50% by 2031. The security community has taken notice and is already preparing for quantum attacks.

The challenge is that we don’t yet know which algorithms will become the standard, but thanks to Mosca’s theorem, we know we need to start preparing now. According to Mosca, when X + Y > Z (where X is the shelf life of your data, Y is the time it will take to migrate your systems to quantum-safe cryptography, and Z is the number of years until cryptographically relevant quantum computers become available), it is time to worry. In other words, that 10-year runway we think we have is actually significantly shorter.


[Figure] Source: Cybersecurity in a Quantum World: will we be ready?
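
To make the inequality concrete, here is a minimal sketch in Python. The numbers are purely illustrative assumptions, not estimates from Mosca or the authors.

```python
# A minimal sketch of Mosca's inequality. The numbers below are
# illustrative assumptions, not estimates from the article.
def mosca_at_risk(x_shelf_life: float, y_migration: float, z_quantum: float) -> bool:
    """True if X + Y > Z, i.e., data protected today is already at risk."""
    return x_shelf_life + y_migration > z_quantum

# Data must stay confidential for 7 years (X), migration takes 5 years (Y),
# and quantum computers are assumed to be 10 years away (Z): 7 + 5 = 12 > 10.
print(mosca_at_risk(7, 5, 10))  # True -> time to worry now
```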

An additional challenge is that we don’t yet understand how powerful subsequent generations of quantum computers will be. This makes it impossible to be completely certain whether a specific approach will work or how long it will last. The key will be cryptographic agility.

As laid out in Mosca’s theorem, there are many data sets in existence today that, due to company data retention policies or regulatory requirements, will need to be kept confidential and protected against manipulation well past the point at which we expect quantum attacks to become a threat. For that data, it’s not possible to wait for quantum-resistant cryptography standards to be developed.

The National Institute of Standards and Technology (NIST) started a post-quantum cryptography (PQC) standardization process in 2016 to prepare quantum-resistant cryptographic algorithms. Since most symmetric primitives are relatively easy to modify in a way that makes them quantum resistant, efforts have focused on public-key cryptography, namely digital signatures and key encapsulation mechanisms. Now more than six years into the process, there is discussion about whether the part of the process related to digital signatures should be opened up again or needs more time.

For some applications, hash-based signatures, which are ready and already standardized, can be a good solution. They are regarded as very secure and quantum resistant, because their security rests on the one-way property of the underlying hash function, and they may be used to achieve long-term integrity and authenticity of data. Due to this attribute, the secret key is very well protected: obtaining the secret keys from a public key would require breaking a modern cryptographic hash function – a well-understood and very hard problem.

Hash-based signature algorithms consist of large hash trees and hash chains, making them ideal for a number of applications, such as firmware updates. It is computationally expensive to build a hash tree, in which each leaf is assigned a one-time secret key and every non-leaf node is labelled with the cryptographic hash of the labels of its child nodes. Additionally, because each leaf corresponds to a unique private key, it can be used for only one sign operation. Thus, the larger the number of signatures needed, the larger the tree. This has a direct and significant impact on how long it takes to generate the required number of keys, adds computational expense, and can even affect network performance. That said, once built, the tree provides an efficient algorithm for signing and leads to fast verification times.
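
To illustrate why key generation scales with the number of signatures, here is a minimal Python sketch of the hash-tree construction described above. It is a simplification for illustration only; real hash-based schemes such as XMSS or LMS add one-time signature keys, authentication paths, and domain separation.

```python
# A minimal sketch of a hash tree over one-time keys, using SHA-256.
# Illustration only: real schemes (e.g., XMSS, LMS) add one-time
# signatures, authentication paths, and domain separation.
import hashlib
import os

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen(num_leaves: int):
    """Build a tree over num_leaves one-time secrets (a power of two)."""
    # Each leaf stands in for (the hash of) a one-time secret key.
    secrets = [os.urandom(32) for _ in range(num_leaves)]
    level = [h(s) for s in secrets]
    # Every non-leaf node is the hash of its two children's labels.
    while len(level) > 1:
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return secrets, level[0]  # one-time secrets and the root (the public key)

secrets, root = keygen(1024)  # 1,024 leaves -> at most 1,024 signatures
print(root.hex())
```

Doubling the number of signatures doubles the number of leaves and roughly doubles the key-generation work, which is exactly the scaling cost described above.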

For applications that aren’t well-suited for hash-based algorithms, there is currently no standardized algorithm that can be used out of the box. Companies must continue to develop devices and need to include cryptography as part of the security mix. The question these companies face is which algorithm to choose when it’s unclear:

  • What the PQC standard will be
  • How that standard might need to change as classical and quantum attacks become more powerful

Any PQC algorithm we create today will likely become obsolete eventually. We’ve seen it happen before. One example occurred in 2017, when Google, together with CWI Amsterdam, announced a public collision in the SHA-1 algorithm – a deathblow to the once most popular hash algorithm. While a broken algorithm is scary in itself, researchers generally have a pretty good idea of when an algorithm is close to being broken and can develop a stronger alternative in time. What’s more concerning is that cryptographic algorithms are deeply integrated into systems, and it can be extremely hard to move to a new scheme – a fact that may leave systems vulnerable.

That was the scenario that played out in 2004, when the MD5 algorithm was broken. It took four years until researchers demonstrated a practical attack, and four more years until a large-scale attack was carried out in practice. Even then, it reportedly took some companies forcing their clients to transition within a few months, two years later, before MD5 was finally retired. In total, it took more than 10 years from the time MD5 was first widely known to be vulnerable to remove it from use, because the transition was so painful it took a forcing function to make it happen.

That’s why crypto agility is gaining interest. It is, essentially, the ability to remove old, broken algorithms and plug in new, stronger versions.

The effort to achieve post-quantum security will rely on currently available quantum-resistant solutions, such as hash-based signatures for firmware updates, as well as on new technologies. As we look toward a quantum computing world and start preparing for it, it will be even more important to embrace crypto agility. There will be legacy systems running a certain set of non-quantum-safe algorithms, and at some point in the future, new algorithms will have to be integrated. For a system to survive this migration, it needs to be designed to be crypto agile, with the crypto functions implemented as separate blocks that can be exchanged, as in the sketch below. Without crypto agility, we would be faced with a reality where systems would essentially have to be disposable – where every time a crypto algorithm is broken, we throw them away or never update them. This isn’t a realistic option for complex and expensive systems.
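
As a sketch of what that separation of blocks might look like in software, the following Python example hides concrete algorithms behind a common signer interface and a registry. The interface, the registry, and the toy HMAC “signer” are illustrative assumptions, not a real library API.

```python
# A minimal sketch of crypto agility: callers depend on an abstract
# signer interface, and concrete algorithms register under an identifier,
# so a broken scheme can be replaced without touching application code.
# The interface, registry, and toy HMAC signer are illustrative only.
import hashlib
import hmac
from typing import Protocol

class Signer(Protocol):
    def sign(self, message: bytes) -> bytes: ...
    def verify(self, message: bytes, signature: bytes) -> bool: ...

_REGISTRY: dict[str, Signer] = {}

def register(alg_id: str, impl: Signer) -> None:
    _REGISTRY[alg_id] = impl

def get_signer(alg_id: str) -> Signer:
    return _REGISTRY[alg_id]  # raises KeyError for unknown algorithms

class HmacSigner:
    """Toy stand-in for a real signature algorithm (illustration only)."""
    def __init__(self, key: bytes):
        self._key = key
    def sign(self, message: bytes) -> bytes:
        return hmac.new(self._key, message, hashlib.sha256).digest()
    def verify(self, message: bytes, signature: bytes) -> bool:
        return hmac.compare_digest(self.sign(message), signature)

# When one scheme is broken, register a replacement (e.g., a PQC scheme)
# under a new identifier and update configuration; callers are unchanged.
register("hmac-demo", HmacSigner(b"demo-key"))
signer = get_signer("hmac-demo")
sig = signer.sign(b"firmware v1.2")
print(signer.verify(b"firmware v1.2", sig))  # True
```

Because application code only ever asks the registry for a signer by identifier, swapping in a stronger algorithm becomes a configuration change rather than a rewrite.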

As with most things in life, what a post-quantum world will look like is filled with uncertainty. Uncertainty about which PQC standards will be adopted. Uncertainty about how powerful quantum computers will become. And uncertainty about what new discoveries will be unlocked. One thing is certain – security must be kept up to date to meet continually advancing attacks, and the only way to do that is by embracing crypto agility. This is leading forward-thinking semiconductor and computing device manufacturers – those who take security seriously – to invest in critical technology now, which will enable their customers to address Mosca’s theorem and the challenge of quantum technologies.

Thomas Poeppelmann is a lead principal engineer at Infineon Technologies.
Martin Schlaeffer is a principal engineer at Infineon Technologies.


