Where Cryptography Is Headed

Security teams are scrambling to devise technology that keeps data secure in an era of virtually unlimited compute power.

Reports began surfacing in October that Chinese researchers had used a quantum computer to crack military-grade AES 256-bit encryption. Those reports turned out to be wrong, but that did little to dampen concerns about what would happen if they had been true.

The looming threat of quantum computers breaking today’s encryption, and the stockpiling of encrypted data in preparation for a time when it can be decrypted, continue to haunt the security industry. The misinformation that was repeated by multiple news outlets was merely a precursor to the real thing.

“Work on that type of data structure has been going on in the higher-level cryptanalysis communities for a very long time, and this particular paper completely overstated that this new technique was a quantum-based technique, so the headlines just ran with it,” explained Scott Best, senior technical director, product management of Silicon IP at Rambus. “There’s the expression that a lie makes its way around the world 10 times before truth has time to put its pants on in the morning. Eventually the cryptographic community said, ‘Everybody settle down.’”

The work in question did not break military-grade AES. “It was talking about this very esoteric structure that the community knows about,” Best said. “They ran some simulations on the D-Wave quantum annealing computer and had some interesting results, but it’s nothing that we didn’t know about. And it doesn’t overachieve what we already can do with classical computers.”

For the moment, at least, quantum-resistant cryptography appears to be secure. But the tech world isn’t just sitting around waiting for the next attack. According to experts, quantum resistance has been the most urgently worked-on aspect of cryptography in recent years, even though quantum computers remain enormously expensive and relatively rare, and are likely to stay that way for years, if not decades, to come. While large-scale quantum computing has been on the receiving end of tens of billions of dollars in research, its widespread deployment does not appear imminent.

A McKinsey report estimates that quantum computers won’t be able to function at scale until at least 2040. But as the saying goes, it’s wise to hope for the best and prepare for the worst.

“In terms of generalized cryptography, post-quantum cryptography is the biggest thing that has happened,” said Mike Borza, a Synopsys scientist. “It’s the idea of being prepared now for something that may happen in the future, and it’s important. If you have a secret that you want to protect now, and you want to protect it for the next 50 years, then you should be using something that’s quantum computer-resistant, because sometime in the next 50 years it’s likely that a quantum computer at sufficient scale to crack conventional cryptography will be available.”

Different solutions are being deployed, many of them resting on new algorithms that have been deemed quantum-secure by NIST. Some experts recommend focusing on larger key sizes. However, both approaches come with complications. The new algorithms require increasing amounts of compute, while larger keys carry some mathematical vulnerabilities of their own. Meanwhile, the push for more security at the edge and in IoT devices requires a shift left in implementing modern cryptography.

Key size a partial solution
When it comes to symmetric cryptography, Borza said making the key size bigger has been the most common approach for a post-quantum world. “If you want something that’s 128 bits secure in a post-quantum-computing world, that means you need a key of 256 bits in size,” he said. “That’s pretty well understood for the conventional symmetric cryptography that we use today, and people are making that change fairly easily because they’re already accustomed to having keys of that size to protect top secret things.”
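
The arithmetic behind that rule of thumb is Grover’s algorithm, which gives a quantum attacker roughly a square-root speedup on key search, cutting an n-bit symmetric key to about n/2 bits of effective security. A minimal sketch of that mapping (an approximation, not a precise cost model):

```python
# Rough rule of thumb: Grover's algorithm gives ~sqrt speedup against
# symmetric key search, so an n-bit key offers roughly n/2 bits of
# security against a large-scale quantum attacker.
def post_quantum_symmetric_strength(key_bits: int) -> int:
    return key_bits // 2

for key_bits in (128, 192, 256):
    print(f"AES-{key_bits}: ~{key_bits} classical bits, "
          f"~{post_quantum_symmetric_strength(key_bits)} post-quantum bits")
# To keep 128-bit security in a post-quantum world, use a 256-bit key.
```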

But whether that’s the best way forward is open to debate. “All AES encryption, all our banking, is secure because of 128-bit keys,” said Prakash Madhvapathy, product marketing director at Cadence. “It’s not factorable and it’s secure. The design of these algorithms is not that easy to do and to make secure. It takes an army of cryptographers and analysts to design them. There are things called related-key attacks, where you have four related keys that differ from each other by only a few bits. AES-256 has been attacked to the point where they say there is some weakness in the design of the substitution boxes, some residual linearity. If you give an analyst four keys that are 256 bits long but very close to each other in their bits, they can crack the whole thing. The compute requirement to hack it is then not on the order of 2^256, but on the order of 2^96, which is worse than the 128-bit case. The question is, why should they use the 256-bit version when the 128-bit version, in the worst case, is more secure?”
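
The exponents are what matter in that comparison. A brute-force search of a 256-bit key costs on the order of 2^256 operations, while the related-key scenario Madhvapathy describes drops that to roughly 2^96, below the 2^128 needed to brute-force AES-128. A small sketch using the figure quoted above (published attacks vary in their exact exponents):

```python
# Comparing the work factors quoted above (the numbers are far too large
# to enumerate -- the point is the relative ordering).
brute_force_aes256 = 2**256   # exhaustive key search, 256-bit key
related_key_aes256 = 2**96    # related-key scenario cited above
brute_force_aes128 = 2**128   # exhaustive key search, 128-bit key

assert related_key_aes256 < brute_force_aes128 < brute_force_aes256
print(f"related-key AES-256 attack is 2^{256 - 96} times cheaper than brute force")
print(f"and 2^{128 - 96} times cheaper than brute-forcing AES-128")
```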

Growing the key size requires additional compute, as well. Rambus’ Best observed that the sheer scale of millions upon millions of secure data connections has forced careful fine-tuning of the cryptographic strength built into each protocol.

“A lot of the work in getting these protocols right was, ‘Exactly what does the key size have to be, and what is the cryptographic strength?’” Best said. “Sometimes cryptographic strength scales the way it does with RSA. RSA with a 2K-bit key is less secure than with a 4K-bit key, which is less secure than with an 8K-bit key. AES-256 uses a 32-byte key, which is more secure than AES-128. Sometimes security has that tradeoff of more and more data plus more and more processing power. A lot of the standardization effort over the last couple of years has been fine-tuning the specifications to ensure an effective tradeoff between key size, computation efficiency, and security.”
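
The scale of that tradeoff is easiest to see in the commonly cited security-strength equivalences from NIST SP 800-57, which show how quickly RSA moduli have to grow to keep pace with symmetric keys. A short reference sketch (strength levels only, not a recommendation for any particular design):

```python
# Commonly cited security-strength equivalences (bits of classical security)
# from NIST SP 800-57 -- note how fast the RSA modulus sizes grow.
comparable_strength = {
    112: {"RSA": 2048,  "AES": None},
    128: {"RSA": 3072,  "AES": 128},
    192: {"RSA": 7680,  "AES": 192},
    256: {"RSA": 15360, "AES": 256},
}

for bits, algs in comparable_strength.items():
    aes_txt = f"AES-{algs['AES']}" if algs["AES"] else "-"
    print(f"{bits:>3}-bit strength: RSA-{algs['RSA']:<5}  {aes_txt}")
```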

Finding those optimal tradeoffs has come down to tweaking relatively minor things, such as byte ordering. As Best explained, “When you deliver a piece of key material, when you deliver the construct of a key and determine exactly what form the public key is encoded in when it’s delivered, and when there are different numbers of cycles or a different strength of algorithm, what type of initializations need to occur inside the memory that is holding some of these pieces, such that the actual computation will remain secure?”
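
Byte ordering sounds trivial, but two otherwise-correct implementations that disagree on it will simply fail to interoperate. The sketch below illustrates the kind of encoding detail a specification has to pin down; the key-container layout is hypothetical, not taken from any standard:

```python
import struct

# Hypothetical key container: a 2-byte algorithm ID, a 2-byte key length,
# then the raw key bytes. Whether the header fields are big- or
# little-endian must be fixed by the spec, or peers will disagree.
def encode_key(alg_id: int, key: bytes, big_endian: bool = True) -> bytes:
    fmt = ">HH" if big_endian else "<HH"
    return struct.pack(fmt, alg_id, len(key)) + key

key = bytes(range(32))                        # stand-in for a 256-bit key
be = encode_key(0x0203, key, big_endian=True)
le = encode_key(0x0203, key, big_endian=False)
print(be[:4].hex(), "vs", le[:4].hex())       # same data, different wire bytes
```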

Borza said the trend of ballooning key sizes will continue at the levels above commercial-grade cryptography, with the same principle leading to the development of stateless hash-based solutions, which are the basis of NIST’s FIPS 205, released in August 2024. For him, the real concern is asymmetric public-key cryptography. “All of those algorithms do fall to a quantum computer of sufficiently large scale. If you have a secret that you want to protect for 50 years, you already need to be using something other than asymmetric cryptography, or using newer algorithms that stand up to the existence of a quantum computer, which allows you to solve the problem in parallel and effectively reduce the key strength of a large asymmetric key. That’s the whole focus of these NIST algorithms. You’ve got FIPS standards 203, 204, and 205, and there’s also 206. 205 and 206 are the symmetric-based, resilient algorithms, the stateless hash-based schemes, and the limitation of those is that you can only use the keys once for different ciphertexts. If you re-use the key across different ciphertexts, you produce what’s essentially a mathematical exposure that allows the solution to be obtained, and multiple texts that are encrypted with the same thing are no longer secure.”
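
The one-time restriction is easiest to see with the simplest hash-based construction, a Lamport one-time signature, where each signature reveals half of the secret key as selected by the bits of the message hash. Signing a second, different message reveals additional secret values, which is the mathematical exposure Borza describes. A minimal educational sketch (the NIST schemes layer many one-time keys together precisely to manage this limitation):

```python
import os, hashlib

# Minimal Lamport one-time signature sketch (educational, not for real use).
def keygen():
    # Secret key: 256 pairs of random values; public key: their hashes.
    sk = [[os.urandom(32), os.urandom(32)] for _ in range(256)]
    pk = [[hashlib.sha256(x).digest() for x in pair] for pair in sk]
    return sk, pk

def sign(sk, msg: bytes):
    # Reveal one secret value per bit of the message hash,
    # i.e. half of the secret key per signature.
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big")
    return [(i, (h >> (255 - i)) & 1, sk[i][(h >> (255 - i)) & 1])
            for i in range(256)]

sk, pk = keygen()
sig1 = sign(sk, b"message one")
sig2 = sign(sk, b"message two")      # key reuse: a second, different message

revealed = {(i, b) for i, b, _ in sig1} | {(i, b) for i, b, _ in sig2}
print(f"secret values revealed: {len(revealed)} of 512")
# One signature reveals exactly 256 of the 512 values; the second leaks more,
# which is what gives an attacker room to forge signatures on new messages.
```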

More compute resources needed
The introduction of these new algorithms creates its own set of issues. Lee Harrison, director of automotive IC solutions for Tessent at Siemens EDA, noted that the NIST algorithms are a lot more processor-intensive than their conventional counterparts. “They’re a lot more involved than the lightweight algorithms that we had before, because they were kind of in the background and they just worked. These new algorithms take quite a presence.”
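
Part of that “presence” is simply data volume. The published parameter sizes in FIPS 203 and FIPS 204 are far larger than the elliptic-curve objects they would sit alongside, as the comparison below shows (the object sizes come from the standards; the elliptic-curve pairing chosen for comparison is an assumption about typical current practice):

```python
# Published object sizes (bytes) for two NIST PQC algorithms, compared with
# the elliptic-curve primitives they would typically replace or accompany.
sizes = {
    "X25519 key exchange":   {"public key": 32,   "ciphertext/eph. key": 32},
    "ML-KEM-768 (FIPS 203)": {"public key": 1184, "ciphertext/eph. key": 1088},
    "Ed25519 signature":     {"public key": 32,   "signature": 64},
    "ML-DSA-65 (FIPS 204)":  {"public key": 1952, "signature": 3309},
}
for name, fields in sizes.items():
    parts = ", ".join(f"{k}: {v} B" for k, v in fields.items())
    print(f"{name:<24} {parts}")
# Every handshake and certificate carries this extra data, which is part of
# why the new algorithms weigh more heavily on constrained devices.
```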

That processing overhead has particularly severe ramifications in areas where DFT is important, such as automotive. The issue is at the heart of a paper published by Harrison’s colleague at Siemens, Janusz Rajski, which details an efficient, scalable cryptographic hash function.

“We did the research in that area because with design for test, the logic and the technology that we add to customers’ designs can’t be too intrusive,” said Harrison. “They don’t want us to go and add a huge cryptography engine just for test access. Janusz did quite a bit of research in that area to really understand how he could minimize the footprint yet still provide a post-quantum type of algorithm.”

The new FIPS standards come as more processing is being done at the edge, which is forcing some tough choices due to the impact on PPA. Given the extra power requirements, Madhvapathy observed, how modern encryption is deployed needs to be considered early in the design process.

“The only reason to move such complex standards to the edge, in terms of implementation, would be if you have assets that you want to protect against such attacks,” Madhvapathy said. “If such attacks are going to be more expensive than the assets themselves, which you can purchase directly, then the reason for the attack goes away.”
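
That is essentially an economic threshold test: heavyweight protection at the edge only pays off when mounting the attack costs less than the asset it would expose. A toy sketch of that decision, with placeholder numbers rather than real estimates:

```python
# Toy cost/benefit check for hardening an edge device (placeholder numbers).
def attack_is_worthwhile(asset_value: float, attack_cost: float) -> bool:
    # A rational attacker only bothers if the payoff exceeds the cost.
    return asset_value > attack_cost

asset_value = 5_000      # hypothetical value of the data/IP on the device
attack_cost = 250_000    # hypothetical cost of mounting the attack
print("worth attacking?", attack_is_worthwhile(asset_value, attack_cost))
# If the answer is False, heavyweight PQC on that device may not repay its PPA cost.
```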

While beefed-up security does necessitate tradeoffs, Harrison urged designers not to think of it as an additional cost, but as part of the overall design cost. “The only time it really comes across as being an additional cost is if you add security at the end of the project,” he said. “If security is integrated into your design process, then it’s not costing you anything.”

Crypto at the edge
That shift to the edge also comes as more and more computing is localized for performance and security reasons. While quantum-resistant crypto until now has been mostly reserved for sectors such as automotive and data centers, where cybersecurity is considered paramount, there are indicators that an increasing number of IoT/edge-based devices could soon receive the same consideration.

“When you think about what IoT devices typically are, you may wonder why security is needed for them,” Harrison said. “But looking at what IoT devices are used for, a lot of them are safety-critical or used in many different types of applications. So having security within IoT is becoming quite common, and you’ll see this kind of requirement grow over time.”

In Europe, more stringent cybersecurity requirements are being mandated at the government level. This has ramifications for cryptography that go beyond the PQC realm and into key management, key rotation, management of certificates over time, and the ability to update keys. “No longer can you have a device with a key injected at fab and that is not managed afterwards,” said Sylvain Guilley, co-founder and CTO of Secure-IC.
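
What “managed afterwards” means in practice is lifecycle metadata attached to every key: when it was provisioned, how often it must rotate, and which certificate chain vouches for it. A hypothetical sketch of such a policy record (field names are illustrative, not drawn from any standard or from Secure-IC’s products):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical lifecycle record for a device key -- the kind of metadata a
# "key injected at fab and never touched again" model simply doesn't have.
@dataclass
class ManagedKey:
    key_id: str
    provisioned: datetime
    rotation_period: timedelta
    certificate_chain: list[str]          # issuing chain that vouches for the key

    def needs_rotation(self, now: datetime) -> bool:
        return now - self.provisioned >= self.rotation_period

key = ManagedKey(
    key_id="dev-0042-attestation",
    provisioned=datetime(2024, 1, 15),
    rotation_period=timedelta(days=365),
    certificate_chain=["device-cert", "product-ca", "root-ca"],
)
print("rotate now?", key.needs_rotation(datetime(2025, 6, 1)))
```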

Effects on design
The need to adjust compute for more intensive algorithms already has begun to have real-world effects on design.

While cryptography is a complex issue for SoCs, Harrison said it presents a new set of problems at the chiplet level, where engineers must figure out how to provide security for a heterogeneous system. “Does it make sense to overload that kind of algorithm processing on every single chiplet, or would it make sense to have a single common cryptography engine on the device somewhere that all of the chiplets could access? The challenge is there’s no universal standard around chiplet interconnect. So as much as it’s a good idea at the moment, if you bundle a load of chiplets together, the chances of them all talking the same language is very remote.”
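
The choice Harrison outlines is ultimately an interface question: one engine shared across the package, or one per chiplet. The sketch below shows what a shared service boundary might look like; the interface is hypothetical, since, as he notes, no standard chiplet security interconnect exists today:

```python
import hmac, hashlib
from abc import ABC, abstractmethod

# Hypothetical service boundary a shared cryptography engine could expose to
# every chiplet in a package (illustrative only -- no such standard exists,
# which is the interoperability problem described above).
class CryptoService(ABC):
    @abstractmethod
    def mac(self, chiplet_id: str, message: bytes) -> bytes: ...

class SharedCryptoEngine(CryptoService):
    """One engine on the package; per-chiplet keys sit behind one interface."""
    def __init__(self, chiplet_keys: dict[str, bytes]):
        self._keys = chiplet_keys                      # provisioned per chiplet

    def mac(self, chiplet_id: str, message: bytes) -> bytes:
        key = self._keys[chiplet_id]                   # arbitrated, central access
        return hmac.new(key, message, hashlib.sha256).digest()

engine = SharedCryptoEngine({"cpu": b"k1" * 16, "accel": b"k2" * 16})
print(engine.mac("cpu", b"test-access request").hex()[:16])
# The alternative is one engine per chiplet: simpler to integrate, but the
# algorithm's silicon area and power are then paid for N times.
```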

The complexity of cryptography also has led to workflow adjustments. Guilley said it has become imperative to include an architecture team that will consider certification and security requirements early in the process. Adding a security team with subject matter experts is also integral to turning requirements into functional features.

Conclusion
The looming post-quantum era has led to new algorithms that are very compute-intensive. In turn, this has led to new ways of thinking about the tradeoffs involved in incorporating security into designs. This is especially important when it comes to applications like automotive, where DFT is critical.

While some experts have seen larger keys as part of the solution, others point out that turning to bigger keys can lead to other vulnerabilities. On top of that, advanced cryptographic techniques present unique challenges for chiplets, where the lack of a universal interconnect has made a common cryptographic engine an impossibility at present.

Related Reading
Post-Quantum Computing Threatens Fundamental Transport Protocols
Transport Layer Security relies on encryption algorithms that quantum computing puts at risk.
Quantum Computing Challenged By Security, Error Correction
Cryptographers scramble for better security schemes, while physicists try to fix qubit errors.
Data Leakage In Heterogeneous Systems
What’s needed to secure data across multiple chiplets and interoperable systems.


