The price of securing a chip is going down. Here’s why.
For years, chipmakers marginalized security in their designs, relying instead on software solutions. That approach eventually caught up with them, triggering a near-panicked scramble to plug weaknesses involving speculative execution and branch prediction, as well as the ability to read data off chips with commercially available tools such as optical probes.
There were several reasons for this inaction. First, most attacks do come through software or the network, or from some low-level employee. But hackers can also take control of the hardware by way of the software, with examples ranging from Stuxnet to the Mirai botnet DDoS attack. And while hardware should be able to reboot and isolate the problem, that wasn't the case in the past because most security was layered on or perimeter-based, rather than designed in from the start.
Second, security is expensive, and consumers typically will opt for the cheaper solution rather than pay extra for a more secure version. That cost can be measured in dollars for security modules, but there are also overhead costs in terms of performance and power. Active security is more effective than passive security, and it takes power and engineering time to architect it into a system. The fact that smartphones now ship with multiple levels of security, and that automakers are focused heavily on security, shows it is being taken much more seriously today than in the past.
And third, device scaling all the way down to 10nm (or so) was the first choice of many chipmakers because it offered a proven route to yield and working silicon. But as Moore's Law winds down, and as scaling becomes much more expensive and less attractive from a power/performance standpoint, companies are turning to more heterogeneous architectures. Along with that, they are developing all sorts of customized accelerators that do one thing very efficiently, and that includes security.
New architectures make it far easier to insert security at multiple levels in a single chip, or within a multi-chip package, and specialized algorithms and in-circuit monitoring now make it possible to identify unusual activity with very little added power or cost. In the past, active security solutions typically consumed one or two cores of a multi-core processor, which limited how fast those devices could run. Today, that security can be developed separately and reused across a wide range of devices.
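As a concrete illustration of the kind of low-overhead, in-circuit monitoring described above, here is a minimal firmware-side sketch in C. The register names, addresses, and threshold are hypothetical, not from any particular vendor; the point is simply that polling a couple of hardware fault counters costs very little power or code.

```c
/* Minimal sketch of a firmware-side anomaly monitor, assuming a hypothetical
 * memory-mapped glitch/voltage sensor block. All names, addresses, and the
 * threshold below are illustrative only. */
#include <stdbool.h>
#include <stdint.h>

#define SEC_MON_BASE        0x40010000u  /* hypothetical base address */
#define SEC_MON_GLITCH_CNT  (*(volatile uint32_t *)(SEC_MON_BASE + 0x00))
#define SEC_MON_VOLT_FAULT  (*(volatile uint32_t *)(SEC_MON_BASE + 0x04))

#define GLITCH_THRESHOLD    4u  /* tolerate a few benign events per window */

/* Called periodically (e.g., from a timer interrupt). Returns true if the
 * counters suggest an active fault-injection attempt. */
bool security_monitor_check(void)
{
    uint32_t glitches = SEC_MON_GLITCH_CNT;
    uint32_t faults   = SEC_MON_VOLT_FAULT;

    return (glitches > GLITCH_THRESHOLD) || (faults != 0u);
}
```

In a real design this check would more likely run on a dedicated security controller than on the application CPU, but the principle, and the small footprint, are the same.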
Over the past couple of process nodes, new architectures, best practices and a variety of tools, such as formal verification and software linting, have been developed for security-related applications. With some effort, it's now possible to build a highly secure device that is very expensive to attack successfully, difficult to observe from the outside (even at the most advanced nodes, where electromagnetic emissions can be read with the right equipment), and able to recognize when an attack is underway and block it.
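One small example of the defensive coding patterns that linting and verification tools are often used to enforce, chosen here as an illustration rather than something the article itself describes, is a constant-time comparison, which avoids leaking through timing how many bytes of a secret matched.

```c
#include <stddef.h>
#include <stdint.h>

/* Compare two buffers without an early exit, so execution time does not
 * reveal how many leading bytes matched. Returns 0 only if they are equal. */
int ct_compare(const uint8_t *a, const uint8_t *b, size_t len)
{
    uint8_t diff = 0u;
    for (size_t i = 0; i < len; i++) {
        diff |= (uint8_t)(a[i] ^ b[i]);
    }
    return (int)diff;  /* 0 == equal, nonzero == mismatch */
}
```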
In addition, new heterogeneous architectures allow critical code to be isolated in tamper-resistant segments, which shut down when they are attacked. From there, a system can be rebooted and cleaned, rather than trying to salvage whatever data is left.
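A rough sketch of that "shut down, reboot, and clean" response, in C, might look like the following. The key buffer, watchdog register, and helper names are all hypothetical; a real part would use its vendor's own zeroization and reset mechanisms.

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical watchdog register: writing the illustrative value below
 * forces an immediate cold reset. */
#define WDOG_FORCE_RESET (*(volatile uint32_t *)0x40020000u)

/* Placeholder for secrets held inside the isolated, tamper-resistant segment. */
static volatile uint8_t g_session_keys[64];

static void zeroize_secrets(void)
{
    /* The volatile qualifier keeps the compiler from optimizing the wipe away. */
    for (size_t i = 0; i < sizeof(g_session_keys); i++) {
        g_session_keys[i] = 0u;
    }
}

static void force_cold_reset(void)
{
    WDOG_FORCE_RESET = 0xDEADBEEFu;  /* illustrative value, not a real unlock key */
    for (;;) {
        /* Spin until the reset takes effect. */
    }
}

/* Called by the tamper-detection logic when an attack is flagged. */
void on_tamper_detected(void)
{
    zeroize_secrets();   /* clean: destroy anything worth stealing */
    force_cold_reset();  /* shut down and reboot from trusted code */
}
```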
Security largely has been out of sight for most engineers, but significant progress has been made on this front. There are still plenty of risks and holes, and attackers are persistent, increasingly well-funded, and well-trained, but the good news is that defense strategies are evolving, as well. It will never be perfect, but from a security standpoint it's at least a few steps in the right direction.