
Why Data Is So Difficult To Protect In AI Chips

AI systems are designed to move data through at high speed, not limit access. That creates a security risk.


Experts at the Table: Semiconductor Engineering sat down to discuss a wide range of hardware security issues and possible solutions with Norman Chang, chief technologist for the Semiconductor Business Unit at ANSYS; Helena Handschuh, fellow at Rambus; and Mike Borza, principal security technologist at Synopsys. What follows are excerpts of that conversation. The first part of this discussion can be found here.


(L-R) Norman Chang, Helena Handschuh, Mike Borza. Photo: Paul Cohen/ESD Alliance

SE: In AI systems, what happens if the training data isn’t as good as you thought it was? Can it look very different by the time it gets to the inference stage?

Borza: Those models are being updated on a regular basis, and you have to ensure that’s being distributed securely. To take best advantage of this, you want to connect AIs that are looking at different behaviors. Then you can start to track the movement of malware through different sites or between different locations. But that goes back into the new training data, which is going to update the model. And so you get into this feedback loop. It’s a very slow feedback loop, and it’s one in which the AI depends on you being able to securely distribute and collect information. And then you want to use that same model that’s being generated, whose integrity you have to ensure, to make sure the system and networks are now behaving as you intended. So you’ve got this interplay of security with the AI, which is buying you more security.
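The secure-distribution step Borza describes can be sketched with a minimal integrity check on a model update. This is an illustrative shared-secret HMAC scheme, not any specific product's mechanism; the secret and names are hypothetical, and a real deployment would use public-key signatures and a PKI rather than a shared secret:

```python
import hashlib
import hmac

# Hypothetical shared secret between the training site and the edge device.
# Real systems would use asymmetric signatures instead.
SHARED_SECRET = b"demo-secret"

def tag_for(model_bytes: bytes) -> bytes:
    """Authentication tag published alongside a model update."""
    return hmac.new(SHARED_SECRET, model_bytes, hashlib.sha256).digest()

# Training site: publish the update together with its tag.
update = b"\x00weights-v2\x00"
published_tag = tag_for(update)

# Edge device: recompute the tag over the downloaded bytes and accept the
# update only if it matches, using a constant-time comparison.
ok = hmac.compare_digest(tag_for(update), published_tag)
```

A tampered download fails the same check, which is what closes the feedback loop Borza mentions: only updates whose integrity verifies ever reach the model.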

Chang: During the incremental training of image data, if you’re driving on the road and you perturb the image a little bit, suddenly you cannot recognize anything and a red light will become a green light. That could be very dangerous. How are you going to stop that? It’s very difficult.

Handschuh: One paper that came out recently goes even further, showing that any system can be perturbed with a fixed, or logarithmic, number of changes. There is no way to stop somebody from corrupting things enough that the system will show something completely different. We really have to think about that.
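As a toy illustration of the effect Chang and Handschuh describe (not the construction from the paper Handschuh cites), consider a linear scorer: nudging each input feature by a small eps against the sign of its weight moves the score maximally, and can flip the decision even though each individual change is small. All weights and inputs below are made up for illustration:

```python
# Hypothetical trained weights and an input the model scores as positive
# (say, "red light"). A sign-of-gradient perturbation flips the decision.
w = [0.9, -0.4, 0.7, 0.2]
x = [0.2, 0.3, 0.1, 0.4]

def f(v):
    # Linear score: f(v) = w . v
    return sum(wi * vi for wi, vi in zip(w, v))

eps = 0.15
# Shift each feature by eps in the direction that lowers the score most.
x_adv = [xi - eps * (1 if wi > 0 else -1) for xi, wi in zip(x, w)]

original, perturbed = f(x), f(x_adv)   # score crosses zero: decision flips
```

Deep networks are nonlinear, but the same gradient-following idea underlies the image-perturbation attacks Chang warns about.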

SE: As we start building systems for data throughput, the goal is to get everything working together on a chip at blazing speeds. Does that make it harder to isolate where the data is being processed, and does that create security issues?

Borza: It changes where the security boundaries are and where they need to be. It’s very difficult to verify the behaviors of complex things, which is why security people tend to favor things that are small and isolated and contained. That’s totally at odds with what the AI people want to do, which is to essentially distribute computing and memory over the chip so that it mimics the structure of a brain or some behaviors that we see in the brain. But it creates this problem where now you have a lot of data that’s distributed around, and the integrity of all of it is important to the correct behavior of the system. The sensitivity of some of the network training algorithms is incredibly high. Very small perturbations or errors in data can generate hugely wrong results. It’s not a linear relationship.

Handschuh: The big question in AI, machine learning and deep learning is how you secure that data and know it can’t be corrupted or stolen. How do you secure the models, and how do you keep the weights from being published? That’s somebody’s IP, and it’s a valuable asset. So how do you secure the model itself, and how do you evolve it in a secure way? And then, once a device is trained and the network is set up, how do you protect the computation when an input comes in and the result is revealed? These are all the different points your system has to take care of. It’s a lot of work.

Chang: It’s also a system issue. You may have GPUs, as well as an FPGA as the core processor for the AI system. So security really goes beyond single-chip packages. It has to be considered at the whole-system level.

SE: But the whole architecture is in motion here. As you download new updates, and as these algorithms begin changing because we want the systems to adapt, does the electrical and power profile of these chips begin to change? And do you have to change what you’re looking out for in terms of aberrations in the signal patterns?

Chang: That’s definitely a side-channel attack issue. After you update the OS and other parts of the system, that will change the power-noise profile and the power calculation. So you probably need to re-run the simulation or the measurement on the updated system to make sure that the new updates, and the design of any new component in the system, still work with the security sensor.

Handschuh: The power profile will change. It will look different. But you need to come up with some way of securing things that doesn’t depend too much on what your software does differently. The basic building blocks need to be written and implemented in a way that they don’t leak by themselves. Then software that you download will hopefully run on a processor or a building-block element that is not leaking anything. In that case, the software should be able to run securely and not give away too much information. But that’s the next step: securing processors such that it doesn’t matter what software you run.

SE: Is the solution to build in security as part of the regular design? One of the big issues for security is that people don’t want to pay for it. There’s also overhead, particularly when you have active security, and it has to be updated regularly.

Borza: There are attempts to do that, such as Arm’s PSA (Platform Security Architecture), which the company says makes it easy to develop and integrate. They’re making a big effort and putting a lot of energy into doing that. So that’s an approach that may work. We still come from the school of thought that people are going to build security into their products because they have a need to, and they are willing to pay for a solution that makes sense. But it does add cost at several levels. There’s the initial cost of acquiring the IP. There’s also the cost of developing and maintaining software for it. And then you have the fact that it costs chip area, which is overhead, and that drives up the cost of the chip, as well. But we’re really getting to the point where security can’t be ignored, which is what most people have tried to do for years.

Handschuh: Adding security doesn’t happen by chance. In some cases it requires legislation or standardization, because there’s liability involved if things go wrong, so you have to start including a specific type of solution that will address a specific problem. Liability is what’s going to drive it. Nobody will do it just because they are so paranoid that they think that it must be done. It will be somebody telling them, ‘If you don’t do it, here’s the risk for you and for your business.’ Then you will start seeing not specific solutions, but areas where you need to have some sort of trust in the chip, whether that’s PSA or Common Criteria or something else, and you will have to prove you’ve done something specifically to check the box.

Chang: There are so many devices connected to the network that are not secure. So where will all of this start?

Borza: You’ll start seeing this in Europe. NIST (National Institute of Standards and Technology) has good ideas, as well. And then there are consumer protection laws, which were used to regulate the behavior of the router providers. It is no longer acceptable to ship every router with admin/admin as the user/password combination because people will just connect them to the Internet.

Handschuh: Europe’s focus has been more on the privacy side of things. There is also the Cybersecurity Act in the United States. So there is a push from the regulators to say, ‘We need to do something.’ They don’t ever tell you exactly what needs to be done, but there is some traction.

SE: Are standards the right way to solve this, or is this process too slow? Hackers move quickly and share information. Does this need to be built into the architecture?

Borza: That’s fine for people who have the wherewithal to figure that out, or who are willing to pay an expert to come in and help them design a system or a chip.

Chang: Do you just drop an AES block into a chip?

Borza: Far from it. Encryption is a tool. It’s a method of providing certain guarantees, mostly around confidentiality, but it also can be used to provide integrity. AES is not by itself security. Encryption can be used to build secure systems.

SE: The point at which data is most vulnerable is the point at which encryption and decryption occur. Is it possible to secure it at that point?

Handschuh: It’s necessary. When the data is at its most vulnerable, it has to be protected within some secure boundary and some flavor of trusted execution environment. At the precise moment when it is encrypted or decrypted, it has to be in a place where nobody has access to it. There’s a way of designing things where you can make sure people have certain access rights and handles to do things with the data and the asset, but can never actually access it directly or export it. You can handle it like a chemist, who never touches poisonous materials directly but moves them around with manipulators. So you can give certain user rights to do things with data and assets, but no one can directly touch them.
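Handschuh's chemist analogy maps onto the opaque key-handle pattern: callers get operations on the key, never the key bytes themselves. A minimal sketch of the idea (illustrative only; Python name mangling is not a real security boundary, so in practice the key would live inside a hardware key store or TEE that enforces this at the hardware level):

```python
import hashlib
import hmac
import secrets

class KeyHandle:
    """Opaque handle: the key stays inside; only operations on it are exposed."""

    def __init__(self):
        # The key material never leaves this object; there is no getter and
        # no export method, mirroring a hardware key slot.
        self.__key = secrets.token_bytes(32)

    def mac(self, message: bytes) -> bytes:
        # Use the key on the caller's behalf, like the chemist's manipulator.
        return hmac.new(self.__key, message, hashlib.sha256).digest()

    def verify(self, message: bytes, tag: bytes) -> bool:
        return hmac.compare_digest(self.mac(message), tag)

handle = KeyHandle()
tag = handle.mac(b"firmware image")
verified = handle.verify(b"firmware image", tag)
```

The API grants the right to compute and check tags, but offers no path to read or export the key, which is exactly the access-without-possession property she describes.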

Chang: During encryption there are various phenomena that occur, and the most severe one from a security standpoint is power supply noise. You can monitor the last round of encryption, and if you do this for 1,000 or 2,000 cycles, you can extract the keys using a simulation methodology. If you have AES encryption embedded in the chip, along with other functions, it will be slightly more difficult, but still possible with 3,000 or 4,000 cycles. But if you move to a chip-package system and you are trying to monitor electromagnetic emissions, there is other noise coming from the system.

SE: So the signal-to-noise ratio is lower?

Chang: Yes, it’s smaller.

Borza: And signal processing is all about resolving signals from noise.

Chang: Yes, it’s a statistical technique.
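The statistical technique Chang and Borza are alluding to is correlation power analysis: guess a key byte, predict the power consumption each guess implies, and correlate the prediction against measured traces. A self-contained sketch with simulated traces follows; the random substitution table stands in for the AES S-box, and the Hamming-weight leakage model and noise level are illustrative assumptions:

```python
import random
import statistics

random.seed(0)

# Stand-in for the AES S-box: a random byte permutation (illustrative only).
sbox = list(range(256))
random.shuffle(sbox)

def hw(x: int) -> int:
    """Hamming weight: a common model of a register's power consumption."""
    return bin(x).count("1")

# Simulate 2,000 "cycles": power samples leak the Hamming weight of the
# S-box output for a secret key byte, plus Gaussian measurement noise.
true_key = 0x3C
pts = [random.randrange(256) for _ in range(2000)]
traces = [hw(sbox[p ^ true_key]) + random.gauss(0, 1.0) for p in pts]

def corr(xs, ys) -> float:
    """Pearson correlation coefficient."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    dx = sum((a - mx) ** 2 for a in xs) ** 0.5
    dy = sum((b - my) ** 2 for b in ys) ** 0.5
    return num / (dx * dy)

# For each key guess, correlate its predicted leakage with the traces.
# The correct guess is the one whose model lines up with the measurements.
scores = [corr([hw(sbox[p ^ g]) for p in pts], traces) for g in range(256)]
recovered = max(range(256), key=lambda g: scores[g])
```

This is why the signal-to-noise point matters: the attack is a correlation over many traces, so extra system noise just raises the number of cycles needed rather than preventing recovery outright.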
