Who’s Responsible For Security?

Experts at the Table, part 3: How to manage the cost of security; the value of passwords; insignificant versus real threats.


Semiconductor Engineering sat down to discuss security issues and how to fix them with Mark Schaeffer, senior product marketing manager for secure solutions at Renesas Electronics; Haydn Povey, CTO of Secure Thingz; Marc Canel, vice president of security systems and technologies at Arm; Richard Hayton, CTO of Trustonic; and Anders Holmberg, director of corporate development at IAR Systems. What follows are excerpts of that conversation.


L-R: Anders Holmberg, Marc Canel, Mark Schaeffer, Haydn Povey, Richard Hayton. Photo credit: Brian Bailey

SE: If something is manufactured in one place, shipped somewhere else for assembly and then shipped to a distributor, how do you know what you’re getting is the real thing?

Hayton: We started this with mobile phones, and because we enable payments on mobile phones as a trusted execution environment, people need to know that it’s a trusted device. We do ‘trust injection’ on a lot of different devices. But that’s pretty simple. We trust whoever is making the phone, and they do it all themselves. IoT isn’t like that. You outsource to module makers. In some markets you get a hologram. We’re doing something similar with digital holograms, whereby we can track different stages during manufacturing. ‘Here’s a sticker to put on this to say this device is at this stage, so later I can say it’s gone through stages A, B, C and D in the right order.’ But it doesn’t tell you if those people have actually done the right thing. You have to have external mechanisms to make sure that ‘Factory A’ did a really good job. That’s going to be industry by industry. For some people, if you’re doing a medical device or an aircraft part, you’re going to have strict controls. But even with mobile phone manufacturers, the difference between a large company and a small-volume manufacturer in China, which is making 10,000 devices, is miles apart in terms of process and their ability to execute a secure process. In IoT, it’s going to be a hundred miles apart.
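The ‘digital hologram’ idea Hayton describes — proving stages A, B, C and D happened in the right order — can be pictured as a hash chain: each manufacturing stage folds its label into a running digest seeded with the device’s ID, so any reordering or omission changes the final value. This is a minimal illustrative sketch, not Trustonic’s actual scheme; the function names and device ID are hypothetical.

```python
import hashlib

def append_stage(prev_digest: bytes, stage: str) -> bytes:
    """Record one manufacturing stage by hashing it into the running chain."""
    return hashlib.sha256(prev_digest + stage.encode()).digest()

def chain_for(stages: list, device_id: bytes) -> bytes:
    """Compute the chain digest for a sequence of stages on one device."""
    digest = device_id  # the chain is seeded with the device's unique ID
    for stage in stages:
        digest = append_stage(digest, stage)
    return digest

device_id = b"device-0001"  # hypothetical identifier
recorded = chain_for(["A", "B", "C", "D"], device_id)

# A verifier who knows the expected route recomputes the chain and compares.
assert recorded == chain_for(["A", "B", "C", "D"], device_id)
# Running the same stages out of order yields a different digest.
assert recorded != chain_for(["A", "C", "B", "D"], device_id)
```

As Hayton notes, this only proves the order of attestations, not that each factory actually did its job correctly; that still requires external auditing.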

Schaeffer: In silicon we’re going to start to see identities on possibly every part. But you have to create a scalable model. If you have a very-low-power piece of silicon, a PUF (physically unclonable function) might be really good for that if you don’t already have a crypto engine on it. But in higher-end chips, you’ll increasingly see crypto engines. This is almost like floating point engines 20 years ago. They used to be separate. Now they’re embedded into chips. And you’ll see crypto engines in there, mainly for price/performance reasons, especially for asymmetric encryption. If you’ve already got a crypto engine, you already have key storage. That allows you to add an identity, but to do that you need certificates. So how do you deploy certificates? There are a variety of ways to do that. You can use certificates and key databases. But now people need to understand this whole infrastructure, and it’s starting to get very complicated.
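The certificate deployment Schaeffer alludes to boils down to an issuer signing a binding between a device identity and its key, which anyone holding the issuer’s verification key can later check. This is a self-contained sketch only: real deployments use X.509 certificates and asymmetric signatures from an HSM-held private key, whereas HMAC stands in here to keep the example stdlib-only; all names and keys are hypothetical.

```python
import hashlib
import hmac
import json

CA_KEY = b"oem-ca-secret"  # hypothetical; in practice a private key in an HSM

def issue_certificate(device_id: str, public_key_hex: str) -> dict:
    """Sign a binding between a device ID and its public key at provisioning."""
    body = {"device_id": device_id, "public_key": public_key_hex}
    payload = json.dumps(body, sort_keys=True).encode()
    sig = hmac.new(CA_KEY, payload, hashlib.sha256).hexdigest()
    return {**body, "signature": sig}

def verify_certificate(cert: dict) -> bool:
    """Check the issuer's signature over the certificate body."""
    body = {k: v for k, v in cert.items() if k != "signature"}
    payload = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(CA_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cert["signature"])

cert = issue_certificate("chip-42", "ab12cd34")
assert verify_certificate(cert)

cert["device_id"] = "chip-43"  # any tampering invalidates the signature
assert not verify_certificate(cert)
```

The complexity Schaeffer mentions comes from everything around this core: who holds the issuing key, how revocation works, and how certificates reach devices through the supply chain.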

Povey: We’re working with companies like Data-IO to enable the OEM to deploy certificates and identity material at the point of manufacturing. You have to get the identity in there early with a digital hologram or certificate. You have to ensure that is done at the birth of the device, whether that is in the fab or soon afterward in the distribution channel. But how do you manage that in an untrustworthy environment, given that you can’t necessarily trust contract manufacturers or even programming houses? Many of them are very good, but you don’t know the individuals on the production line. So you have to have a cryptographically enforceable mechanism to create the certificates at the OEM to define how many devices you want to have created, where you have them created, and how you want to manage them. But you also need to enforce that in the distribution channel and the supply chain. There is a lot of technology we can leverage with end-to-end cryptography and hardware security modules with very high-grade, tamper-resistant security to enable the OEM to really have an impact on the supply chain. The interesting thing is that contract manufacturers are welcoming this capability. It’s not that they feel they’re not trustworthy. But they don’t have to take on the liability or the same level of auditing. They don’t want to have people’s codes or credentials. It has to flow through the system. It has to be measurable and enforceable.
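Povey’s ‘cryptographically enforceable mechanism’ for limiting how many devices a contract manufacturer may create can be sketched as a signed production authorization: the OEM signs a quota, and the provisioning station refuses to personalize devices beyond it or to accept a tampered authorization. This is an illustrative sketch under stated assumptions, not Secure Thingz’s or Data-IO’s actual protocol; the key, names, and quota mechanics are hypothetical, and HMAC stands in for the asymmetric signatures an HSM would use.

```python
import hashlib
import hmac
import json

OEM_KEY = b"oem-signing-secret"  # hypothetical; would live in an OEM-controlled HSM

def sign_authorization(manufacturer: str, quota: int) -> dict:
    """OEM signs how many devices a named manufacturer may produce."""
    body = {"manufacturer": manufacturer, "quota": quota}
    payload = json.dumps(body, sort_keys=True).encode()
    sig = hmac.new(OEM_KEY, payload, hashlib.sha256).hexdigest()
    return {**body, "signature": sig}

class ProvisioningStation:
    """Personalizes devices only while the signed quota allows it."""

    def __init__(self, auth: dict):
        body = {k: v for k, v in auth.items() if k != "signature"}
        payload = json.dumps(body, sort_keys=True).encode()
        expected = hmac.new(OEM_KEY, payload, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, auth["signature"]):
            raise ValueError("authorization not signed by the OEM")
        self.remaining = auth["quota"]

    def provision(self) -> bool:
        if self.remaining <= 0:
            return False  # quota exhausted: no more devices may be created
        self.remaining -= 1
        return True

station = ProvisioningStation(sign_authorization("Factory-A", 2))
assert station.provision() and station.provision()
assert not station.provision()  # the third device is refused
```

In a real system the decrementing counter itself sits inside tamper-resistant hardware at the programming house, which is why Povey stresses HSMs: software alone cannot stop a malicious operator from resetting the count.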
Schaeffer: In order for this thing to work, do we suddenly need tamper resistance in every piece of silicon? I don’t think that’s feasible. The alternative is PUFs, but this will evolve over time. We need to drive down the cost.

Canel: You’ll see a fragmented ecosystem with security. You’re going to have solutions with no root of trust—sensors that are 10 cents or 15 cents embedded into a building when that building is being put together. You’ll see devices with parts that sell for 50 cents. And then you’ll see parts with tamper resistance and security, and those will sell for $10. You’ll see a spectrum of solutions. We will have to deal with a set of inconsistent environments. There will be vertical markets that are fully regulated and companies that are concerned about their reputations, and we’ll see some level of normalization of practices.

SE: But there are a lot of touch points along the way, from the IP developers to the manufacturers to the distributors. Doesn’t each one require the same level of scrutiny? And is that realistic?

Schaeffer: It doesn’t need the same level of scrutiny. There can be some secure mechanisms put into place, but you have to choose the right tool for the right job. If I can get one key and use that to penetrate one device, that’s fine. That’s not a scalable attack. If I can use one device to get a key to penetrate 1,000 devices, that’s not all right. That will need either more protection or a secure mechanism. But we can have a set of devices where you have the same level of security. Then it will be up to the OEM to determine which technique is appropriate for a particular application.

Povey: Instead of who takes responsibility, we have to change the math. We have to take security and turn it from a cost into an intrinsic value—an underpinning enabler for the system. Then, everybody will make the right decision. Instead of saying, ‘I’ve got to put a root of trust in this and it will cost me 30 cents,’ or ‘I have to write better software and it will cost me $1 million,’ you are building a strategic relationship with your customer who you can sell to again and again. You deliver updates, management services and high value. That becomes part of the purchasing requirement. You can do better analytics because you trust the data and derive value out of that. And if you can defend your brand, that’s valuable. The only way to make this work is to change the business models.

Canel: Along those lines, insurance companies are going to play a very important role in a lot of industries.

SE: How much of this is the consumer’s responsibility? You have to change passwords regularly, which most people don’t.

Hayton: Passwords are terrible, but that’s a narrow view of security. You’re putting your password into a device and connecting to some multinational corporation that is awake every minute of every hour. Shouldn’t you be more concerned about that than whether your password is good or not? Typing in the password has been a long time in dying, but it is beginning to die. I’ve got a fingerprint sensor in my phone and another one on my PC. There are better ways to log in. You can’t trust users to come up with good passwords and then not forget them. The scariest credential we have is our e-mail address. If you don’t have a good password for your e-mail, go change it now. With any system you use, when you forget your password they will send you an e-mail. So if someone hacks your e-mail, they can hack everything else as well. That’s a security issue that happens because it’s convenient. Your e-mail account is now the center of all your banking, medical records and insurance and everything else. That’s a classic thing that happens again and again in security—one system is used outside its original purpose. Even if it was sufficiently secure to begin with, it’s no longer sufficiently secure.

Schaeffer: The question is who bears the responsibility if you do something stupid like that. If someone steals money from my bank account through my Gmail account, who’s liable?

Povey: Most consumers don’t care and shouldn’t have to care. So many of the devices on the IoT won’t have user interfaces. We can’t even enter a password. For a smart home, you do have to enable services to help manage services and to maintain them. A lot of consumers don’t update their PCs, and they’re easy to update. They certainly won’t update their heating systems, their cars, their toasters. You have to outsource that to a trusted third party. In the industrial IoT, that has to go back to the IT team to manage. There are better ways for IT teams to manage large, complex systems. Patching coming from the OEM has to be backed up and deployed at a certain time when the system is in the quietest mode. We can move away from passwords there to certificate-based technology. We have to go back to better security technologies, which can be managed through good systems.

SE: This has ramifications that go well beyond a device. A toaster may seem insignificant, but it can burn down a house, and lots of toasters can burn down an entire community.

Povey: People are putting big lithium batteries in their house to take them off the grid. That’s great from an energy distribution standpoint, but it’s also a massive attack target. If you can overcharge batteries in people’s houses, you’ll have fires and people will die. There is legislation around that and liability. It’s a major headache. But people don’t know where to start securing that.

Canel: Consumer companies are taking far more ownership of the security for devices. They understand the consumer has protections, and security is something they have to build into the product. They will have to manage the lifecycle of the products. We will see a lot of the consumer brand names taking more responsibility. But it also may mean that they lock us into their branded walled gardens.

Schaeffer: It’s amazing how many people don’t understand the relationship between security and ease of use. You need to do something when you connect your washing machine to your toaster to your cloud. It’s up to the system vendors to make it really easy with an app and an infrastructure without getting caught in a walled garden.

Holmberg: A lot of what has been discussed here involves infrastructure. When you think about software, that has to be developed in a different way. That starts with the initial development where you consider security requirements. We can learn a bit from the functional safety community. They’ve been thinking about safety for many years, and they have processes for that. We can do the same for security in the software part of the world. What happens if my customer’s identity is leaked? What if this heater goes on or off at the wrong time? What happens if this battery is overcharging? It’s both a safety and a security issue, and it involves software running on top of this security infrastructure.

Schaeffer: The FDA is starting to move in that direction. They know safety. Security is part of that.

Related Stories
Who’s Responsible For Security?
Experts at the Table, part 2: Cheap components contaminating the supply chain, the need for platforms and certifications, and the futility of trying to future-proof devices.
Who’s Responsible For Security?
Experts at the Table, part 1: Where security is working, where it isn’t, and what to do about it.
Imperfect Silicon, Near-Perfect Security
Physically unclonable functions (PUF) seem tailor-made for IoT security.


