
Are Value And Security Needs Misaligned In The IoT?

The greatest vulnerability is at the least complex point of entry, and there will be billions of them.


Today’s keynote by Green Hills Software CTO David Kleidermacher here at Embedded World in Nuremberg continued the security thread from last year and was, interestingly enough, titled like a blog post I wrote about the Amphion Forum in late 2012: “Securing the Internet of Things.” Unfortunately, security has not become less scary. In fact, it’s the opposite.

David started his keynote by pointing to all the items he had on him reflecting the Internet of Things (IoT), from the Google Glass he used to manage the presentation, to his smartwatch and his smartphone. David called the IoT the “fourth wave” of computing, after the digital computer that started in the 1940s, the Internet in the 1970s, and cloud computing in the 2000s. He made the scary prediction that by the end of this decade we will have 30X more connected “things” than people on this planet.


The Centralization Myth—David Kleidermacher, Green Hills

David then asked the audience where they thought the biggest security issues would be: at the central location where the data resides, or at the “things” connecting to it (see photo). He introduced this issue as the “centralization myth.” David explained a bit more about the data breach at Target that exposed personal data of more than 110 million customers. Apparently, the breach took place in the point-of-sale terminals, not the central storage. Just two hours earlier I had joked with a group of 12 editors that security is important to me because I do not want my mom to see the data that my step-counting, sleep-recording Fitbit wristband would record for me on a lazy day. Good thing that she is not a hacker! ☺

David proceeded to explain the “Principles of High-Assurance Security Engineering” (PHASE):

  • Least Privilege: The practice of limiting access to the minimal level that allows normal functioning. As an example, David showed a screenshot from the National Vulnerability Database (NVD), in which Adobe Flash Player got a bad rating because a compromise of the player on Android would give the attacker a path to break into Android itself.
  • Componentization: Instead of having one monolithic body of code, create components on a microkernel so that security issues can be contained locally. He quoted UNIX’s Ken Thompson saying in 1992 that he “would generally agree that microkernels are the wave of the future.”
  • Minimization of Complexity: Going hand in hand with componentization, the example David gave here was Apache and the question of how to build a secure web server. He downloaded the code, examined it together with its plug-ins, and decided it was far too complex to judge whether it was secure or not. Instead, he put a team on building a server that limited feature complexity to exactly what was needed, and therefore could be made secure.
  • Secure Software Development: David referred to his book “Embedded Systems Security,” pointing out that the development process for software itself has to be secure.
  • Independent Expert Validation: This was definitely the most entertaining one, because David brought up pictures of old “Snake Oil Liniments,” which immediately made clear why independent validation of health claims should be governed by the FDA. David likened containers (i.e., apps with a layer of security wrapped around them) to the snake oil of today’s mobile world. His reasoning was that they run on Android, which is not secure in itself. He gave the impressive example of a team he assigned to exploit a least-privilege violation at the bottom of the stack; from there they could see everything above it, including the clear text an email app running in a container had left in memory. Scary.
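The first two principles, least privilege and componentization, can be illustrated with a small toy model. The sketch below is purely illustrative and assumes nothing about Green Hills’ actual products; the `Broker` and `CapabilityError` names are invented. The idea: components never call each other directly, and a mediating broker only forwards calls a component was explicitly granted.

```python
class CapabilityError(Exception):
    """Raised when a component exceeds its granted privileges."""
    pass

class Broker:
    """Mediates all inter-component calls and enforces least privilege."""
    def __init__(self):
        self._services = {}   # service name -> callable
        self._grants = {}     # component name -> set of allowed services

    def register(self, name, func):
        self._services[name] = func

    def grant(self, component, service):
        self._grants.setdefault(component, set()).add(service)

    def call(self, component, service, *args):
        # A component may only invoke services it was explicitly granted.
        if service not in self._grants.get(component, set()):
            raise CapabilityError(f"{component} may not call {service}")
        return self._services[service](*args)

broker = Broker()
broker.register("read_sensor", lambda: 42)
broker.register("send_network", lambda data: f"sent {data}")

# The logging component only needs the sensor, never the network.
broker.grant("logger", "read_sensor")

print(broker.call("logger", "read_sensor"))  # prints 42
try:
    broker.call("logger", "send_network", "data")
except CapabilityError as e:
    print(e)  # logger may not call send_network
```

A compromised `logger` component in this model can leak sensor readings, but it has no path to the network service, which is exactly the damage containment that componentization plus least privilege is meant to buy.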

So is it time to give up our devices and abandon hope for security? Far from it! David suggested a platform architecture for the IoT that can be secure. Instead of a software stack running from driver to operating system (OS), OS APIs, and the apps on top, he suggested a software architecture that virtualizes the hardware and the software at both ends of the stack. At the very bottom there would be a hypervisor (like Green Hills’ Integrity) enabling independent virtual machines for two different OSes. A web API on one of the OSes, using remote procedure calls (RPC), would allow applications to access the Internet and the outside world, while an OS API on the second OS would use only inter-process calls (IPC) between applications and as such remain secure. With this approach, his phone can run all the less secure applications, like internet browsing and photos, on one OS, while all the applications that require full security, like company email, run on an independent OS that is not exposed to breaches of the other.
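The split David described can be sketched as a toy model, too. This is not the Integrity hypervisor’s API; the `Guest` and `Hypervisor` classes below are hypothetical stand-ins for the idea that only the untrusted guest is reachable from the outside, while the secure guest accepts nothing but local IPC.

```python
class Guest:
    """A virtual machine running one OS and its applications."""
    def __init__(self, name, network_facing):
        self.name = name
        self.network_facing = network_facing
        self.apps = {}

    def install(self, app_name, handler):
        self.apps[app_name] = handler

    def ipc(self, app_name, msg):
        # Local inter-process call: reachable only from inside this guest.
        return self.apps[app_name](msg)

class Hypervisor:
    """Routes external traffic only to guests configured as network-facing."""
    def __init__(self, guests):
        self.guests = guests

    def external_rpc(self, guest_name, app_name, msg):
        guest = self.guests[guest_name]
        if not guest.network_facing:
            raise PermissionError(f"{guest_name} is not externally reachable")
        return guest.ipc(app_name, msg)

untrusted = Guest("android", network_facing=True)
secure = Guest("secure_os", network_facing=False)
untrusted.install("browser", lambda m: f"browsing {m}")
secure.install("company_email", lambda m: f"mail: {m}")

hv = Hypervisor({"android": untrusted, "secure_os": secure})

print(hv.external_rpc("android", "browser", "example.com"))
try:
    hv.external_rpc("secure_os", "company_email", "inbox")
except PermissionError as e:
    print(e)  # secure_os is not externally reachable
```

Even if the browser on the untrusted guest is fully compromised, nothing in this model gives the attacker a route into the secure guest, because no externally reachable interface into it exists in the first place.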

While reflecting on the keynote later, I realized that the picture may be even more complex. In my world, for example, there is a hub between the “thing” (i.e., my Fitbit wristband) and the actual data storage (Fitbit’s servers). The hub in between is my cellphone, which collects the data and sends it on to the final data store. That leaves an additional path for a hacker to collect data.

Going back to the title of this blog, if I look into the monetization of such an IoT system from a system development tool’s perspective, security, perceived value, and complexity do not seem to align. The tools of the System Development Suite, including emulation, are a practical “must” for the hub products (the application processors running Android, iOS, and Linux) because the interdependency of hardware and software is so strong. The data servers use system development tools for hardware verification, but the actual software “brains” are the applications running on Linux and doing big data analytics. On the “thing” side, the dependencies between hardware and software may be even stronger than on the hub side, but the software itself is less complex. The hardware design is also more focused on microcontrollers and, as such, is less complex, conceivably requiring less complex development methods. Security is important at all levels, but needs to be considered especially on the “thing” side, as David pointed out, because this is where vulnerability is especially high.

The good news for system development tools like emulation, acceleration, FPGA-based prototyping, and virtual prototyping is that they can be a huge help in ensuring exactly these security aspects through early hardware/software simulation. I fully agree with how David closed his keynote: security has made this a much more interesting time in embedded systems development!


