Focus Shifts To System Quality

Why the move toward software-defined hardware is significant, but not enough.


For the past decade, many semiconductor industry insiders predicted that software would take over the world and hardware would become commoditized. The pendulum seems to have stopped, and if anything, it is reversing course.

Initial predictions were based on several advantages for software. First, software is easier to modify and patch. Second, universities turn out far more software developers than electrical engineers. And third, software can be written for any hardware platform to make it behave differently and often better.

Hardware isn’t without its advantages, of course. Building functionality into hardware improves performance and uses less energy. It’s also inherently much harder to breach from a security standpoint. But fixing problems in hardware is difficult, and often impossible without a software workaround. And hardware failures are frequently device killers rather than mere inconveniences.

Privately, however, many in the software industry are beginning to recognize that each side has its advantages, particularly for embedded software. To begin with, there is the security issue. In most cases, it’s not the hardware that causes the problem. It’s the software or the firmware. The only way to actually tap into the hardware is through a side-channel attack, which generally requires physical access to a device. If the device has been engineered correctly, such an attack only yields access to that particular device, or the chip turns into a useless piece of silicon when someone tries to tamper with it.

Software is a different story. It can be attacked remotely and repeatedly. While commercial software developers have done a superb job of making sure that over-the-air updates are authorized and that product keys are specific to a piece of software, that kind of authorization is much less stringent when it comes to software embedded inside devices. Until the Mirai attack in October 2016, that kind of concern wasn’t even on the horizon.
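
To make the gap concrete, here is a minimal sketch of the kind of gate that authorized over-the-air updates depend on, and that embedded firmware too often skips. It assumes a symmetric key provisioned at manufacture and uses an HMAC tag so the example stays self-contained; production schemes typically use asymmetric signatures so the signing key never ships on the device. The function names and key are illustrative, not drawn from any particular product.

```python
import hmac
import hashlib

# Illustrative only: a per-device secret provisioned at manufacture.
# Real OTA schemes typically use asymmetric signatures (RSA/ECDSA);
# HMAC keeps this sketch runnable with the standard library alone.
DEVICE_KEY = b"per-device-secret-provisioned-at-manufacture"

def verify_update(image: bytes, tag: bytes) -> bool:
    """Accept a firmware image only if its authentication tag checks out."""
    expected = hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()
    # compare_digest avoids leaking timing information during the check.
    return hmac.compare_digest(expected, tag)

def apply_update(image: bytes, tag: bytes) -> None:
    if not verify_update(image, tag):
        raise ValueError("rejected unauthenticated firmware image")
    # ... write the image to the inactive flash bank and reboot ...

# A tampered image fails verification even though the tag was valid
# for the original.
good = b"firmware-v2.bin contents"
good_tag = hmac.new(DEVICE_KEY, good, hashlib.sha256).digest()
assert verify_update(good, good_tag)
assert not verify_update(good + b"malicious patch", good_tag)
```

The point of the sketch is the gate itself: an update that cannot prove its origin never reaches flash, which is precisely the check that was missing from many of the devices discussed below.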

Mirai targeted default usernames and passwords in IoT devices, credentials that were rarely changed because doing so was too expensive and time-consuming. The attackers assembled those devices into botnets, armies of connected bots, and used them collectively to launch distributed denial-of-service attacks against a wide range of high-profile Internet sites.
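
Mirai succeeded because those factory defaults remained valid indefinitely. Here is a minimal sketch of a firmware-side mitigation, assuming a hypothetical first-boot routine: the device refuses to bring up its network services until the shipped default is replaced. The credential list is a small illustrative sample of the defaults Mirai scanned for.

```python
# A small illustrative sample of the factory defaults Mirai scanned for.
KNOWN_DEFAULTS = {
    ("admin", "admin"),
    ("root", "root"),
    ("root", "12345"),
    ("admin", "password"),
}

def credentials_acceptable(user: str, password: str) -> bool:
    """Reject factory defaults and trivially short passwords."""
    if (user, password) in KNOWN_DEFAULTS:
        return False
    return len(password) >= 12

def first_boot_setup(user: str, password: str) -> None:
    """Hypothetical first-boot hook: network services (telnet, SSH,
    web UI) stay disabled until the operator replaces the shipped
    default with an acceptable credential."""
    if not credentials_acceptable(user, password):
        raise ValueError("default or weak credential; services stay disabled")
    # ... store a salted hash of the password, then enable services ...

# Mirai's scan succeeds against the first pair, fails against the second.
assert not credentials_acceptable("admin", "admin")
assert credentials_acceptable("operator", "long-unique-passphrase")
```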

Both hardware and software are vulnerable to failures, of course. In the case of software, bugs may be more of an inconvenience than a real problem, and they often can be fixed with a patch. In the case of hardware, if a flaw can’t be fixed with software, the entire device may need to be replaced.

But software engineering typically lacks the fear of failure that hardware engineers confront every time they sign off on a chip. In hardware, a real fix requires a respin of the chip. That’s why hardware engineering groups have rigid methodologies and flows, and very expensive tools and equipment to simulate, validate, verify and debug these devices. In the future, the same rigor will increasingly be required on the software side, as well.

The move toward software-defined hardware is a recognition that designs need to be rebalanced and rethought. But it’s only a piece of the puzzle. As the electronic content in systems continues to increase, the quality of the components has to increase at a faster rate to improve the reliability of those systems. That means the same kind of discipline needs to be implemented across the supply chain, with continuous application of test, verification and validation from inception all the way through to the end market.

The decision is no longer just about whether to build something in hardware or software. It’s now about how to make it more reliable, and that will be an interesting challenge for an industry that has been focused for decades on completely different metrics.


