Which Comes First?

Improving software functionality and efficiency is critical, but hardware will always be the starting point.


The increasing number of software engineers inside hardware companies, coupled with predictions that the software stack will expand over time to include applications, has produced a lot of speculation about what the starting point for design should be. Will it remain the hardware? Or will it be the software?

The answer, gleaned from dozens of interviews, is beginning to come into focus across the design industry. While software is a great starting point for function, performance, ease of use and integration, it doesn't have enough structure or stability to dictate designs. In fact, the most stable thing in application software may be the set of application programming interfaces into the operating system. And for operating systems, general-purpose and real-time alike, as well as for virtualization layers running on Type 1 hypervisors and for embedded software, the most stable element is the set of hooks into the hardware.

But the real value of software isn't that it's a stable platform. It's the opposite. The ability to adapt it for many uses, ranging from controlling hardware functionality all the way to managing complex interactions that continually redefine the user experience, is what makes software so attractive to chipmakers. Software is every bit as important as hardware, but hardware is also every bit as important as software. Some things will always work better in hardware than in software, and vice versa.

Coming to grips with this reality is difficult for hardware and software teams, because they often view each other as threats rather than as parts of the same team. Hardware engineers worry they will be replaced by software engineers. Software engineers scorn the dictates of hardware, which increasingly amount to a complex and fairly rigid set of rules governed by process technology, complex flows and the laws of physics.

Both are essential, and both do better working in sync. Hardware and software need to be interlaced from the earliest conceptual phase through final verification signoff, and the teams need to communicate at every step of the design process, much the way vertically integrated chipmakers would create a design, test it in the fab, and then feed the results back to tweak the design. The same kind of loop needs to exist between hardware and software teams, and there needs to be a methodology for making it happen. So far there isn't one.

Functionality defined by the software needs to be designed into the hardware, and the software then has to be able to take advantage of that hardware more efficiently and effectively than if each had been developed in isolation. The industry talks a lot about integration being the big challenge, notably the integration of re-usable blocks of IP, but integration also needs to be viewed within the design flow and adjusted so that software can be included. For example, battery life in mobile devices can be significantly improved if software engineering teams have some idea of how calls to memory affect power consumption, as the rough sketch below illustrates. And software performance can be improved if the hardware can execute functions in a specific order, possibly using a dedicated processor core or memory.
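To make the memory-and-power point concrete, here is a minimal, hypothetical sketch of the kind of back-of-the-envelope model a software team might use if the hardware team shared per-access energy figures. The numbers below are placeholder assumptions for illustration only, not data from any real device; the general point is simply that off-chip DRAM traffic costs far more energy than on-chip cache hits, so software written with that knowledge can save power.

```python
# Illustrative sketch only: a toy energy model for a workload's memory traffic.
# The per-access energy figures are hypothetical assumptions, not silicon data.

# Assumed energy cost per access, in nanojoules (placeholder values).
ENERGY_NJ = {
    "cache_hit": 0.5,     # on-chip SRAM access
    "dram_access": 20.0,  # off-chip DRAM access is assumed far more expensive
}

def estimate_energy_nj(cache_hits: int, dram_accesses: int) -> float:
    """Return a rough energy estimate (nJ) for a given mix of memory accesses."""
    return (cache_hits * ENERGY_NJ["cache_hit"]
            + dram_accesses * ENERGY_NJ["dram_access"])

# Two versions of the same workload: one thrashes DRAM, one keeps data in cache.
naive = estimate_energy_nj(cache_hits=100_000, dram_accesses=50_000)
optimized = estimate_energy_nj(cache_hits=145_000, dram_accesses=5_000)

print(f"naive:     {naive / 1e6:.3f} mJ")      # ~1.050 mJ under these assumptions
print(f"optimized: {optimized / 1e6:.3f} mJ")  # ~0.173 mJ under these assumptions
```

Even a crude model like this, fed with real numbers from the hardware side, would let software engineers see which code changes actually move the needle on battery life instead of guessing.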

It's time to think of the system as more than just a collection of parts. Those parts have to be developed as part of the same process, which may force some significant changes in how things are done and require tools to communicate in ways they never have before. This is an interesting challenge, and it's one that needs to be solved for IC design to take the next big steps forward.

–Ed Sperling


