
Playing Hardball With Software

Custom hardware can cut software development time and improve overall system performance.


By Frank Ferro
Software is never-ending, or so the axiom goes. It shouldn’t take long to convince anyone who has used an electronic device of the truth of this statement. The PC environment is the most obvious (and obnoxious) example, with daily application software updates arriving at the most inconvenient times, coupled with regularly scheduled updates for the OS. Even embedded devices like media players and cell phones need periodic updates.

As a former product manager responsible for delivering WLAN chipsets, one of the biggest challenges we had was delivering software drivers. Even when the hardware was finished, we would have to do frequent driver releases. Bugs were not the only reason for new releases. Driver releases were also needed to accommodate all the permutations of operating systems, chipsets and customer applications. If someone had told me at that time that a change in the hardware could reduce the driver effort by even a small percentage, I would have paid very close attention.

Improving software efficiency is even more important in today’s competitive SoC market, where performance, cost and time to market (TTM) are critical to success. The ability to reuse software, however, is often at odds with the rapid expansion of the hardware. Hardware teams rightfully leverage new process technologies, adding as much functionality into the chip as possible. That expansion is often done with little or no regard for how it will impact existing software.

So the question becomes: Is it time for a new SoC model? It may seem counterintuitive to think custom hardware can make software portable. Yet this is exactly what needs to be done to maximize system performance while, at the same time, reducing the burden on the software team.

Let’s look at a video SoC architecture to illustrate this concept. Hardware designers often separate the DRAM into two or more channels to maximize overall system performance and DRAM access efficiency. To do this, one set of hardware processors accesses one DRAM channel while another set of processors accesses the second channel. This ensures that the traffic load in the system is balanced across the DRAM channels. The problem for the software team starts when a change is made to the chip, either by adding another DRAM channel or more processors, or both. The previously written software will no longer work on the new chip because the system, along with its memory channels, has been re-partitioned to accommodate the new processors.
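To make the problem concrete, here is a minimal sketch (in C, with purely hypothetical names, addresses and channel counts) of driver code that bakes the chip’s channel partitioning into the software. Once the chip is re-partitioned, this mapping is wrong and the driver has to be rewritten.

```c
/*
 * Hypothetical sketch only: a driver that hard-codes which DRAM channel
 * each processor uses. Addresses, channel count and processor IDs are
 * illustrative, not taken from any real SoC.
 */
#include <stdint.h>

#define DRAM_CH0_BASE 0x80000000u   /* channel 0: e.g., video decode engines */
#define DRAM_CH1_BASE 0xC0000000u   /* channel 1: e.g., display and scaler */

/* Each processor is statically tied to one DRAM channel. */
uint32_t buffer_base_for(int processor_id)
{
    /* Adding a third channel or new processors invalidates this mapping,
       so the previously written software no longer works on the new chip. */
    return (processor_id < 2) ? DRAM_CH0_BASE : DRAM_CH1_BASE;
}
```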

Taking a slightly different approach to the SoC architecture could have avoided this problem. If dedicated memory “load-balancing” hardware were added to the on-chip network of the SoC, the overall system performance could have been maintained while allowing software portability. With the on-chip network load-balancing the traffic flowing to memory, the hardware team no longer needs to partition specific processors to dedicated DRAM channels. Since traffic balancing is now done automatically, any processor in the system can access any bank of memory. The load-balancing hardware has the effect of making the memory access “virtual” from the software’s point of view. In this case, the hardware actually makes the software portable!
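A rough way to picture what that load-balancing hardware does is simple address interleaving: the on-chip network spreads consecutive address stripes across the channels, so software sees one flat address space and never needs to know which channel it is hitting. The sketch below models that behavior in C; the channel count and stripe size are assumptions for illustration, not a description of any particular product.

```c
/*
 * Hypothetical model of hardware address interleaving. In a real SoC this
 * decision is made inside the on-chip network and is invisible to software.
 */
#include <stdint.h>

#define NUM_CHANNELS 2u
#define STRIPE_BYTES 4096u   /* switch channels every 4 KB of address space */

/* Which DRAM channel a given physical address would land on. */
unsigned channel_for_address(uint64_t phys_addr)
{
    return (unsigned)((phys_addr / STRIPE_BYTES) % NUM_CHANNELS);
}
```

Because the mapping is a pure function of the address, any processor can place a buffer anywhere in memory and the traffic still spreads evenly across the channels, which is what makes the memory access look “virtual” to the software.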

With software resources accounting for an increasingly large percentage of the overall chip cost (which chip companies rarely, if ever, get paid for), this approach to SoC design offers the benefit of abstracting the software from the hardware without losing control of the hardware. The idea in the above example is not limited to memory access; a similar approach can be applied to power management (see my last blog), security and other critical system functions.

As more and more hardware blocks are made “virtual,” the reduction in software effort will be significant, and SoC system performance will improve along with it, both of which are critical to success in today’s competitive SoC market. The statement “software is never-ending” is likely to remain true, but with some forethought in the overall system architecture, great strides can be made to improve software efficiency and reduce cost.

–Frank Ferro is director of marketing at Sonics.


