SPONSOR BLOG

Bridging The Gap

What still needs to be done so that application software isn’t dependent on specific hardware.


By Frank Ferro
Today’s industry shows are feeling a lot like déjà vu: tablets, tablets, smartphones, smartphones. The recent CES felt very similar to Mobile World Congress (MWC), with all the emphasis on smartphones and tablet computing.

When I first started attending MWC—it was called 3GSM in those days and was held in France—semiconductor companies seemed somewhat out of place because 3GSM was considered a “systems” show. It was a place where cell phone manufacturers and application software providers tried to sell their wares to mobile operators. Semiconductor companies were considered near the bottom of the food chain, usually giving cell phone manufacturers only what they asked for.

The wireless semiconductor industry, as we knew and lived it, would soon change due to two major forces: cell phones moving from business tools to consumer devices, and the shrinking of process technology geometries (i.e., Moore’s Law). Cost pressures forced cell phone manufacturers to begin shedding their semiconductor teams, moving much of the systems knowledge and software expertise over to semiconductor companies. To compete effectively, semiconductor companies “added value” by building reference designs and providing the software stack. Suddenly the chip was only part of the solution, because a large percentage of resources was now focused on system design, which includes software and applications.

Enter the 2008-2010 downturn, in which no one was left unscathed. Cost pressure was a common theme among semiconductor companies. It was hard to justify the cost of maintaining a systems team in-house, forcing reductions in design resources, outright sales of businesses, or consolidation with other companies in similar situations. Given the reduction in resources, one way for semiconductor companies to continue to compete in this market was to outsource many of the functional IP blocks. Purchasing IP became necessary for essentially the same reasons that forced cell phone companies to turn to semiconductor companies—Moore’s Law (45nm moving to 28nm process technologies) and cost pressures.

At 45nm and below, most SoCs contain anywhere from 50 to 200 unique cores—making it challenging for any one company to have the expertise to develop all the IP in-house (e.g., audio, graphics, video, wireless). To help ease the burden of managing all this outsourced IP, IP companies began providing complete subsystems for a number of critical functions, such as graphics and video. And today we are even starting to see a single IP block serve as a subsystem that can optimize bandwidth and keep costs down. Providing subsystems reduces the amount of external IP that has to be purchased.

If this evolutionary trend continues, it will be the IP companies that provide the key ‘value-add’ by combining many of these subsystems into complete system solutions. This is a significant crossing of the chasm for system-level design, and it has far-reaching ramifications for slashing costs and easing system complexity, as well as for a complete rearrangement of that once-static SoC “food chain.” More often than not, microprocessors used to serve as the starting point for the SoC puzzle. But today we are seeing some shuffling and re-prioritizing as SoC designers look to other IP companies to help them manage all their on-chip traffic and seamlessly integrate all their third-party IP on the chip.

There is no doubt that the process of designing complete system solutions has begun, but I don’t believe this transition can or will happen in one step. The most likely path is for IP vendors and semiconductor companies to collaborate on developing IP platforms as a first step. Such a platform could include key IP subsystems connected together by an on-chip network fabric, along with software. Using an advanced on-chip network allows platforms to be easily configured for various system requirements, and using software-visible registers in the network allows these platforms to be “virtualized.” This means application software can take advantage of the underlying hardware without being tied to a specific hardware implementation.
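
To make the “virtualization” idea concrete, here is a minimal sketch in C of how application code might query software-visible capability registers exposed by the on-chip fabric and adapt to whatever subsystems a given platform actually contains. This is not Sonics’ actual API; the register names, capability bits, and the fabric_read_caps() stub are hypothetical, and a real implementation would read a memory-mapped register instead of the stub used here so the example can compile and run anywhere.

/*
 * Sketch: capability discovery through fabric registers (hypothetical layout).
 * Application code keys off capability bits, not a specific chip part number.
 */
#include <stdint.h>
#include <stdio.h>

/* Hypothetical capability bits exposed by the on-chip network fabric. */
#define FABRIC_CAP_VIDEO   (1u << 0)   /* video decode subsystem present */
#define FABRIC_CAP_GFX     (1u << 1)   /* graphics subsystem present     */
#define FABRIC_CAP_AUDIO   (1u << 2)   /* audio subsystem present        */

/* On real silicon this would be a volatile read of a memory-mapped
 * capability register; it is stubbed here for illustration. */
static uint32_t fabric_read_caps(void)
{
    return FABRIC_CAP_VIDEO | FABRIC_CAP_AUDIO;  /* pretend graphics was left out */
}

int main(void)
{
    uint32_t caps = fabric_read_caps();

    if (caps & FABRIC_CAP_GFX)
        printf("Using hardware graphics subsystem\n");
    else
        printf("Falling back to software rendering\n");

    if (caps & FABRIC_CAP_VIDEO)
        printf("Hardware video decode available\n");

    return 0;
}

Because the application branches on capability bits rather than on a particular chip, the same software can run across different configurations of the hardware platform, which is the portability benefit described above.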

The overall effect of this approach will be to reduce development costs by allowing hardware platforms to be reused across multiple systems, and to make software much more portable across those platforms. Again, cooperation between the hardware and software teams is critical for this goal to be realized. But given the potential benefits, this is an attractive and effective way to develop highly competitive products with limited engineering resources.

–Frank Ferro is director of marketing at Sonics.

