‘Good’ Vs. ‘Good Enough’

The increasing amount of software being created alongside hardware is pushing fundamental changes into chip design and verification.


By Ed Sperling
The decision about when a chip is ready for tapeout is changing, both in timing and sometimes in terms of who actually makes that decision, as the amount of software being developed by hardware companies continues to grow.

At the root of this shift are two very different concepts about what constitutes a market-ready product. For SoC engineers, fixing bugs after a chip has been taped out or, worse, shipped to an end customer can be an extremely costly mistake. At advanced nodes in a competitive market those kinds of mistakes can kill careers and sometimes even companies. But for software engineers, millions of lines of code and an almost infinite number of possible interactions mean that software can be functionally acceptable and debugged over time with service packs and software version updates.

“This is the service-pack two syndrome,” said Frank Schirrmeister, director of product marketing for system-level solutions at Synopsys. “On the hardware side the product needs to be finished or it means an expensive recall. They have a Sword of Damocles hanging over their head. But with software there can always be another release, and with the software becoming more important there is an effect on verification. The verification used to be complete functional verification of the device—all functions and corner cases in SystemVerilog. That isn’t true anymore.”

Software last, software first
Just how companies are looking at this issue—or whether they even consider it an issue at all—frequently depends on their starting point. While the number of software engineers inside chip companies is frequently larger than the number of hardware engineers, not all software engineers are involved in the same tasks.

To begin with, “software engineer” is as broad a label as “hardware engineer.” A chip architect can be a hardware engineer, and so can a verification engineer. Likewise, a software engineer can work on firmware, applications, or high-level software models. The tasks are different, the skill sets are different, and the concerns vary for each of them. The software being developed by various companies also can vary greatly, from firmware and embedded IP and processors to end-user applications. (See Fig. 1.)

Fig. 1: An STMicroelectronics software stack


Second, some companies view software as driving the hardware while others see it the other way around.

“We’re going through a transition period right now,” said John Bruggeman, chief marketing officer at Cadence. “The market definition of what is good enough will evolve.”

Bruggeman said embedded software already has moved in this direction: the consumer space increasingly is willing to accept less than perfection, while companies building automobiles and airplanes tolerate no faults. “Acceptance of Linux accelerated in consumer devices faster than expected, but mission-critical backed further and further away from it.”

At least part of the software-first approach is based on a hardware-platform concept, similar to what Apple has done with the iPhone, iPad and iPod and what Google has done with the Android operating system. ARM and MIPS have been focused on a similar approach, creating platforms with open interfaces for the software. Bruggeman said that approach expands the job of EDA companies from providing tools for building hardware to integrating the software and IP that either runs on those platforms or becomes part of them.

This is almost the opposite of how companies such as Intel, AMD and Nvidia see the market, however. For those companies the hardware is most important, and any drivers, middleware and applications need to work with the hardware rather than the other way around. What matters most in this sphere is performance, power and backward compatibility. The x86 instruction set must work across all devices in which an Intel chip is used, whether it’s a mobile Internet device or a supercomputer, and the tools used to create those chips are an evolution of those that have always been used to create digital chips.

Jim Kenney, director of marketing for Mentor Graphics’ Emulation Division, said that in the case of advanced chip companies, the focus is on device drivers and connecting to standard interfaces such as USB. But he said the approach taken by those software engineers is changing, too.

“We have one customer that uses up to 20 hardware engineers running emulation,” Kenney said. “They had 180 software engineers. The hardware guys offered the emulation machine to the software guys, but no one used it. When asked the reason, they said they didn’t know how to use it. So one guy went home over the weekend and created a prototype environment for them. Now all the software engineers are waiting in line to use the emulator. They were able to get a three-month jump on software debugging.”

ARM and MIPS fit somewhere in between these two models. Sam Stewart, chief architect at eSilicon, said managers will choose ARM and sometimes MIPS CPUs because they know there is plenty of software available as well as the tools—and lots of expertise to work with it.

“There is a strong impetus to choose ARM based on the software that’s available,” Stewart said. “Managers choose those platforms because they’re fearful of the software and its effect on development schedules. With ARM that’s not as much of an issue. There’s a standard interface, lots of debug tools, and the code is written in C and can be compiled on anything. And if you go to standard software your schedule is contained.”

Debugging hardware and software
But while hardware verification is relatively stable and proven, software verification is less automated, because outside of markets such as mil/aero and automotive it has never received as much attention. All the major EDA companies are exploring either buying or building internal capabilities to create automation similar to what Mentor’s customer achieved.

“The natural order of things is to get a chip to a certain point, document the instruction set, document the power and throw it over the wall to the software guys,” said Mike Gianfagna, vice president of marketing at Atrenta. “You need to turn the arrows the other way. The software needs to drive the hardware. At first that seems counterintuitive because we’re in an integration and not an authoring environment. But what you end up with is that the hardware isn’t any harder but the software is more structured and more reliable. And if the software works better, the product is more accepted.”
