The real growth in emulation sales is well outside of the market this technology was intended for. Why now?
By Ed Sperling
Emulation was developed for verifying complex ICs when simulation was considered too slow. After more than a decade of very slow growth, however, sales have begun to ramp.
There are several reasons for this shift. First, SoCs simply are becoming more complex, and the amount of verification that needs to be done to get a chip out the door can bring simulation to a crawl. Design teams have a choice of delaying time to market, reducing verification coverage, or buying an emulator.
Second, emulation is becoming the platform of choice for verifying complex software that is sold along with an SoC. Software teams have been grabbing time on emulators whenever possible for the past five years because their own tools budgets were a fraction of what was spent on the hardware side. In software, the value has always been seen in aggregate numbers of programmers, not in tools. That tide is shifting, however. Software engineers now outnumber hardware engineers inside of most chip design companies, but they work together in teams that typically are headed by one person with an overall budget.
And third, emulation is finding a home in markets where it was never even considered in the past, such as networking, IP, and power. Verification that works on a complex SoC at the nano scale can work on a macro scale, as well.
The result has been a wholesale shift in how emulation providers approach the market, as well as in how these very expensive tools are used, a change reflected in the recent earnings reports of Mentor Graphics, Cadence and EVE. While emulators still are used for hardware debugging, they now also serve purposes that extend well beyond the SoC hardware.
“Teams are now using emulators for bigger software and bigger tasks,” said Jim Kenney, director of marketing for Mentor Graphics’ Emulation Division. “Politically, the software teams are expanding their influence. There has been an aggressive move to step in and debug device drivers with emulation and make sure the chip doesn’t behave badly. We’re also seeing IP providers using emulation to verify the device drivers under RTOSes.”
Mentor recently addressed this shift by adding a virtualization layer to its emulator, allowing software engineers to run tests on their PCs, in keeping with the way they normally write and debug code. Most hardware engineers, in contrast, run their tests in a lab.
“We’re also seeing other uses for these machines,” said Kenney. “New networking companies are verifying their switch fabrics. They’re running multiple chips and throwing lots of Ethernet at it and looking for bottlenecks and dropped packets.”
Mentor isn’t alone in seeing new uses for emulation. Michael Young, director of product marketing for system design and verification at Cadence, said that one of the recent trends is to use emulation for power-performance tradeoffs.
“Power domains aren’t new, but some companies are just starting to implement them,” said Young. “You may have 1,000 power domains in a supercomputer. All of them need to be verified. We’re also seeing more being done with assertions. They’ve been supported for years, but verification engineers typically don’t know enough about design to write assertions. That’s the job of the system engineer. And now that we have hardware-software interaction, how do you deal with assertions for both?”
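To make the idea concrete, the sketch below shows, in plain Python rather than a hardware assertion language, what a check spanning both hardware and software might express: a driver must never write to a block while its power domain is switched off. The trace format, event names and "gpu" domain are hypothetical and are only meant to illustrate the concept of asserting a property across the hardware-software boundary.

```python
# Illustrative sketch only: an assertion-style property checked over a combined
# hardware/software event trace. The trace format and event names are hypothetical.
from typing import Iterable, Tuple

Event = Tuple[int, str, str]  # (timestamp, source, action)

def check_no_write_while_powered_off(trace: Iterable[Event], domain: str) -> None:
    """Assert that no software register write targets a power domain
    while that domain is switched off."""
    powered_on = True
    for t, source, action in trace:
        if source == "hw" and action == f"{domain}:power_off":
            powered_on = False
        elif source == "hw" and action == f"{domain}:power_on":
            powered_on = True
        elif source == "sw" and action == f"{domain}:reg_write":
            assert powered_on, f"t={t}: driver wrote to {domain} while it was off"

# Example trace: the second driver write lands after the domain is powered down.
trace = [
    (10, "hw", "gpu:power_on"),
    (20, "sw", "gpu:reg_write"),
    (30, "hw", "gpu:power_off"),
    (40, "sw", "gpu:reg_write"),   # violation
]

try:
    check_no_write_while_powered_off(trace, "gpu")
except AssertionError as err:
    print("Assertion failed:", err)  # flags the late write at t=40
```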
Of particular interest is what Young calls the “gray areas” of a design: behavior that is not clearly good or bad, but that will degrade the performance of a system if not addressed. That can happen in particular with legacy software, some of which was created more than a decade ago.
Simulation’s limits and extensions
Some of this can be done in simulation, of course, if the market is captive and not time-sensitive. But simulation, at least from a hardware standpoint, tops out at about 60 million gates. Emulators are now handling up to 1 billion gates, and there is work under way to double and triple that number.
An alternative approach is to speed up the simulation itself. Startup Nimbic has been pushing its heavily parallelized chip-package-system design environment, and at present it is also the only company seriously pushing cloud-based versions of this technology, where the hardware and tools are available for rent.
“Large board and package and signal can run for weeks without acceleration,” said Raul Camposano, Nimbic’s CEO. “If you can accelerate that by a factor of 10 you can do it in a day. We have run large problems on 1,000 cores and gotten 500 to 600 times acceleration.”
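To put those figures in perspective, the back-of-the-envelope calculation below works through what the quoted numbers imply; the two-week baseline runtime is an assumption chosen only to match the "weeks without acceleration" description, and the 550x speedup is simply the midpoint of the quoted range.

```python
# Illustrative arithmetic using the figures quoted above.
cores = 1000
speedup = 550            # midpoint of the quoted 500x-600x acceleration
baseline_days = 14       # assumed two-week unaccelerated run

efficiency = speedup / cores                      # fraction of ideal linear scaling
accelerated_hours = baseline_days * 24 / speedup  # accelerated turnaround time

print(f"Parallel efficiency: {efficiency:.0%}")                     # ~55%
print(f"Two-week job finishes in ~{accelerated_hours:.1f} hours")   # well under a day
```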
Whether it’s emulation or simply faster simulation, the trend is very clear. Complexity is forcing companies to buy more and newer tools, and it is far exceeding the capabilities of their internally developed tools and simulation farms. EDA tools are once again in vogue, viewed at least as a way of getting the job done faster and possibly as a competitive edge. What a difference a couple of process nodes make.