Hybrid Emulation Gets More Hybrid

As profitability and investment grow, so do the possibilities for what can be done with this technology.

Rising chip complexity is creating a booming emulation business, as chipmakers working at advanced nodes turn to bigger iron to get chips out the door on time.

What started as a “shift left”—doing more things earlier in the design cycle—is evolving into a more complex mix of hardware-accelerated verification for both hardware and software. There are even some new forays into power exploration, as well as on-demand emulation services that can be rented for a fixed period of time to fill a gap, such as figuring out which IP blocks to use for a specific design. And while both simulation and FPGA prototyping are still key technologies for verification, mainly because those systems are less expensive, already in place, or both, the appeal of emulation is expanding rapidly.

“Over the past couple of years we’ve seen a big shift,” said Jean-Marie Brunet, marketing director for Mentor Graphics’ Emulation Division. “If you look at emulation versus simulation right now, 80% of the market is simulation and 20% is emulation. Over the next 10 years, that will be reversed.”

One of the big factors behind that shift is hybrid emulation, which initially gained traction because it allows hardware and software to be developed simultaneously. All emulation providers now offer this in some shape or form, sometimes in conjunction with FPGA prototyping, sometimes as a standalone solution. But even here, the definition of hybrid emulation is expanding as software and hardware become more tightly integrated.

“This is definitely more than just another marketing idea,” said Zibi Zalewski, general manager of the hardware division at Aldec. “The need comes from the industry. With the constantly growing size and complexity of SoC designs, it is very important to test the whole SoC as early as possible, based on the project code available at different abstraction levels. Verification in such an integrated environment provides much better quality and scope of testing, not to mention that software layers are executed at the whole-SoC level, while hardware designers benefit from much wider testing scenarios.”

Hybrid emulation is seen as an increasingly important method for getting sub-40nm SoCs and processors out the door on schedule. For years, the big concern voiced by chipmakers was the inability of simulation to keep up with the size and complexity of designs. One common solution was to divide and conquer, simulating each block and then debugging the integrated system. But as the number of IP blocks increases—there are now more than 120 blocks in some 16/14nm designs—more processing power is required.

One extreme to the other
Tools companies have responded well to this need, scaling up from the block-level simulation offered by all of the major EDA companies to more integrated and expandable emulation offerings that deliver orders-of-magnitude performance increases over block-level simulation.

Emulators have been scalable for some time. They can be clustered together like servers, and paired with FPGA prototyping hardware and simulators. But growing complexity is driving chipmakers to look at bigger-iron solutions, and there is a sliding scale of performance and power.

At the very top of the scale there are now specialized simulators, such as those developed by Ansys, which can simulate everything from full devices to smart grids and cities. Walid Abu-Hadba, Ansys’ chief product officer, said the goal at the very high end is real-time simulation. “Today, simulation happens before the design, which creates a disconnect between design and production. We engineer mechanical, thermal and electrical, but you can continue simulation after production with real-time simulation and find anomalies that could lead to product failures.”

Chipmakers haven’t even begun to grapple with that kind of performance and capacity, but they are finding that hybrid emulation, or emulators plus FPGA prototypes, lets them get hardware and software up and running faster so they can debug it more quickly. That approach can be used for much more than just debug. It can provide much deeper insights, much more quickly, into how different blocks interact, what impact that has on power, and what happens when one memory type or one IP block is substituted for another.

“You can get up to 50X improvement for the OS boot, and some customers are getting up to 200X,” observed Frank Schirrmeister, group director for product marketing for the System Development Suite at Cadence. “What’s important here, though, isn’t just the OS boot time. It’s how you get to a point of interest faster. Once you are there, you can do more interactions between hardware and software, and you have the ability to run about 10X faster.”
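To put those multipliers in perspective, here is a back-of-envelope sketch. The baseline boot time below is purely an assumption for illustration, since only the ratios are quoted above, but it shows why reaching a point of interest faster changes how often engineers can iterate.

```cpp
// Illustrative arithmetic only: the two-hour baseline is an assumed figure,
// not from the article; the 50X/200X factors are the speedups quoted above.
#include <cstdio>

int main() {
    const double baseline_boot_min = 120.0;  // assumed OS-boot time without the hybrid flow
    const double speedups[] = {50.0, 200.0};
    for (double s : speedups) {
        std::printf("%.0fX: boot drops from %.0f min to %.1f min\n",
                    s, baseline_boot_min, baseline_boot_min / s);
    }
    return 0;
}
```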

Hybrid emulation also provides a much deeper understanding of what is going on inside an SoC.

“This isn’t just about a Linux boot on the hardware,” said Tom De Schutter, director of product marketing for virtual prototyping at Synopsys. “You can begin to look at specific hardware tasks and do more in the context of the software. The other piece is that the system is so complex that you can do a full-chip hardware-software integration, and you can do subsystems and complex subsystems in the context of virtual prototypes.”

Improvements everywhere
This represents a sizeable shift for EDA companies. From the late 1990s until the end of the last decade, emulation was a pricey piece of technology in search of a market. While software teams had long recognized the value of emulation, they rarely had the budgets to buy emulators. What changed the economics was the demand by systems vendors that chipmakers deliver embedded software with their SoCs. Sales really began ramping up after that, particularly with the explosion of complex application processors for mobile devices.

That has significantly padded the bottom lines of emulation vendors, adding impetus to invest in new ways to leverage this technology. It also has raised the competitive stakes for mindshare among them. Hybrid emulation is one of the big steps forward in this area.

“Hybrid emulation enables both software and hardware teams to work on the latest version of the project without having the whole SoC finished, even in quite early stages of the project,” said Aldec’s Zalewski. “While some portion of the design is still virtual and cannot be implemented in an emulator, running a high-level model in a virtual platform together with the available RTL code in the emulator provides the whole SoC for early, synchronized testing by both teams. Until recently that was possible only separately, with one team waiting for the other. That separation forced module-level testing instead of the SoC-level testing that hybrid emulation enables, because to do SoC-level testing in the emulator alone you need the whole design ready and implementable in hardware. This is the main benefit of going hybrid.”
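To make the partition Zalewski describes a bit more concrete, here is a minimal, purely illustrative C++ sketch, assuming a hypothetical register block and transactor interface. The software side talks to a transaction-level bus; whether the other end of that bus is a fast model or RTL sitting in the emulator is hidden behind the interface, which is what lets both teams run against the same "whole SoC" before everything is implementable in hardware. This is not vendor code; real bridges are built on transactor standards and TLM libraries.

```cpp
// Conceptual sketch only. All class and function names here are hypothetical
// stand-ins; real hybrid flows use vendor transactor/TLM libraries, not this code.
#include <cstdint>
#include <cstdio>

// Transaction-level view of the bus that the virtual-platform side works with.
struct BusTransactor {
    virtual uint32_t read(uint32_t addr) = 0;
    virtual void write(uint32_t addr, uint32_t data) = 0;
    virtual ~BusTransactor() = default;
};

// Stand-in for the RTL block mapped into the emulator. In a real flow this side
// is synthesized hardware behind a transactor; here it is a plain model so the
// sketch runs on its own.
class EmulatedRegisterBlock : public BusTransactor {
public:
    uint32_t read(uint32_t addr) override { return regs_[index(addr)]; }
    void write(uint32_t addr, uint32_t data) override { regs_[index(addr)] = data; }
private:
    static constexpr uint32_t kRegs = 16;
    static uint32_t index(uint32_t addr) { return (addr >> 2) % kRegs; }
    uint32_t regs_[kRegs] = {};
};

// Stand-in for the fast CPU model in the virtual platform: it runs a scrap of
// software (a register poke and poll) against whatever transactor it is handed.
void run_boot_fragment(BusTransactor& bus) {
    bus.write(0x4, 0x1);              // "enable" a hypothetical peripheral
    uint32_t status = bus.read(0x4);  // read it back through the bridge
    std::printf("status = 0x%x\n", static_cast<unsigned>(status));
}

int main() {
    EmulatedRegisterBlock dut;   // would live in the emulator in a real setup
    run_boot_fragment(dut);      // would be driven by the virtual platform's CPU
    return 0;
}
```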

Hybrid is now being extended even further. Starting last spring, Mentor Graphics began rolling out a set of modular extensions aimed at power and networking design by adding virtualization into the system. The goal is to raise the level of abstraction and cut time to market even further—and to sell more emulators along the way.

“With physical devices, to do timing the device has to be ready,” said Mentor’s Brunet. “With virtualization, the device does not have to be ready. The next step is virtualization of a lot of the system where you can load software.”

Mentor isn’t alone in this. All of the emulation vendors are at various stages of developing their own virtualization capabilities, adding a level of abstraction over in-circuit emulation for some tasks. But none of this is taking place in a vacuum, either. Chipmakers aren’t throwing away their old equipment, and tools vendors are updating those technologies, as well.

“Processor-based emulation is still slower than hand-optimized FPGA prototypes,” said Cadence’s Schirrmeister. “Users are taking an emulation approach for debug and fast bring-up, and they’re connecting that with an FPGA system.”

Synopsys’ De Schutter agrees. “A lot of what we’re seeing is either ARM-driven or ARC-driven designs. From a capabilities point of view, what you’re trying to determine is the way it interacts with the external world. A lot of markets have very similar requirements, and you need different pieces for different activities. Part of that is virtual, part of that is FPGA prototyping. On the FPGA you can do system validation. And then on the emulator you have the big capacity for the full SoC. You can take the different pieces and combine the individual strengths of all of them.”

Emulation as a service
One of the big advantages of virtualization is the ability to use these tools remotely. That allows emulation to be moved to data centers, either in private cloud operations or in commercially run operations similar to the kinds of services Amazon and Google now offer to their IT customers.

In the past, security was the big concern for outsourcing anything. Some chipmakers had standing policies that anyone moving data out of the company would be terminated immediately. While security concerns actually have increased since then, data centers are proving themselves to be at least as secure as internally managed data operations, and in many cases significantly more efficient. EDA vendors are betting the same is true for emulation hardware.

“An emulator that is 20% utilized is a nightmare,” said Mentor’s Brunet. “This allows much better utilization of capacity.”

By adding virtualization, emulators can be stacked much like blade servers in data centers, with capacity added as needed for a particular project.
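As a rough illustration of that stacking, the toy allocator below carves a hypothetical emulator farm into capacity units and hands them out per project. It is a sketch under assumed names and sizes, not any vendor's resource manager, but it captures the idea of granting and reclaiming capacity as projects come and go.

```cpp
// Toy capacity allocator, illustrative only. Real emulator farms are managed by
// vendor and in-house job schedulers; the names and sizing here are assumptions.
#include <cstdio>
#include <string>
#include <vector>

struct Job {
    std::string project;
    int units_needed;   // capacity units (e.g., gate-capacity slices) the run needs
};

class EmulatorFarm {
public:
    explicit EmulatorFarm(int total_units) : free_units_(total_units) {}

    // Grant capacity if it is available; otherwise report that the job must queue.
    bool try_allocate(const Job& job) {
        if (job.units_needed > free_units_) {
            std::printf("%s: queued (needs %d, only %d free)\n",
                        job.project.c_str(), job.units_needed, free_units_);
            return false;
        }
        free_units_ -= job.units_needed;
        std::printf("%s: running on %d units (%d left)\n",
                    job.project.c_str(), job.units_needed, free_units_);
        return true;
    }

    void release(const Job& job) { free_units_ += job.units_needed; }

private:
    int free_units_;
};

int main() {
    EmulatorFarm farm(8);   // e.g., 8 stacked capacity units shared across teams
    std::vector<Job> jobs = {{"soc_a_fullchip", 6}, {"soc_b_subsystem", 3}};
    for (const auto& job : jobs) farm.try_allocate(job);
    return 0;
}
```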

“Users have been doing virtual private clouds all along,” said Cadence’s Schirrmeister. “All of the big customers have designed their systems that way with remote access to a box. To take that further and use a cloud mechanism, we have been making progress since 2000 as a way to have access to overage capacity. Typically, we have set this up at the customer site, but we also can host emulation here at Cadence. The next step is hosting in which you have private access to a machine. This isn’t necessarily cheaper. It’s always less expensive for bigger companies to host something internally if they need it 24 x 7. But if you’re a small company and you only need it for a week, it’s cheaper than buying it for a year.”

Changes at the chipmaker level
Just because better tools are available, though, doesn’t mean engineering teams are ready to use them. This has been a common theme in discussions about verification, where tool vendors insist that engineers need to learn to use existing tools more effectively rather than demanding new ones.

There is no shortage of help available in this area, of course. Mentor created a Verification Academy specifically for this purpose. All of the verification vendors run training courses in emulation. But the bigger problem is whether the chipmaker organizations can take advantage of that learning, because better utilization of hybrid emulation requires different groups to work more closely together.

Hybrid emulation also opens the door to more exploratory types of interactions between hardware and software.

“You don’t always have working software to start architectural exploration, but what we are starting to see is companies running software for the next design so they can do more exploration,” said De Schutter. “With that you can see what the effects are, what the architecture needs, and what the software needs. This is still in the conceptual stage. How do you change the software to improve the hardware?”

Brunet noted that networking companies are doing a version of that now using hybrid emulation. “For the networking market, it’s all about bandwidth and getting to market faster. You can still use simulation for the block level, but if you verify traffic at the full-chip level it’s much faster.”

The bottom line is that there is no shortage of options in this market. Raw processing power and massive capacity open up all sorts of new possibilities for how technology originally developed for faster verification can be used from one end of the design flow to the other. The only question now is how chipmakers decide to assemble the pieces and the engineering resources, and what they ultimately decide will give them a competitive edge. And at this point, that remains to be seen.


