Emulation’s Footprint Grows

Why emulators are suddenly indispensable to a growing number of companies, and what comes next.

It wasn’t that many years ago that emulation was an expensive tool available to only a few, but it has since become indispensable for a growing number of companies. One obvious reason is the growing size of designs and the inability of simulation to keep up. But emulation also has been going through a number of transformations that have made it more affordable, more usable, and a more complete verification tool.

The changes have evolved along two different axes. One is cost of ownership, which includes maximizing utilization of the equipment while minimizing maintenance costs and the time it takes to use it effectively. The other axis involves extensions of emulation’s capabilities into areas that reach beyond simulation. Those include hardware/software co-verification and power analysis.

But emulation is still trying to find exactly where it fits into the flow and how it works with other tools. There is still a lot of room for improvement.

Cost of ownership
In the early days of emulation, the box would be purchased by a specific design team and most probably used in an in-circuit emulation (ICE) mode. Here the emulator is attached to the real world and subjected to stimulus through rate adapters that could slow the real world down to the speed of the emulator. “When you have a box that is in a lab, it has a cost and a specific use model,” says Jean-Marie Brunet, director of marketing for the emulation division of Mentor Graphics. “You need to have a physical target connected to it. You need a person to run the lab. You need a technician to maintain the lab. And access through the network is maintained by the technician. There is no economy of scale.”

Since then, the emulator has moved into the data center. Many companies have been consolidating their data centers to locations that have cheap electricity and where the costs associated with maintenance are part of the IT budget and distributed across multiple project teams. Those machines are then accessible from anywhere in the world.

In order to make that move some other technologies were required. “Emulation moved from the in-circuit-emulation (ICE) mode where the design under test (DUT) was verified with real traffic data to acceleration mode and then to virtualization mode,” explains Lauro Rizzatti, a verification consultant. “This is made possible with transaction-based communications between the emulator and the testbench running on the workstation.”

The same techniques also can be used to connect the emulator to other aspects of the design. “Hybrid emulation is where you bring together transaction-level processor models with the more accurate pieces of the hardware,” says Frank Schirrmeister, group director for product marketing of the System Development Suite at Cadence. “These all helped to make it more affordable.”

Another aspect of this was the standardization of the transactor interfaces. “The Accellera SCE-MI standard helps build the interfaces for third party connectivity,” says Zibi Zalewski, hardware general manager for Aldec. “The standard enables virtual platforms, transaction-level models (TLM) or simply a C++ testbench to be connected to the emulator. This, in turn, means that emulation can be used much earlier in the design flow. With the growing popularity of UVM, which adds significant complexity to the testbench, there has been an impact on the performance of simulations, and this limits the scope of testing. Emulation and SCE-MI again enable the emulator to be integrated with the UVM testbench to accelerate the simulation and extend the testing scenarios.”
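The pattern Zalewski describes, a host-side testbench exchanging untimed transactions with a synthesizable transactor inside the emulator, can be sketched in a few lines of C++. The classes below are a hypothetical illustration of that split, not the actual SCE-MI API or any vendor interface.

```cpp
// Conceptual sketch of transaction-based testbench-to-emulator communication.
// The channel and DUT-proxy classes are hypothetical illustrations of the
// pattern SCE-MI standardizes; they are not the actual SCE-MI API.
#include <cstdint>
#include <iostream>
#include <queue>

// A transaction as the workstation-side testbench sees it: untimed payload,
// no pin-level or clock-cycle detail.
struct Transaction {
    uint32_t address;
    uint32_t data;
    bool     write;
};

// Stand-in for the transport layer that SCE-MI (or a vendor equivalent)
// provides between the host testbench and the transactor in the emulator.
class EmulatorChannel {
public:
    void send(const Transaction& t) { toEmulator.push(t); }
    bool receive(Transaction& t) {
        if (fromEmulator.empty()) return false;
        t = fromEmulator.front();
        fromEmulator.pop();
        return true;
    }
private:
    std::queue<Transaction> toEmulator;
    std::queue<Transaction> fromEmulator;
};

// Host-side proxy: the UVM or C++ testbench talks to this object, while the
// synthesizable transactor inside the emulator turns each transaction into
// cycle-accurate activity on the DUT interface.
class DutProxy {
public:
    explicit DutProxy(EmulatorChannel& ch) : channel(ch) {}
    void write(uint32_t addr, uint32_t data) {
        channel.send({addr, data, /*write=*/true});
    }
private:
    EmulatorChannel& channel;
};

int main() {
    EmulatorChannel channel;   // in a real flow, backed by the emulator link
    DutProxy dut(channel);
    dut.write(0x1000, 0xCAFE); // testbench issues untimed transactions
    std::cout << "transaction queued for the emulated DUT\n";
    return 0;
}
```

The key design point is that the testbench side stays abstract and reusable, while everything cycle-accurate lives in the emulator, which is what allows the same environment to be used much earlier in the flow.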

But there are limits to this technology. “ICE is still an important use model, and can still be used in the datacenter, and there are still some emulators in the labs dedicated to ICE,” says Brunet. “If a chip has a non-standard interface, then how do you verify that in a virtual environment? You need to have a model and someone has to create that. Most of the time, the customer does not want the EDA company to create that model because they don’t want to share the information. So they either have to write the model themselves or continue to use ICE and the physical target. It is the latter that most of them choose.”

Usage models
While cost of ownership remains important, the biggest advances have been because emulation has tackled new tasks—some of which were never possible with simulation. “ICE used to be the dominant use model,” says Schirrmeister. “Since then we have added more use models from low power to architectural analysis, to software which can’t be done at the necessary speed with simulation. Emulation enables people to cover the necessary use cases.”

Mentor’s Brunet points to similar shifts. “The past two years have seen an acceleration of this trend,” he says. “In the past, emulators were just used for functional verification, but the notion of verification is extending. The virtualization of the use model is expanding the types of users that can use the machine. Power is one example. This expands the horizon of what an emulator can be used for.”

The industry is awash with companies having power-related issues. “One company had problems with their IP not meeting the power budgets when plugged into the system,” says Preeti Gupta, director for RTL product management at Ansys. “They changed their flow to prevent a recurrence of the problem. The current methodologies focus on simulation and generally use the vectors written for functional verification. Those are repurposed for power, and often at the gate level, so it is too late to reduce that consumption. What was necessary was to utilize emulation activity to measure power early in the flow at RTL.”

In fact, power has become an important driver for emulation adoption. “Power emulation enables generation of RTL activity data, toggle data including simple weighted toggles, trading off time and accuracy,” explains Schirrmeister. “This dynamic power analysis can be linked into power estimation, where we can go from RTL to the gate level and get very good correlation and provide a good power picture, that is not just a relative consumption number, but can provide a very accurate number. At TI, they talk about accuracy levels around 95% of actual silicon.”
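As a rough illustration of the arithmetic behind toggle-based power estimation, the sketch below combines per-net toggle counts from an emulation window with assumed capacitance values and weights to produce a dynamic power number. The values and weighting scheme are invented for illustration; production flows derive them from library and extraction data.

```cpp
// Illustrative sketch of weighted-toggle dynamic power estimation: per-net
// toggle counts recorded during an emulation run are weighted and combined
// with switched capacitance, supply voltage, and the measurement window.
// All numbers below are assumptions made up for this example.
#include <cstdint>
#include <iostream>
#include <vector>

struct NetActivity {
    uint64_t toggles;        // toggle count recorded during the emulation window
    double   capacitance_f;  // effective switched capacitance in farads (assumed)
    double   weight;         // weighting factor, e.g. higher for clock nets
};

// Dynamic power ~ 0.5 * C * V^2 * (toggles / window_seconds), summed over nets.
double estimateDynamicPower(const std::vector<NetActivity>& nets,
                            double vdd, double windowSeconds) {
    double power = 0.0;
    for (const auto& n : nets) {
        double toggleRate = static_cast<double>(n.toggles) / windowSeconds;
        power += 0.5 * n.capacitance_f * vdd * vdd * toggleRate * n.weight;
    }
    return power;  // watts
}

int main() {
    std::vector<NetActivity> nets = {
        {1'000'000, 5e-15, 1.0},   // datapath net
        {4'000'000, 2e-15, 1.5},   // clock-related net, weighted more heavily
    };
    double watts = estimateDynamicPower(nets, /*vdd=*/0.8, /*windowSeconds=*/1e-3);
    std::cout << "estimated dynamic power: " << watts << " W\n";
    return 0;
}
```

The trade-off Schirrmeister mentions shows up directly here: simple weighted toggles are cheap to collect over long emulation runs, while more accurate per-net capacitance data costs more effort but tightens the correlation to silicon.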

Another area where significant progress has been made is with software verification. “They have schemes where the processor workload may actually be running on the workstation and coupled to the emulator, so now the software team has a familiar debug environment,” says Drew Wingard, chief technology officer at Sonics. “They can single step through a piece of code. You need that kind of detailed analysis. The emulator does a better job than simulation because it can get deeper into the state space more quickly. Then they can ask questions to understand the power characteristics of their application.”

The emulator also affects how engineering teams work together. “It is still the case that the hardware guys don’t talk to the software guys,” says Brunet. “We also have this problem with power. Power guys do not talk to the guys doing functional verification. We had to bring them together. To run power, do not use the testbench designed for functional verification. Run the appropriate benchmark on the emulator and track the toggle activity. This is real usage. It is the same for hardware and software. They do not provide enough information for each other internally, and that creates a gap that needs to be filled. There are opportunities in that.”

The emulator companies are creating new use models all the time, with recent ones extending into security, functional safety, and DFT.

Melding of the flow
But there are areas in which more work needs to be done to ensure that emulation plays a unified role within the verification flow. “Bring-up is easier, more predictable and straightforward,” says Schirrmeister. “It can usually be done within a week or less. But simulation is not going away and there is still a huge space for it. For example, emulation is only two-state and cycle-based. There is no concept of an X.”
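To see what “no concept of an X” means in practice, the minimal sketch below contrasts four-state logic, where an unknown value propagates and can be flagged, with the two-state view an emulator effectively has, where the unknown has already been collapsed to 0 or 1. It is a conceptual illustration only, not how any particular simulator or emulator is implemented.

```cpp
// Minimal illustration (not from any vendor tool) of why four-state simulation
// can catch issues that a two-state, cycle-based emulator cannot: the unknown
// value X is propagated rather than silently coerced to 0 or 1.
#include <iostream>

enum class Logic { Zero, One, X };  // Z omitted for brevity

// Four-state AND: a controlling 0 dominates; otherwise any X makes the result X.
Logic and4(Logic a, Logic b) {
    if (a == Logic::Zero || b == Logic::Zero) return Logic::Zero;
    if (a == Logic::X || b == Logic::X)       return Logic::X;
    return Logic::One;
}

// Two-state view, as an emulator effectively sees it: X has already been
// collapsed to a concrete value, so the uncertainty is invisible.
bool and2(bool a, bool b) { return a && b; }

int main() {
    Logic uninitializedReg = Logic::X;            // e.g. a flop with no reset
    Logic simResult = and4(uninitializedReg, Logic::One);
    bool  emuResult = and2(/*X collapsed to*/ false, true);

    std::cout << "four-state result: "
              << (simResult == Logic::X ? "X (flagged)" : "known") << "\n";
    std::cout << "two-state result: " << emuResult
              << " (looks clean, issue hidden)\n";
    return 0;
}
```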

Verification uses three major engine classes today. “Emulation complements simulation and formal by effectively verifying bandwidth, performance and system-level requirements for large systems,” points out the president and CEO at Oski Technology. “However, too many teams use emulation for finding block-level corner-case bugs that are found and debugged much more effectively with formal. We have a powerful chain-saw (emulation) and a precision scalpel (formal), and each should be used appropriately—and not misused for the wrong problems.”

With formal and emulation encroaching on simulation, simulation will need to refine its role in the modern verification flow. “While emulation and formal verification are, in general, used for different stages in verification, it is clear that both are encroaching on the traditional domain of simulation,” says David Kelf, vice president of marketing for OneSpin Solutions. “As formal captures more of the mainstream verification tasks and emulation is increasingly leveraged for simulation acceleration, a time can be seen where an IP block is formally verified and passed directly to the emulator. Simulation will be relegated to a more minor function. For this to happen, better support of assertions is required in emulation or, alternatively, redirected testing through some other mechanism may be necessary.”

Accellera is working on a new verification standard that will help with flow integration. “Portable Stimulus will help to tie emulation into the verification continuum by providing a single model of verification intent that can drive virtual prototypes, simulators and emulators,” says the chief executive officer of Breker. “This will further reduce the costs of model development and ensure that the most appropriate use-cases are exercised on the emulator.”
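The retargeting idea behind that single model of verification intent can be pictured as one abstract scenario rendered differently for each engine. The sketch below is a loose, hypothetical analogy in C++, not the Accellera Portable Stimulus language or any vendor implementation.

```cpp
// Hypothetical sketch of retargetable verification intent: one abstract
// scenario description rendered for different execution engines. All names
// and structure are invented for illustration.
#include <iostream>
#include <string>
#include <vector>

// One step of abstract intent: what to exercise, not how.
struct Action {
    std::string name;  // e.g. "cpu_boot", "dma_transfer"
};

// Each engine supplies its own rendering of the same scenario.
class Target {
public:
    virtual ~Target() = default;
    virtual void run(const Action& a) = 0;
};

class SimulationTarget : public Target {
public:
    void run(const Action& a) override {
        std::cout << "simulator: drive testbench sequence for " << a.name << "\n";
    }
};

class EmulationTarget : public Target {
public:
    void run(const Action& a) override {
        std::cout << "emulator: run embedded software test for " << a.name << "\n";
    }
};

void execute(const std::vector<Action>& scenario, Target& target) {
    for (const auto& a : scenario) target.run(a);
}

int main() {
    std::vector<Action> scenario = {{"cpu_boot"}, {"dma_transfer"}};
    SimulationTarget sim;
    EmulationTarget emu;
    execute(scenario, sim);  // same intent, simulator rendering
    execute(scenario, emu);  // same intent, emulator rendering
    return 0;
}
```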

Market penetration
Having seen how far and fast emulation has progressed, it is reasonable to ask how much farther it has to grow. “Networking and processor types of devices, such as mobile and multimedia, are already at 90% penetration,” says Brunet. “They have big chips with a lot of computation, multi-processor so that both hardware and software have to be verified. From a needs perspective they have all of their needs met. Other markets are growing very fast although some remain fragmented.”

Schirrmeister is in full agreement. “There are oodles of growth potential left.”

Both vendors see many new markets that are waiting for them and are adding the necessary pieces as fast as they can. Automotive, medical, security, storage, mil/aero—everywhere they look they see new potential. It is clearly a good time to be an emulator provider.

Related Stories
Too Big To Simulate?
Traditional simulation is running out of steam with autonomous vehicles and other complex systems. Now what?
Power Limits Of EDA
Tools aid with power reduction, but they can only tackle small savings in a locality. To do more would require a new role for the EDA industry.
Gaps In The Verification Flow
Experts at the Table, part 1: The verification task is changing and tools are struggling to keep up with them and the increases in complexity. More verification reuse is required.
Tech Talk: Power Emulation
Using hardware-assisted verification to tackle complex power issues in SoC design.


