Prototypes Proliferate

What will it take to make hardware prototyping as ubiquitous as emulation?

Hardware prototyping and emulation have been two sides of the same coin ever since the FPGA became a commercial success. Early emulators were all built from FPGAs, and most were used in-circuit, much like prototypes are today. More recently, emulation has become a major piece of the verification flow, to the point where most emulators now use custom chips to power them. At the same time, there has been much less advancement in prototyping.

What will change, and will that change inject enough investment into prototyping to make it as indispensable as emulation?

“Prototyping is following the same maturity curve that emulators did, but they are farther behind on the curve,” says Doug Amos, product marketing manager for ASIC prototyping at Mentor, a Siemens Business. “This is because prototyping has some unique problems.”

The need for speed
Before getting into the problems, it may be helpful to define the space in which prototyping plays. “An undisputed advantage of prototyping is speed,” says Zibi Zalewski, general manager of the Aldec Hardware Division. That speed is what makes prototyping unique, and it is also the inherent source of its problems.

“The prototype is nothing more than a pre-silicon model that has a combination of speed and accuracy that you don’t have in anything else,” Amos agrees. “However, you don’t get the same visibility. You can draw a three dimensional graph of visibility, speed and accuracy and you can plot things like virtual models, emulation, simulation and prototyping. They all have a place in that matrix. Prototyping can do some of the other things, but there are better ways to do them.”

Where solutions fall within that graph has a lot to do with the quality of the tools and the team using them. “It is a question of the results you want to achieve and how much effort you are willing to spend,” says Frank Schirrmeister, senior group director for product management and marketing for emulation, FPGA-based prototyping and hardware/software enablement within Cadence. “The difficulty is that you have to remap your design to a different target technology. The challenges are related to clock, memory and others. So there are different library components and different target speeds.”

The same could have been said for early emulators. “The complexity grows with the size of the design, and that influences the popularity of prototyping,” explains Zalewski. “There are automatic setup tools available, but they usually impact speed, which is the biggest benefit of prototyping. If the speed decreases, users consider emulation instead, with its automatic setup, advanced debugging and MHz-range speeds.”

Several ways have been developed to make this easier. “Many people try and cut the design down so that it fits in an FPGA,” says Schirrmeister. “If you have not cut the design down enough, or you get close to full utilization of the FPGA, then you stretch the place-and-route tools and it becomes more difficult to close timing.”

And there is a strong desire to fit the design into a single FPGA. “A third to half of all prototypes are single-FPGA,” says Amos. “It is a bonus if you can do that because it just goes faster. When you are in one FPGA, you can get about 40MHz, but as soon as it grows and splits into two FPGAs, it is likely to drop to about 10MHz.”
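
To see why crossing the single-FPGA boundary costs so much speed, consider the partitioning problem itself. Below is a minimal sketch, assuming a toy netlist with hypothetical module names and capacities: every net cut by the partition must cross between chips over a limited number of pins, so tools search for a split that minimizes the cut while keeping each half within capacity.

```python
# Toy illustration of the multi-FPGA partitioning problem: split a netlist
# into two FPGAs while minimizing the nets that cross between them, since
# every crossing net must be carried over scarce I/O pins.
# Module names, sizes, and capacities are hypothetical.
from itertools import combinations

modules = {"cpu": 60, "gpu": 55, "dma": 15, "ddr_ctrl": 20, "usb": 10, "noc": 25}
nets = [("cpu", "noc"), ("gpu", "noc"), ("dma", "noc"),
        ("ddr_ctrl", "noc"), ("usb", "dma"), ("cpu", "dma")]
CAPACITY = 100  # LUTs per FPGA (arbitrary units)

def cut_size(group_a):
    """Number of nets crossing the partition boundary."""
    return sum((u in group_a) != (v in group_a) for u, v in nets)

best = None
names = list(modules)
for r in range(1, len(names)):
    for group_a in combinations(names, r):
        a = set(group_a)
        size_a = sum(modules[m] for m in a)
        size_b = sum(modules.values()) - size_a
        if size_a <= CAPACITY and size_b <= CAPACITY:
            cut = cut_size(a)
            if best is None or cut < best[0]:
                best = (cut, a)

cut, a = best
print(f"FPGA A: {sorted(a)}  FPGA B: {sorted(set(names) - a)}  cut nets: {cut}")
```

Real netlists have millions of cells, so commercial tools rely on heuristics rather than the exhaustive search shown here, and every crossing net they cannot avoid eats into the achievable clock rate.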

Automation is a double-edged sword. “We use the same front end compile technology for emulation and prototyping,” continues Schirrmeister. “Automation tends to limit you on speed. With full automation, a design will typically run 3X to 10X the speed of emulation, but if you want to get to tens of MHz so that you can run PCI express natively, then you have to optimize manually. That is the big tradeoff – spend more work and get it to run faster.”

In many cases, it is reaching speeds that enable live-data streaming that sets the bar on performance. For many companies, this means waiting until the bug rate has dropped enough that they can justify the effort of optimizing for speed.

Advances in emulation

Emulation started off in the same place where prototyping remains today. What allowed it to move ahead?

“There have been great strides on the emulation side to provide more automation around some of the boundaries,” says Drew Wingard, CTO at Sonics. “For example, if you want to run the processor virtually on the host, then that can talk to the emulated system. That opened up new use models.”

The use model that really brought about development of emulation was RTL verification. Designs were getting larger and simulators were no longer enjoying the ride from faster processors with each new technology node.

“The emulator is now part of the verification flow and you can’t wait until the RTL is mature because you are in the process of maturing it,” says Amos. “Prototyping uses the FPGA differently than early FPGA-based emulators, and some people still use them today. But the way FPGAs are used in emulation boxes is fairly dumb. By this I mean that an FPGA is a wonderful piece of silicon that has all sorts of amazing clocking systems, collections of logic, memories, DSP blocks etc. and in most cases they are not utilized fully.”

Many of the new use modes were made possible by the investments in emulation hardware. “Emulators based on custom chips use a different approach, because we know the communications, we know the routability. We do not have to deal with clocking because we reference a higher rate clock that enables us to align all of the clock edges,” explains Schirrmeister. “We have pre-prepared memory models and we do not have the same partitioning issues.”
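
The clock alignment Schirrmeister mentions can be illustrated with a minimal sketch, assuming hypothetical clock periods: each design clock is turned into an enable that fires on particular edges of one fast reference clock, so every domain’s edges coincide with a reference edge by construction.

```python
# Sketch of the "higher-rate reference clock" idea: instead of running many
# asynchronous clocks, the emulator steps one fast clock and turns each design
# clock into an enable that fires on the right reference edges.
# Periods here are hypothetical.
REF_PERIOD_PS = 1000                                                 # fast reference clock
design_clocks = {"cpu_clk": 4000, "bus_clk": 8000, "io_clk": 12000}  # ps

def enables_at(ref_cycle):
    """Which design clocks see a rising edge on this reference edge?"""
    t = ref_cycle * REF_PERIOD_PS
    return [name for name, period in design_clocks.items() if t % period == 0]

for cycle in range(13):
    ticks = enables_at(cycle)
    if ticks:
        print(f"ref cycle {cycle:2}: clock enable for {', '.join(ticks)}")
```

Because no design clock ever toggles between reference edges, there are no races between clock domains left to resolve.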

Custom hardware also provides other benefits. “Emulators, particularly those based on custom chips, have a number of advantages,” says Dave Kelf, vice president of marketing for OneSpin Solutions. “They tend to offer faster compile times and therefore a tighter debug turnaround loop, they have greater visibility into the design, and have better connections with simulation and the EDA flow.”

Prototype penetration
“A recent survey said that 80% of users were doing FPGA prototyping,” claims Schirrmeister. “Even small designs that fit into a single FPGA use prototyping, but most of those teams just do it in-house. In the IoT space, providers often have reasonably priced boards on which to prototype your design. The same goes for ARM, whose kits will often contain a little FPGA board.”

Mentor’s Amos says that “the penetration of prototyping is high, and the penetration of commercial hardware within those is pretty high. There are very few customers who still make their own boards in volume. There used to be a lot more in the past, but they have switched to commercial hardware. The only reason to keep making your own is pure volume. If you need hundreds of copies, then by developing in-house you can save money.”

The predominant use model for prototyping is software development. “It is the hardware guy who decides what goes into the prototype board and the software team who decide how many they can afford to buy,” points out Schirrmeister. “Once silicon is back, you switch to the real system. So the penetration will increase because not using a hardware-based method for verification is really risky, especially if software is involved.”

Fig. 1: Total Cost of Ownership for Emulation vs. FPGA-Based Prototyping Platforms. Courtesy of Microsemi Corporation

That means the timeframe during which the prototype is most useful is fairly small. “Expanding it would mean the whole RTL code development is done extremely fast for the whole chip,” says Aldec’s Zalewski. “Right after simulation, verification engineers would have to map the whole design to FPGA boards.”

Amos agrees. “You have to wait for the RTL to reach a certain level of maturity before you can think of bringing it up in the prototype. The window of use is quite finite, from when the RTL becomes mature to when you get first silicon back. It also may take weeks or months to bring the prototype up, which eats into that window. And yet people still do it.”

Hichem Belhadj, chief systems architect for Microsemi Corporation, puts some numbers on it. “Emulation platforms, once mastered, can be set up, configured and brought up in days. Most of Microsemi’s V&V teams are now capable of achieving two- to three-day bring-up times for several SoCs. FPGA-based prototyping vehicles took several weeks to months due to the tedious manual partitioning and the need for multiple iterations to achieve timing closure. The bring-up time for FPGA-based platforms can be shortened when the team is made up of FPGA design experts.”

Given these difficulties, alternative methodologies have been developed. “SoC designs are first modeled using virtual prototypes, so the software teams can start working on drivers, firmware and application layers,” adds Zalewski. “In parallel, hardware teams work on RTL, simulate submodules, and integrate them into the SoC. This process takes a lot of time, and software teams need to rely on high-level models. Emulation can help here, with a co-emulation mode connecting virtual platform models and the available RTL subsystems. That allows the software teams to work against an incrementally growing hardware design, rather than virtual models, until SoC-level prototyping becomes possible.”
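
The co-emulation pattern Zalewski describes can be sketched at the bus level. In this toy example, where all block names and the transactor interface are hypothetical, accesses to blocks whose RTL already exists are serviced through an emulator transactor, while the rest stay as fast high-level models; the map is updated as each block’s RTL matures.

```python
# Toy sketch of the hybrid flow: software running against a virtual platform
# issues bus transactions; blocks that already exist as RTL are serviced by
# the emulator through a transactor, while the rest stay high-level models.
# All block names and the transactor API are hypothetical.

class FastModel:
    """High-level stand-in for a block whose RTL is not ready yet."""
    def read(self, addr):
        return 0xDEADBEEF  # functionally correct, not cycle accurate

class EmulatedRTL:
    """Transactor stub that would forward the access to the emulator."""
    def read(self, addr):
        # In a real flow this crosses into the emulator (e.g. over a
        # transaction-level channel); here it is just a placeholder.
        return addr & 0xFF

# Address map: swap a model for RTL as each block matures.
bus_map = {
    0x1000_0000: EmulatedRTL(),   # DMA engine: RTL done, runs in emulation
    0x2000_0000: FastModel(),     # GPU: still a virtual-platform model
}

def bus_read(addr):
    base = addr & 0xF000_0000
    return bus_map[base].read(addr)

print(hex(bus_read(0x1000_0042)))  # serviced by emulated RTL
print(hex(bus_read(0x2000_0000)))  # serviced by the fast model
```

Swapping one entry in the map moves a block from model to RTL without touching the software, which is what lets the hardware grow incrementally underneath a running software stack.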

Changes on the horizon
It might appear that the problems associated with prototyping would keep it to the narrow niche in which it fits today, which is primarily as a validation tool rather than a verification tool.

“Some teams do use FPGA prototypes for verification, where they run real test payloads and look at results, but most use it for validation,” says Amos. “This is the first chance you have to build a real system before you get silicon and you can validate the original premise of building the system.”

But what if prototypes became more useful for verification? Today they lack two capabilities: visibility, and an easy way to get tests developed in the traditional verification flow to run on the prototype. That is about to change.

Accellera has been working on the Portable Stimulus Standard, which will provide a new way to model system intent. From that model, tests can be generated that will target multiple platforms, including simulation, emulation and prototyping.

“Portable Stimulus enables prototyping to become another tool in the toolbox instead of just being a validation tool,” says Adnan Hamid, CEO of Breker. “This is primarily enabled because we can use it to fill in coverage holes. The graph defined in Portable Stimulus defines the scenarios that need to be shown to function, and from that we build self-checking testbenches that can run on any number of platforms.”
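
As a rough illustration of that graph idea (this is a toy in Python, not actual Portable Stimulus syntax, and the actions are hypothetical), the sketch below enumerates every legal path through a small producer/consumer scenario graph and renders each path as a self-checking test for a chosen platform.

```python
# Toy version of the Portable Stimulus idea: a graph of actions defines legal
# scenarios; walking the graph enumerates tests, and each action carries its
# own check, so the result is self-checking on any platform.
import itertools

# Each scenario is: configure -> (one producer) -> (one consumer) -> check
producers = ["dma_write", "cpu_write"]
consumers = ["usb_read", "cpu_read"]

def expand_scenarios():
    """Enumerate every legal path through the scenario graph."""
    for prod, cons in itertools.product(producers, consumers):
        yield ["configure_mem", prod, cons, "check_data"]

def emit_test(path, platform):
    """Render one path for a target platform (sim, emulation, prototype)."""
    body = "\n".join(f"    run_{platform}_step('{step}')" for step in path)
    return f"def test_{'_'.join(path[1:3])}_{platform}():\n{body}"

for path in expand_scenarios():
    print(emit_test(path, "prototype"))
    print()
```

Unexercised paths correspond to coverage holes, and the same paths can be re-rendered for simulation, emulation, or the prototype without rewriting the scenario.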

While the prototype would still not be useful for debugging, it could run large numbers of testcases much faster and more cheaply than is currently possible with emulation. When this becomes reality, there will be a role for prototypes much earlier in the development flow. They will not necessarily require the fastest execution speeds, although they can take advantage of them as the prototype matures.

The use of the prototype for hardware verification may have a profound impact on the return on investment for developing hardware prototyping systems further. While this places a heavy burden on the Portable Stimulus Standard, it is yet another reason why this standard may be very important for the entire industry and why it is important that users look at this standard now.

The review period has been extended until the end of October. This standard is potentially the most impactful development since the introduction of RTL. It is important that verification moves up in abstraction and covers not just what we do in verification today, but where it needs to migrate to in the near future.

Related Stories
FPGA Prototyping Gains Ground
The popular design methodology enables more sophisticated hardware/software verification before first silicon becomes available.
Portable Stimulus Status Report
The Early Adopter release of the first new language in 20 years is under review as the deadline approaches.
Hybrid Emulation
Experts at the Table, part 2: Finding the right balance of performance, visibility, turn-around time and verification objectives.


