2019 – The Year Of The “Dynamic Duo” Of Emulation and Prototyping

How emulation and prototyping complement each other during the verification process.


In technology, we are always trying to figure out when we have reached critical mass, have crossed the chasm, or can even be considered mainstream. We all have seen the adoption curves for consumer products. In design and verification technology, a distinctly B2B setting with fewer end customers than in the B2C domain, the situation is even murkier, as there is no "one flow" to design and verify electronics. So how do you know when you have made it? When you have enough users in one place sharing their experiences, and they confirm how they use your products. And that's exactly what happened in 2019 for emulation and prototyping.

On a personal note, and thinking of technology adoption, of course Clayton Christensen comes to mind. Last week we lost not only one of the greatest technology strategists, far too early in life, but also an amazing human being. Yes, The Innovator's Dilemma has shaped things in my life and at work. But what impacted my life the most were his thoughts on "How Will You Measure Your Life?" I first read Christensen's article during a time of great turbulence and disruption in my life in 2012, and it helped me tremendously to set the coordinates on my life compass. Towards the end of the HBR article, he wrote: "Don't worry about the level of individual prominence you have achieved; worry about the individuals you have helped become better people." Even though I did not know him personally, I feel like I am one of those individuals. Thank you, Clayton Christensen.

Let’s get back to technology.

This is my ninth year leading Cadence's product management efforts for hardware-based verification and software development, and I have always been very vocal about how emulation and prototyping should work together. Last year's blog, "The Changing Landscape Of Hardware-Based Verification and Software Development", quoted the example of Toshiba presenting at CDNLive Silicon Valley 2019. The year prior, in "Different Shades Of Prototyping And Ecosystems: System Development At CDNLive 2018", NVIDIA's "pushbutton prototyping" was front and center. 2017's post "DAC 2017: A Glimpse Of How The Future Is Enabled" featured SiriusXM's combined usage of emulation and prototyping. In 2016, I wrote about "Balancing Emulation and FPGA-Based Prototyping for Software Development", referring to Amlogic's use of emulation and FPGA-based prototyping. Also in 2016, I wrote about Microchip's (at that time, still Microsemi) balanced usage of both technologies in "Stories from the Village Called Hardware-Assisted Development". In 2015, the combined use of emulation and prototyping was front and center in "Top 15 Integrating Points in The Continuum of Verification Engines" as one of the integration points. Already back in 2014, I had mused about the product management challenges my team was facing in driving the direction of both technologies in "The Agony of Hardware-Assisted Development Choices", which, when I look back now, feels like a re-run of "The Agony of Choice" published in 2012.

Earlier last year, in "The Changing Landscape of Hardware-Based Verification and Software Development", I detailed the technical advantages of using the combination of a custom processor-based emulation system and a commercial FPGA-based prototyping system.

Now it is time to declare victory! In December 2019, our combination of emulation and prototyping was featured as “Best of 2019” in “CDNS Protium crazy fast “Palladium-compiles” #1a for Best of 2019” with six detailed written user experiences and “CDNS Palladium wins back user mindshare is #1b as the Best of 2019” with eight detailed written user experiences.

The gist of the story is summarized in the following graph:

Design teams use emulation for verification while the RTL is still less stable. Fast bring-up with multiple compiles per day, as well as simulation-like debug, is unparalleled in a processor-based emulation system: we have full control of the compile, do not need to worry about FPGA timing closure, and debug is non-intrusive.

But there comes a time during a project when the RTL becomes stable. Now, speed becomes crucial for software development and hardware regressions. You no longer have to compile multiple times daily, so commercial FPGAs, despite their longer compile times, become the right option. And with a unified environment, not only can bring-up be done in hours from a stable emulation model, but when a defect is found at the hardware/software interface or in hardware during a regression, it can also be reproduced in emulation using simulation-like debug.

That’s a dynamic duo of engines!

The joint usage of emulation and prototyping took a while to become mature and mainstream, but we are clearly there, as the user feedback shows. Is there a disruption ahead? Looking at the roadmap of high-capacity FPGAs, it is certainly not happening in the next couple of years. But if we have learned anything from Clayton Christensen, we know that it is probably due to come. Until then, the dynamic duo of emulation and prototyping will be the right technology choice.
