Top 7 Verification Trends For 2017—Changes In The Game Of Ecosystems

In the coming year, look for verification to become a whole lot smarter.


As the year 2016 comes to a close, how did my predictions from last year hold up against reality? They were all about horizontal and vertical integration. Spoiler alert: almost all of them have moved closer to reality. Going into 2017, some of these trends will intensify, but the most interesting one to watch will be how the game of ecosystems in mobile, server, and intelligent systems (IoT) progresses, and, of course, how it shapes verification along the way.

But first, let’s look at the predictions from last year. In “Outlook 2016—The Year of Horizontal and Vertical Flow Integration” I said…

RTL simulation gets faster every year. Emulation throughput increases from generation to generation. FPGA-based prototyping offers the highest speeds, and we have reduced its bring-up time to the range of weeks, allowing it to be used much earlier than in the past. Bottom line, the increasing performance of the dynamic verification engines does reduce the need to abstract more aspects of the system. Users can simply combine engines horizontally as needed and combine abstractions vertically from TLM through RTL to gate level, and even the transistor/technology level by bringing in .lib files.

To be fair, in my role I certainly had above-average knowledge about our plans, but most of it has indeed come true, some of it beyond my expectations. I'll intersperse what happened within my predictions below. The common theme is that verification will become a whole lot smarter. The core engines themselves continue to compete on performance and capacity, while differentiation moves further into how smart the applications running on top of the core engines are, and how smartly the engines are used in conjunction.

Here we go:

  1. Parallel simulation goes mainstream: The race toward increased speed and parallel execution in RTL simulation will accelerate. 2016 was a game-changing year during which Cadence acquired Rocketick. Together with flows and methodologies for automotive safety and digital/mixed-signal applications, parallel RTL simulation will look better than ever before.
  2. ROI of emulation moves into its use models: Differentiation between the two basic ways of emulating—processor-based and FPGA-based—will be more and more determined by how the engines are used. Specifically, the various use models for core emulation, like verification acceleration, low-power verification, dynamic power analysis, and post-silicon validation, often driven by ever-growing software content, will extend further, with more virtualization joining real-world connections. Yes, there will also be competition on performance, which clearly varies between processor-based and FPGA-based architectures depending on design size and how much debug is enabled, but the versatility of use models determines the ROI of emulation.
  3. Differentiation for FPGA-based prototypes moves into bring-up and software ease of use: Addressing designers' performance needs for software development, FPGA-based prototypes use the same core FPGA fabrics, so differentiation moves into the software stacks that enable fast bring-up on top of the hardware and advanced software debug. Congruency between emulation and FPGA-based prototyping, using multi-fabric compilation that allows mapping designs into both emulation and FPGA-based prototyping, will increase user productivity significantly.
  4. Horizontal integration between verification engines becomes even smarter: The individual differentiation of verification engines aside, horizontal integration will further increase. Differentiation is more and more determined by smart connections between the dynamic engines and into formal techniques, as I described in "Top 15 Integrating Points in the Continuum of Verification Engines" a while back. Cross-engine verification planning, debug, and software-driven verification (i.e., software becoming the testbench at the SoC level), as currently under standardization in Accellera's Portable Stimulus working group, will enable further improved verification reuse between engines and cross-engine optimization.
  5. Vertical integration further increases flow predictability: Besides horizontal integration between engines—virtual prototyping, simulation, formal, emulation, and FPGA-based prototyping—the vertical integration between abstraction levels will become more critical in 2017 as well. For low power specifically, activity data created from executing RTL in emulation can be connected to power information extracted from .lib technology files using gate-level representations, or to power estimation from RTL. This allows designers to estimate hardware-based power consumption in the context of software, using the deep cycles over long timeframes that emulation provides. As a result, flows become more predictable, with implementation information annotated into pre-RTL and RTL verification as early as possible in the design flow.


  6. Verification becomes even more application specific: Extending the trends of 2016, verification flows will definitely continue to become more application specific in 2017, often centered on specific processor architectures. For instance, verification solutions optimized for mobile applications have different requirements than those for server and automotive, or even aerospace and defense, designs. As application-specific requirements grow stronger, this trend is likely to continue, but cross-impact will also occur (such as mobile and multimedia influencing automotive infotainment).
  7. The game of ecosystems deals with new players: In the balance between mobile, server, and IoT, significant shifts happened in 2016 and are likely to accelerate in 2017. I wrote about the basic situation between mobile, server, and IoT a while back, and I have extended the graph from that blog to show the other players in the graph associated with this post. The mobile domain seems stable, with Intel withdrawing from application processors in 2016. Between OpenPOWER and ARM, we now see non-Intel-based servers, even though in pure market share Intel is still the 99% player. Intel is aligning around servers and IoT, which makes the IoT space a split between Power, MIPS, ARM, and Intel architectures. To make it even more fun, RISC-V is attempting a disruption in that domain, with considerable momentum already thanks to its open-source nature. It is definitely one of the most entertaining spaces to watch in 2017 and for years to come.

With all these trends playing out, 2017 will be another banner year for verification with interesting changes ahead.
