Verification 2.0: From Tool To Flow

Future verification will be based on use-case scenarios and will address the system, not just the hardware.

Recently, Cadence held a System-to-Silicon Verification Summit at which companies like Broadcom, Zenverge, NVIDIA, and Ambarella shared their experiences and visions for verification. In one of the keynotes, Brian Bailey presented his vision of how verification would transition from tools to flows.

Brian’s presentation was quite insightful. He started with a brief review of where we currently stand and recalled how the industry has transitioned over the last few decades. In the 1970s, designers verified very small designs at the transistor level, with simple operations and no concurrency, using somewhat unstructured verification and simple patterns of 1s and 0s that were very similar to production tests. They were essentially doing confirmation by examination, without any concept of coverage.

In the 1980s, design sizes grew to the gate level, enabling early chip design for the masses, still with simple operations and low-level concurrency. Verification languages were introduced, enabling directed testing. Confirmation was still done by examination, now usually with waveforms, supplemented by some checks that confirmed the primary objectives. Some code coverage was used at this point.

In the 1990s, designs migrated to the register-transfer level (RTL) with a significant increase in complexity. Because tools for equivalence checking and static timing analysis were not fully in place yet, the transition from gate to RTL was pretty messy. Simulation was able to deal with most designs, with small amounts of concurrency appearing. Intellectual property (IP) and reuse emerged and were used in limited fashion. This decade also saw early adoption of constrained-random verification, which required predictors, checkers, functional coverage, and constraints.
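To make those ingredients concrete, here is a minimal, purely illustrative Python sketch (not any particular verification language or methodology) of constraints, a predictor, a checker, and functional coverage exercised against a toy design-under-test model; the ALU, its opcodes, and the planted bug are all made up for the example.

```python
import random

# Toy stand-in for a design under test: an 8-bit "ALU" model with a planted bug.
def dut_alu(op, a, b):
    if op == "ADD":
        return (a + b) & 0xFF
    if op == "SUB":
        return (a - b) & 0xFF
    if op == "AND":
        return a | b  # planted bug: OR instead of AND, for the checker to catch
    raise ValueError(f"illegal op {op}")

# Predictor: an independent reference model computing the expected result.
def predict(op, a, b):
    return {"ADD": (a + b) & 0xFF, "SUB": (a - b) & 0xFF, "AND": a & b}[op]

# Constraints: the legal stimulus space from which transactions are randomized.
OPS = ["ADD", "SUB", "AND"]
def random_txn():
    return random.choice(OPS), random.randrange(256), random.randrange(256)

# Functional coverage: here just one bin per opcode (real models are far richer).
coverage = {op: 0 for op in OPS}
mismatches = 0

for _ in range(1000):
    op, a, b = random_txn()                      # constrained-random stimulus
    coverage[op] += 1                            # sample functional coverage
    if dut_alu(op, a, b) != predict(op, a, b):   # checker: DUT vs. predictor
        mismatches += 1

print("functional coverage:", coverage)
print("checker mismatches: ", mismatches)        # non-zero because of the planted bug
```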

Personally, I could relate to the 1990s very well. Brian’s description brought back memories of SICAN, where I helped build an IP development team in which we called IP “design objects” at the time, to mimic software reuse. And while my designs in 1992/93 were done in a mix of RTL and gate level, by 1997 my teams had transitioned to RTL and synthesis.

In the next decade, the 2000s, I experienced the EDA vendor’s perspective. Design complexity was now increasing rapidly, with a further transition to RTL and growing amounts of concurrency. Simulation was no longer able to handle all stages of verification – special-purpose hardware was more commonly added as an accelerator and was also used in-circuit, just as my teams had already done in the 1990s with Quickturn’s System Realizer when we verified our audio/video designs. Formal verification tackled specific problems, and hardware/software integration became an issue. While there was a lot of discussion about the electronic system level (ESL), it was really the increasing amount of reuse that provided a working solution for dealing with increased design complexity.

So verification was done at one specific point in the design flow, on RTL that faced a steady advance in complexity; it originally had the single purpose of verifying hardware, and RTL simulation was the anchor, with emulation and formal lending a helping hand. Stating that simulation performance has now hit a wall, Brian proceeded to introduce three new drivers impacting verification:

  • “IP Dominates Design”: Chips do not have to contain “secret sauce” anymore and design has moved to the “product level.”
  • “New Abstractions” have emerged: System complexity demands higher levels of abstraction, concurrency and resource sharing create new kinds of dependencies, and system functionality needs to be expressed in new ways.
  • “New Purposes” of hardware: Hardware is now the enabler for software; it becomes the overhead necessary for software to provide the required functionality and performance.

[Figure: Tools to Flow – the classic verification V-diagram]

With these drivers in mind, Brian then analyzed how verification will migrate, using the classic V-diagram shown in this blog. While in the earlier change from gate to RTL the verification flow was only a transitory phenomenon, caused by the lack of tools for equivalence checking and static timing, Brian considered the transition from RTL to ESL a much larger jump: it involves multiple levels of abstraction, equivalence checking is orders of magnitude more complex, and system-level timing is not formalized.

Brian concluded that this gap will not be collapsed in his lifetime, so he asked the audience to get used to verification being a flow that follows the design process top down and bottom up, addressing a “meet in the middle” scenario. For the top-down architecture process, resource utilization needs to be determined and system performance needs to be verified. The bottom-up assembly verification needs to incorporate IP integration, consider that only a small fraction of the design is unique, and take into account that some issues can only be addressed on accurate models.

The resulting flow requires continuous integration for designs in which functionality exists in software; hardware and software must be designed and continuously verified together, with unified notions of verification in both domains. At the connection points, verification engineers will deal with use-case scenarios annotated with performance requirements. Specifically, for Brian, supporting software means that models of appropriate accuracy, fidelity, and performance are required at all stages in the flow. He sees users utilizing multiple forms of execution – simulation at all abstraction levels, acceleration, emulation, prototyping, and hybrids of all of those – ideally with consistent debug environments for hardware and software.
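Purely as an illustration of what a use-case scenario annotated with a performance requirement might look like at such a connection point, here is a small, hypothetical Python sketch; the scenario name, step names, and numbers are made up, and in a real flow the measurement would come from a virtual prototype, an emulator, a hybrid, or silicon rather than a canned value.

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    steps: list             # abstract use-case steps, not signal-level stimulus
    max_latency_ms: float   # annotated performance requirement

def check(scenario, measured_ms):
    # The same pass/fail criterion applies regardless of which execution platform
    # (virtual prototype, emulator, hybrid, silicon) produced the measurement.
    ok = measured_ms <= scenario.max_latency_ms
    print(f"{scenario.name}: {measured_ms:.1f} ms "
          f"(budget {scenario.max_latency_ms:.1f} ms) -> {'PASS' if ok else 'FAIL'}")
    return ok

boot_to_first_frame = Scenario(
    name="camera_boot_to_first_frame",
    steps=["power_on", "load_firmware", "init_sensor", "capture_first_frame"],
    max_latency_ms=500.0,
)

# Stand-in measurement: a real flow would execute the steps on some platform and
# report the observed latency; here we simply plug in a canned number.
check(boot_to_first_frame, measured_ms=430.0)
```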

So, in summary, we have exciting times ahead of us. Verification will become a flow based on use-case scenarios and will deal with the system, not just the hardware, with “meet-in-the-middle” architectures. At the System-to-Silicon Verification Summit, both Broadcom and NVIDIA gave a first glimpse of some of this when they described their advanced verification flows, including the hybrid use of emulation with virtual prototypes for early software bring-up and verification. I am eager to see where we will be at this time next year, at the next summit!


