SPONSOR BLOG

Welcome Verification 3.0

When considering the future of verification, don’t forget the human factor.


Leave it to Jim Hogan, managing partner of Vista Ventures, to look further out at the changing horizon of verification than the rest of us and to make sense of it in what he calls Verification 3.0. In his executive summary, he outlined the significant advancements in functional verification over the past 20 years, such as hybrid verification platforms in Verification 1.0 and hardware/software co-verification in Verification 2.0. Verification 3.0 will take a more systematic approach, driven by numerous trends that will shape the next decade. Hogan nailed some of the key developments and emerging trends in the semiconductor industry that will further define Verification 3.0, and concluded that no single company will be able to tackle the challenge.

While most of Hogan’s analysis is sound, he missed a crucial leg in the quest for full verification coverage: the human factor. Verification 1.0 was carried out by designers. Verification 2.0, particularly with constrained-random simulation, saw the rise of specialized verification teams more at ease with large, software-like projects than with RTL code. This separation of skills and responsibilities has a dark side. Designers do not use UVM to explore and test specific aspects of their code prior to commit. They may have a range of linting tools available, perhaps also some existing tests, but no way to quickly and deeply explore the implications of their latest bug fixes or of implementation choices for a new or existing feature. Additionally, as regression testing grows, debugging failures often becomes a bottleneck, because only designers can dig deep into the RTL code.

Verification 3.0 will have to tackle these shortcomings. There is already evidence, mostly in software but also in hardware, that agile development practices significantly improve quality and efficiency. Formal has a crucial role to play in agile RTL development. Designers can use formal to explore their code and quickly identify functional, performance, and other issues without the burden of building a full-blown verification environment. While this does not deliver on the promise of “correct by construction” (SystemC and high-level synthesis did that in part, at least for some specific, datapath-oriented classes of designs), the implications are significant. Formal experts, on the other hand, are familiar with RTL and, when it comes to debugging assertion failures, they can dive deep into the design, at times to the point of suggesting a comprehensive RTL fix. Pair programming, with fewer role boundaries and more code sharing, could partly reconcile the design-verification and hardware-software divides.
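
To make this concrete, below is a minimal sketch of how a designer might explore an implementation choice with formal rather than a UVM testbench. The occupancy-counter module, signal names, and properties are hypothetical, chosen only to illustrate the kind of inline SVA assertions and cover properties a formal tool can analyze in seconds.

  // Hypothetical example only: a small FIFO occupancy counter with inline SVA,
  // meant to be analyzed by a formal tool during design exploration.
  module fifo_ctrl #(parameter int DEPTH = 4) (
    input  logic clk,
    input  logic rst_n,
    input  logic push,
    input  logic pop,
    output logic full,
    output logic empty
  );
    localparam int CNT_W = $clog2(DEPTH) + 1;
    logic [CNT_W-1:0] count;

    // Occupancy counter; pushes are dropped when full, pops when empty.
    always_ff @(posedge clk or negedge rst_n) begin
      if (!rst_n)
        count <= '0;
      else
        count <= count + (push && !full) - (pop && !empty);
    end

    assign full  = (count == DEPTH);
    assign empty = (count == 0);

    // Safety: the occupancy counter must never exceed DEPTH.
    a_no_overflow: assert property (@(posedge clk) disable iff (!rst_n)
                                    count <= DEPTH);

    // Sanity: full and empty must never be asserted together.
    a_full_xor_empty: assert property (@(posedge clk) disable iff (!rst_n)
                                       !(full && empty));

    // Exploration: is the full state actually reachable from reset?
    c_reach_full: cover property (@(posedge clk) disable iff (!rst_n) full);
  endmodule

The cover property is the exploratory part: the tool either returns a concrete trace reaching the full state or proves it unreachable, giving the designer immediate feedback on the latest change without writing any stimulus code.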

Hogan correctly identifies the expanding role of verification. Constrained-random simulation has mostly reached a plateau in scope and adoption. The portable stimulus standard and associated tools will deliver much-needed efficiency for use-case testing at the system level across various platforms. The practice of hiring scores of consultants to implement disposable SoC-level directed tests might soon become a thing of the past. Formal adoption will continue to grow and become essential to address some of the toughest challenges of safety and security analysis and verification. Hard-to-reach misuse cases, malicious or otherwise, are easy to miss during planning and unlikely to be hit during random testing. No technology is better suited than exhaustive formal to treat safety and security with the attention they increasingly deserve.
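
As a sketch of the kind of misuse case formal is suited for, consider the hypothetical checker below. The block, signal names (lock, dbg_rd_en, key_q, dbg_rdata), and properties are illustrative assumptions, not taken from Hogan’s paper. The assertion asks whether any sequence of inputs can ever expose a stored key over a debug port while the device is locked, and the cover property confirms the attack scenario is even reachable.

  // Hypothetical security checker; all names are illustrative assumptions.
  // It would typically be attached to the actual RTL with a bind statement.
  module key_leak_props (
    input logic        clk,
    input logic        rst_n,
    input logic        lock,       // 1 = device locked, key must stay hidden
    input logic        dbg_rd_en,  // debug-port read strobe
    input logic [31:0] key_q,      // stored secret key
    input logic [31:0] dbg_rdata   // data returned on the debug port
  );
    // Security: while locked, no debug read may return the key value,
    // no matter how that read was reached. Formal explores every path.
    a_no_key_leak: assert property (@(posedge clk) disable iff (!rst_n)
                                    (lock && dbg_rd_en) |-> (dbg_rdata != key_q));

    // Misuse case: check that the "attacker" scenario, a debug read while
    // locked, is reachable at all; random tests rarely target it.
    c_locked_read: cover property (@(posedge clk) disable iff (!rst_n)
                                   lock && dbg_rd_en);
  endmodule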

Hogan’s Verification 3.0 opens a discussion that will continue at many industry events, engineering meetings, and beyond. Welcome, Verification 3.0!



2 comments

Jim Hogan says:

Thanks for the post. It should get interesting.

Sergio Marchese says:

Indeed. Probably the most exciting time to be a verification expert since Specman popularized the role.
