Systems & Design
SPONSOR BLOG

AI-Driven Verification Regression Management

Automation of repetitive steps with proactive management to measure efficacy and maximize efficiency.


By Paul Carzola and Taruna Reddy

Coping with the endless growth in chip size and complexity requires innovative electronic design automation (EDA) solutions at every stage of the development process. Better algorithms, increased parallelism, higher levels of abstraction, execution on graphics processing units (GPUs), and use of AI and machine learning (ML) all contribute to these solutions. However, perhaps the most fundamental technology is automation of repetitive steps through regression systems, with proactive management to measure efficacy and maximize efficiency.

It all starts with simulation

Forty years ago, many chip designers were content with hand-running a few logic simulations and looking at waveforms to check for proper operation before fabrication. This ad hoc process was quickly replaced with dedicated verification engineers, written test plans, and increasingly automated self-checking tests. The concept of a regression suite—a set of tests rerun every time that the design changed—was central to this evolution in chip verification. Typically, “runner” scripts were used to package each test and then execute the complete suite.
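The "runner" idea above can be sketched in a few lines. This is a minimal illustration, not any particular vendor's tool: the test names and the `run_test` callable are hypothetical stand-ins for however a real flow launches a packaged simulator job and reads back its exit code.

```python
def run_regression(tests, run_test):
    """Run each packaged test via run_test(name) -> exit code,
    and collect a pass/fail result per test."""
    results = {}
    for name in tests:
        # Convention assumed here: exit code 0 means the
        # self-checking test passed, anything else is a failure.
        results[name] = "PASS" if run_test(name) == 0 else "FAIL"
    return results

# Stand-in for launching a simulator; a real runner would invoke
# the simulation command (locally or on a farm) and return its status.
def fake_sim(name):
    return 0 if name != "fifo_overflow" else 1

summary = run_regression(["smoke_reset", "alu_random", "fifo_overflow"], fake_sim)
```

Everything beyond this loop (job scheduling, retries, result databases) is what grew into full regression management systems.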

In subsequent years, this basic regression automation flow was supplemented in many ways. Simulations mostly ran register-transfer-level (RTL) models rather than gate-level netlists. Scripts grew more sophisticated to support server farms, grids, and the cloud. Code coverage and functional coverage metrics were gathered for each test and rolled up at the end of the regression run. Verification plans became executable, with pass/fail test results and aggregated coverage metrics automatically annotated onto the plan after each run.
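The per-test coverage roll-up described above amounts to a set union followed by a percentage. A toy sketch, with made-up bin names standing in for real functional-coverage bins:

```python
def roll_up_coverage(per_test_bins, all_bins):
    """Merge the coverage bins hit by each test in the regression
    and report overall coverage as a percentage of all defined bins."""
    hit = set()
    for bins in per_test_bins.values():
        hit |= bins  # union: a bin counts if any test in the suite hit it
    return len(hit) / len(all_bins) * 100.0

# Hypothetical bins and per-test results for illustration only.
all_bins = {"pkt_small", "pkt_large", "pkt_error", "pkt_burst"}
per_test = {
    "alu_random":    {"pkt_small", "pkt_large"},
    "fifo_overflow": {"pkt_large", "pkt_error"},
}
pct = roll_up_coverage(per_test, all_bins)  # 3 of 4 bins hit -> 75.0
```

The same merged number is what gets annotated back onto the executable verification plan after each run.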

Although the focus was on functional simulation of RTL models, in many cases other types of simulation runs were automated as well. These include SPICE and FastSPICE analog and mixed-signal runs, fault simulations to check the fault coverage of proposed manufacturing tests, and any gate-level netlist simulations of the selected manufacturing tests required by the chip foundry. Although such simulations ran far fewer times than RTL regressions, they still ran multiple times and therefore benefited from automation and management.

Moving beyond simulation

Faster and easier regression runs made it more apparent how much of the verification cycle was spent on debug, so this became the next focus for automation. Regression tools added debug automation capabilities to sort failing tests into bins, triage them, categorize them by root cause, and perform root cause analysis (RCA) to guide verification engineers. Other features, such as forward and backward temporal debug, transaction-aware debug, and reduced debug run time, further cut the amount of human effort needed to find and fix design bugs.
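The failure-binning step above can be illustrated with a small sketch. The idea (a simplification of what commercial debug automation does) is to normalize away test-specific details such as times and data values, so failures that share an underlying cause collapse into one bin and get debugged once. The log messages here are invented for illustration:

```python
import re
from collections import defaultdict

def bin_failures(failure_logs):
    """Group failing tests into bins keyed by a normalized error
    signature, so one suspected root cause is triaged once rather
    than once per failing test."""
    bins = defaultdict(list)
    for test, message in failure_logs.items():
        # Replace hex values and numbers with a placeholder so that
        # failures differing only in time or data share a signature.
        signature = re.sub(r"0x[0-9a-fA-F]+|\d+", "N", message)
        bins[signature].append(test)
    return dict(bins)

logs = {
    "alu_t1":  "mismatch at time 1200: expected 0x3f got 0x40",
    "alu_t2":  "mismatch at time 880: expected 0x7 got 0x8",
    "fifo_t1": "assertion fifo_full failed at time 500",
}
bins = bin_failures(logs)  # two bins: one ALU mismatch bin, one FIFO bin
```

Real tools go much further, correlating failures with design revisions and waveform activity, but the binning principle is the same.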

Yet another extension of capabilities came with support for verification engines beyond simulation. Hardware accelerators often ran many of the same tests as RTL simulation, so these runs were also automated and managed. Emulators and FPGA prototyping systems were typically decoupled from any testbench, and the tests they ran involved embedded software as well as hardware, but in some cases these were also linked to the regression management system.

Static design analysis such as lint, super-lint, and formal checks are different from simulation in many ways, but they are also re-run (regressed) whenever the design changes. They also produce pass/fail test results and coverage metrics that can be managed and annotated back into the verification plan. Managing all engines in one regression automation solution yields a much more efficient verification process, greatly reduces redundant work, and provides a comprehensive snapshot of progress at every point in the chip project.
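Merging results from multiple engines into one plan snapshot, as described above, can be sketched as a simple status-merge rule. The engine names, plan items, and the "pass only if every targeted engine passes" policy are illustrative assumptions, not any specific tool's semantics:

```python
def annotate_plan(plan_items, engine_results):
    """Annotate each verification-plan item with a merged status
    from every engine (simulation, formal, lint, ...) that targets it."""
    snapshot = {}
    for item, engines in plan_items.items():
        statuses = [engine_results.get(e, {}).get(item, "NOT_RUN") for e in engines]
        if all(s == "PASS" for s in statuses):
            snapshot[item] = "PASS"       # every targeted engine passed
        elif "FAIL" in statuses:
            snapshot[item] = "FAIL"       # any failure flags the item
        else:
            snapshot[item] = "PARTIAL"    # some engines have not run yet
    return snapshot

# Hypothetical plan: each item lists the engines expected to verify it.
plan = {"reset_behavior": ["sim", "formal"], "bus_protocol": ["sim", "lint"]}
results = {
    "sim":    {"reset_behavior": "PASS", "bus_protocol": "PASS"},
    "formal": {"reset_behavior": "PASS"},
    "lint":   {"bus_protocol": "FAIL"},
}
snapshot = annotate_plan(plan, results)
```

A merged view like this is what gives the "comprehensive snapshot of progress" across engines at any point in the project.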

Putting all the pieces together with AI/ML

As in so many engineering endeavors, and EDA in particular, AI and ML are greatly enhancing chip verification and regression management. Designers and verification engineers have a unique chance to hear the whole story at a May 1 webinar on boosting verification efficiency with Synopsys VC Execution Manager (VC ExecMan), a cutting-edge solution to streamline and enhance verification processes. It provides a unified cockpit across all engines for automated regression test planning, execution, management, result analysis, debug, and coverage closure.

This webinar will focus on the seamless integration of advanced AI/ML technologies such as Synopsys Verification Space Optimization (VSO.ai) for optimized coverage convergence and Synopsys Verdi Regression Debug Automation (RDA) for efficient debugging. The comprehensive capabilities of Synopsys VC ExecMan support the entire verification flow and enable advanced management methodologies such as continuous integration/continuous deployment (CI/CD).

Webinar attendees will learn how to elevate their verification strategy with the latest AI/ML-driven advancements. They will be fully ready to automate repetitive steps, minimize manual effort, optimize their use of compute resources, enhance their verification results, maximize their coverage, and tape out faster with higher quality.

Taruna Reddy is a staff product manager for the Synopsys EDA Group.


