Gaining a holistic view of verification progress without the manual effort.
Design verification has been the dominant portion of chip development for years, and the challenges grow bigger every day. Single dies continue to grow in transistor count and complexity. Advanced techniques such as 2.5D and 3D multi-die systems and emerging technologies such as wafer-scale integration pack even more transistors and functionality into a single device. This situation has created a verification gap, in which the required verification effort grows even faster than the design effort. This has several negative consequences for project teams.
Generating and running enough tests to verify the design is challenging enough, but debugging failures has become a major schedule bottleneck. Even when all tests pass, coverage goals may not have been achieved, leading to the tedious process of coverage closure: tweaking constraints or writing additional tests. Nor do the metrics come from simulation alone; results from emulation, formal analysis, and static verification methods must also be managed and factored into the overall process.
Figure 1 shows the traditional flow used by many chip development teams. While there is usually some automation, such as running simulation tests, there is much manual work involved. Metrics from multiple verification engines must be merged and rolled up in a way that accurately reflects the state of completeness. Debug is often managed by hand, as engineers re-run and examine the results for failing tests. Links back to the verification plan, design specification, and requirements management systems are typically weak or non-existent.
Fig. 1: Traditional verification flow with manual processes.
Given the amount of manual work involved in this flow, it is simply impossible to hire enough engineers, especially system-level designers and verification experts, to complete today’s large chips successfully. Some progress has been made with verification management systems (VMSs) that make it easier to run tests on all platforms, merge coverage results, and annotate those results back to the verification plan. By itself, however, this approach does not help the all-important debug stage, which has remained decoupled.
The verification management capabilities must be integrated into the debug tool to provide a complete solution. This reduces debug turnaround time (TAT) and leverages the knowledge embedded in each step for a holistic view of verification progress. Such a solution has several key requirements. To start, it must make it easy to set up and automatically run simulations, a process called execution management. Figure 2 shows this flow, which can be extended beyond simulation to other verification engines.
Fig. 2: Automation of test execution.
As shown in figure 2, this solution must include an application programming interface (API) that allows engineers to start testing, post test results, stop testing cleanly, and generate custom reports. All these tests generate coverage results, and the VMS must efficiently support management of exclusions for specific coverage targets as well as automated aggregation and merge of coverage results. As shown in figure 3, the VMS capabilities must be integrated with the debug system for the most effective verification flow.
Fig. 3: Automation of coverage management.
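An execution-management API of the kind described above can be sketched as follows. This is a minimal illustration, not the actual Verdi API: the `RegressionSession` class, its method names, and the report format are all assumptions made for the example.

```python
from dataclasses import dataclass, field


@dataclass
class TestResult:
    name: str
    passed: bool
    seed: int


@dataclass
class RegressionSession:
    """Hypothetical execution-management session (illustrative only)."""
    results: list = field(default_factory=list)
    running: bool = False

    def start_testing(self):
        # Begin a regression run; a real VMS would launch jobs on a farm.
        self.running = True

    def post_result(self, name, passed, seed):
        # Record one test outcome as it completes.
        self.results.append(TestResult(name, passed, seed))

    def stop_testing(self):
        # Stop testing cleanly so partial results remain usable.
        self.running = False

    def report(self):
        # Generate a custom summary report from the collected results.
        failed = [r.name for r in self.results if not r.passed]
        return {"total": len(self.results), "failed": failed}


# Usage sketch
session = RegressionSession()
session.start_testing()
session.post_result("smoke_basic", True, seed=1)
session.post_result("dma_burst", False, seed=7)
session.stop_testing()
print(session.report())  # {'total': 2, 'failed': ['dma_burst']}
```

The four calls mirror the capabilities the text lists: start testing, post results, stop cleanly, and generate reports.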
With the automation shown in figure 3, coverage merge and rollup are completely automated across all platforms. The results must be annotated to both the verification plan and the design specification with no manual effort required; figure 4 shows an example. The verification plan includes a list of features to be verified, and coverage results are annotated onto these features so that the management team can assess progress and the verification engineers can determine next steps. The features in the plan are linked to highlighted sections in the design specification.
Fig. 4: Automated verification planning.
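The merge-and-annotate step can be sketched in a few lines. In this simplified model (an assumption for illustration, not how any particular VMS stores coverage), each engine reports which bins of each coverage group it hit; merging is a per-group union, and annotation maps each plan feature to the percentage of its group's bins covered.

```python
def merge_coverage(runs):
    """Union the hit bins across runs; each run maps group -> set of hit bins."""
    merged = {}
    for run in runs:
        for group, bins in run.items():
            merged.setdefault(group, set()).update(bins)
    return merged


def annotate_plan(plan, merged, totals):
    """Annotate each plan feature with percent of its group's bins covered."""
    return {
        feature: 100.0 * len(merged.get(group, set())) / totals[group]
        for feature, group in plan.items()
    }


# Results from two engines (e.g. simulation and emulation)
sim = {"fifo": {"empty", "full"}, "bus": {"read"}}
emu = {"fifo": {"full", "half"}, "bus": {"write"}}
merged = merge_coverage([sim, emu])

plan = {"FIFO flags": "fifo", "Bus transactions": "bus"}
totals = {"fifo": 4, "bus": 2}  # total bins defined per coverage group
print(annotate_plan(plan, merged, totals))
# {'FIFO flags': 75.0, 'Bus transactions': 100.0}
```

The point of the sketch is the rollup direction: raw bins from any engine merge upward into per-feature numbers that attach directly to the verification plan.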
While this set of requirements may appear daunting, there is a commercially available solution today that satisfies all of them. The full set of VMS capabilities is now available within the Synopsys Verdi debug and verification management platform; the figures above were taken directly from the platform in action. It provides a superset of traditional VMS features, including execution management, coverage management, and verification planning.
Integrating verification management with debug allows engineers to take advantage of the many innovations in Synopsys Verdi, including the use of machine learning (ML) to group failing tests likely due to the same bug. As shown in figure 5, this enables the automatic creation of smart probes and their use in test re-runs for deeper diagnosis of failures. ML also comes into play to accelerate the root-cause analysis (RCA) of failures and eliminate a great deal of manual debug. A white paper is available with more information on this capability.
Fig. 5: Synopsys Verdi debug and verification management links.
In addition to back-annotating coverage results to the verification plan and the design specification, Synopsys Verdi provides links to common external tools, including requirements management systems such as Jama and continuous integration systems such as Jenkins. Thus, Synopsys Verdi Planner enables traceable verification across the entire development flow. The execution manager enables comprehensive regression management, collecting results for automatic coverage merge. Debug is integrated smoothly with the verification management features, using a common GUI for a seamless experience.
This powerful solution is available now and has been deployed successfully by many users. A recent blog post on the experiences of MediaTek reported that Synopsys Verdi “significantly helps engineers reduce time spent on root-cause analysis of regression failure, from days to minutes.” There is no reason to suffer with outdated manual regression management methods or to juggle multiple tools to try to automate the process. Synopsys Verdi addresses today’s verification challenges and is fully scalable for tomorrow’s as well.