Everything, Everywhere, All At Once: Big Data Reimagines Verification Predictability And Efficiency

Utilizing the data from regression runs to suggest starting points for debug.


Big data is a term that has been around for many years. The list of applications for big data is endless, but the process stays the same: capture, process, and analyze. With new, enabling verification solutions, big data technologies can improve your verification process efficiency and help predict your next chip sign-off.

A big data infrastructure, built with state-of-the-art technologies and embedded within the verification environment, brings all verification metrics together so that resources can be used as efficiently as possible and process improvements can be driven by predictive analysis.

Predictive analytics is a branch of machine learning (ML) and one of the fundamentals that makes big data so compelling when applied to the verification process.

Data mining and predictive analysis

The maturing of data analytics is driving demand for automated decision-making based on reliable predictive analytics (AI/ML). Figure 1 illustrates this maturity curve, showing the progression from descriptive and diagnostic analytics to predictive and prescriptive analytics, which unlocks the full potential of big data.

Fig. 1: Analytic maturity model.

Descriptive analytics, by way of reports and dashboards, describe the dimensions and measurements of any aspect of the process, enabling users to report goals against actual values for any metric.

Diagnostic analytics use more advanced capabilities, such as interactive visualization, to let users drill into the data more easily and discover new insights. Here the user is assumed to be more analytical, using the tools provided to mine the data as desired.

Predictive analytics, applied to rich historical data sets, can predict outcomes at a future time. Prescriptive analytics, which use insights from the predictive models, are integrated into processes to take corrective or optimizing actions.

The debug cycle, and deciding where to start it, are two of the most time-consuming aspects of the verification process. With all coverage and assertion data in one place, it is possible to utilize the data from regression runs to suggest starting points for debug.

The basic principle is this: if every test that covers a particular coverage point passed, the chances of that coverage point being the cause of an error are very low. If every test that covers a given coverage point failed, and none of the tests that did not cover it failed, the chances of that coverage point being the cause of the error are very high. Applied across the tens of thousands of coverage points in a typical design, this makes it possible to pinpoint likely causes of the failures.

This algorithm works alongside a triaging process to pinpoint a single error as a starting point. It assumes a significant number of both passing and failing tests, that the tests do not produce 100% coverage, and that the coverage of each test has minimal overlap with that of the other tests. Essentially, for each coverage point we want to calculate the probability that a test fails given that it covers that point. A rating can then be computed for each coverage point, indicating how much additional information it contributes about the overall pass/fail status of the tests and how closely its coverage signature matches the pass/fail signature of the regression suite. The coverage point whose signature correlates most closely with the pass/fail signature is probably the one closest to at least one cause of the failures. This is an example of what data mining makes possible when all information from a regression is gathered in one place.
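To make the idea concrete, here is a minimal Python sketch of this kind of scoring, assuming per-test pass/fail results and covered-point sets are available as simple Python structures. The data layout and the scoring formula are illustrative assumptions, not the actual algorithm used by Questa Verification IQ.

```python
# Illustrative sketch: rank coverage points by how well their coverage
# signature correlates with the pass/fail signature of a regression.
# The scoring here is a simplified stand-in, not a vendor algorithm.

def rank_coverage_points(results):
    """results: dict mapping test name -> (passed: bool, set of coverage points hit)."""
    all_points = set()
    for _, (_, hits) in results.items():
        all_points |= hits

    failing = {t for t, (passed, _) in results.items() if not passed}
    scores = {}
    for point in all_points:
        covering = {t for t, (_, hits) in results.items() if point in hits}
        hit_and_failed = len(covering & failing)    # failures that hit the point
        missed_but_failed = len(failing - covering)  # failures the point cannot explain
        hit_and_passed = len(covering - failing)     # passes that hit the point anyway
        # Simple suspicion score: high when failures line up with coverage of the point.
        scores[point] = hit_and_failed - hit_and_passed - missed_but_failed
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

regression = {
    "test_a": (True,  {"cp1", "cp2"}),
    "test_b": (False, {"cp2", "cp3"}),
    "test_c": (False, {"cp3"}),
}
print(rank_coverage_points(regression))
```

In this toy regression, cp3 is hit by every failing test and by no passing test, so it ranks highest as a suggested starting point for debug.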

An all-encompassing, enabling solution

Data mining and predictive analysis can be applied to the data generated throughout the verification process. A big data infrastructure or platform, built with state-of-the-art technologies for the verification environment, brings together all verification metrics, allows resources to be used as efficiently as possible, and significantly improves the predictability of the entire process.

Fig. 2: Collaborative, data-driven verification transforms the verification process using analytics, collaboration, and traceability.

Questa Verification IQ is a data-driven verification solution which not only provides a central metrics platform but also incorporates applications for scalable verification management using a web framework. It natively supports collaboration, transforming the verification process using analytics and traceability.


The solution initially consists of four applications.

Testplan Author is a plan and requirements driven front-end, providing a collaborative testplan editor to capture the testplan that drives the verification process.

Regression Navigator is the collaborative front-end to our regression engine, providing visibility and controllability of regressions in a device and OS independent interface.

Coverage Analyzer accelerates coverage closure by applying analytics to the coverage closure problem.

Verification Insight provides the ability to build project metric dashboards. It stores data throughout the verification process and across multiple projects, with features for historical data management, allowing ML-based predictive and prescriptive analytics to be applied to accelerate and improve the verification process.
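As a simple illustration of the kind of predictive analytics that such historical data enables, the sketch below fits a logarithmic trend to hypothetical coverage results from past regressions and projects when a coverage target might be reached. The data points, the curve choice, and the function names are assumptions for this sketch only; they do not describe Verification Insight's built-in models.

```python
import math

# (regression number, total coverage %) gathered over a project's history.
# These values are made up for illustration.
history = [(1, 42.0), (5, 61.0), (10, 72.5), (15, 79.0), (20, 83.5)]

def fit_log_trend(points):
    """Least-squares fit of coverage ~ a + b*ln(run) as a crude closure model."""
    xs = [math.log(run) for run, _ in points]
    ys = [cov for _, cov in points]
    n = len(points)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

def runs_to_reach(target, a, b):
    """Regression number at which the fitted trend crosses the target coverage."""
    return math.exp((target - a) / b)

a, b = fit_log_trend(history)
print(f"Projected regression run for 95% coverage: {runs_to_reach(95.0, a, b):.0f}")
```

A real deployment would draw on richer metrics (failure rates, bug discovery curves, resource usage) and more robust models, but the principle is the same: historical data turned into a forward-looking estimate of sign-off.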

Conclusion

Big data is not just about harvesting vast amounts of data; it is also about exploring the relationships that emerge when data captured from the verification process is combined. Using readily available technologies, it is possible to start gathering very valuable data that can help improve your verification process efficiency and predict chip sign-off in today's and, more importantly, tomorrow's projects.

For an in-depth treatment of this topic, including a primer on the history and applications of big data and the technologies, metrics, and processes that deliver its power into your hands, please read the recent Siemens whitepaper Improving verification predictability and efficiency using big data.


