Intelligent Verification Offers Hope For “Smartening” Up Verification

New tools focus on shrinking non-recurring engineering costs while reducing potential bugs deep within a design.


By Cheryl Ajluni

As with death and taxes, when it comes to design some things are just inevitable. For one, as design geometries shrink, design complexity will continue to increase. For another, verification is the single most time-consuming and resource-intensive part of the entire design cycle.

While new tools and methodologies have enabled designers to work through many of the existing complexity issues and design innovative new electronic products, the ability to verify the resulting designs has not kept pace. Despite ongoing efforts to improve verification technology so that it keeps up with design creation, a verification bottleneck still exists. Intelligent verification, also known as intelligent testbench, is a concept that uses the design itself to automatically direct test generation, and it now offers design engineers real hope of a breakthrough on this problem. But what exactly does intelligent verification mean, and how will it benefit today's designers? Let's take a closer look.

Under the hood

Many factors are motivating interest in a new verification solution, including the need to increase verification confidence, track hard-to-reach design areas and eliminate test redundancy. Achieving confidence that a design is functionally correct typically requires the verification engineer to make an instinctive call that enough verification has been performed. That approach often leaves the design vulnerable to bugs buried deep within its code structure, bugs that likely will not be found until after fabrication. Growing design size and complexity compounds the problem: more testbench and assertion code must be written, which can lead to redundant tests and which itself must be verified.

Constrained random simulation methodologies, built on hardware verification languages, offer a way to improve verification quality and shorten verification time, but constrained random testing does not scale well to large designs. And because it bases test generation on external input rather than on the structure of the design, a disconnect exists between coverage metrics and test input, making it impossible to close the loop from coverage back into test creation.
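To make that disconnect concrete, here is a minimal Python sketch (all names and constraints are hypothetical, chosen only for illustration): inputs are randomized within simple constraints and coverage is tallied after the fact, but nothing feeds the coverage results back into the generator.

```python
import random

# Hypothetical sketch of constrained random stimulus generation: inputs are
# drawn at random within user-written constraints, and coverage is measured
# only after the fact -- nothing feeds back into how the next test is built.

def constrained_random_stimulus():
    """Pick a bus transaction at random, subject to simple constraints."""
    op = random.choice(["READ", "WRITE"])
    addr = random.randrange(0, 0x1000, 4)      # constraint: word-aligned address
    burst = random.choice([1, 2, 4, 8])        # constraint: legal burst sizes
    return {"op": op, "addr": addr, "burst": burst}

covered = set()  # functional coverage bins hit so far

for _ in range(1000):
    txn = constrained_random_stimulus()
    # ... drive txn into the simulator and check the response ...
    covered.add((txn["op"], txn["burst"]))     # record the coverage bin

# Coverage is reported here, but the generator above never sees it,
# so hard-to-reach bins are hit only by luck, not by intent.
print(f"bins covered: {len(covered)} of 8")
```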

Mark Olen, product manager of Mentor Graphics’ Advanced Functional Verification group, points to another issue with constrained random testing: “Verification toolsets focused on constrained random test generation promote their ability to generate more simulation sequences than ever before, covering more design functionality than even large verification teams can test with directed test sequences. But is quantity alone enough? Real advances in design productivity have not been based on just producing more raw gates, or more raw transistors, merely for quantity sake. They have come from doing things smarter. It’s time for verification to get smarter. Should we generate more test sequences? Sure, but let’s also make sure that each test sequence is useful and tests something new and important.”

This is exactly where intelligent verification comes in. As a form of functional verification, it can be used to verify that a design conforms to its specification prior to fabrication. It performs this task using information derived from the design itself to automatically update the test description, targeting design functionality that has not yet been verified and skipping functionality already covered by existing tests. Algorithms automate the generation of simulation sequences, data and checks from a concise behavioral description of a design's specification.
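As a rough illustration of that closed loop, the following Python sketch (hypothetical names, not any vendor's algorithm) picks each new test to aim at a coverage target that has not yet been hit, so coverage results directly drive test creation rather than being reported after the fact.

```python
import random

# Illustrative sketch of the closed loop that intelligent verification adds:
# coverage results feed back into test selection, so each new test is aimed
# at functionality that is not yet covered.

coverage_targets = {(op, b) for op in ("READ", "WRITE") for b in (1, 2, 4, 8)}
covered = set()

def next_test(uncovered):
    """Pick the next test so that it hits a specific uncovered target."""
    op, burst = random.choice(sorted(uncovered))
    return {"op": op, "addr": random.randrange(0, 0x1000, 4), "burst": burst}

while coverage_targets - covered:
    txn = next_test(coverage_targets - covered)
    # ... simulate txn against the design and check the response ...
    covered.add((txn["op"], txn["burst"]))     # close the loop: update coverage

print("all targets covered with", len(covered), "directed tests")
```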

As a supplement to constrained random simulation methodologies, intelligent verification uses existing logic simulation testbenches, along with the design itself, to automatically determine how to maximize coverage. It also gives the user direction as to why certain coverage points were not reached. The benefits of this approach are significant. Intelligent verification does the following:

  • Increases functional coverage at the module, sub-module and system level, and finds more bugs faster than traditional methods, in turn reducing overall testbench programming;

  • Enables project teams to design with a higher level of confidence, realize improved design quality and dramatically minimize design respins;

  • Avoids the unnecessary random wandering that constrained random test methods rely on to find a bug. Instead, it creates new tests by automatically tracing efficient paths through the design structure to coverage points; and

  • Ensures that no simulation cycles are wasted verifying items that have already been tested.

Options Ahead

While there has been substantial research into intelligent verification, commercial tools that leverage this concept are just beginning to emerge. One solution comes from Nusym Technology. No specific tool name has yet been cited by the company, but it claims that unlike other approaches, which either treat the design as a “black box” or are impractical for use on large designs, its intelligent approach to verification uses insight into the design to automatically create “directed” tests and achieve rapid verification closure.

Another solution comes from Mentor Graphics. Its inFact intelligent testbench automation tool uses systematic algorithms to rapidly produce unique, non-redundant test cases (Figure 1). Users develop testbenches by specifying rules that are compiled into graphs. As Mark Olen explains, “inFact focuses on more verification per cycle, not just more cycles of verification, giving verification engineers the power to test much more functionality, with fewer test sequences. It can completely eliminate redundant sequence generation early in the verification process, achieving functional coverage over 10 times faster than constrained random testing. Later in the verification process, it can allow varying degrees of redundancy to create sequentially robust regression test suites.”

Figure 1. With inFact, intelligent algorithms are used to generate stimulus sequences, monitor the results and ensure generation of non-redundant sequences.
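The graph-based idea can be sketched in a few lines of Python. This is an illustration of the general technique only, with made-up rule names, and not Mentor's actual inFact algorithm: testbench rules form a graph, and each distinct path through the graph becomes one test sequence, enumerated systematically so that no sequence is produced twice.

```python
# Hypothetical rule graph: each node is a step in a test sequence, and each
# edge is a legal continuation. Enumerating paths yields non-redundant tests.
rule_graph = {
    "start":  ["read", "write"],
    "read":   ["single", "burst"],
    "write":  ["single", "burst"],
    "single": ["done"],
    "burst":  ["done"],
    "done":   [],
}

def all_sequences(node="start", path=()):
    """Yield every distinct path through the rule graph exactly once."""
    path = path + (node,)
    if not rule_graph[node]:        # leaf node: a complete test sequence
        yield path
        return
    for nxt in rule_graph[node]:
        yield from all_sequences(nxt, path)

for seq in all_sequences():
    print(" -> ".join(seq))         # each test sequence is generated exactly once
```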

SpringSoft is another player in the intelligent verification market, although the company prefers to refer to it as “verification enhancement.” Recently, SpringSoft acquired Certess and the Certitude functional qualification system, which uses “mutation analysis” technology to inject artificial bugs (faults) into the design to check whether the verification environment catches them or not (Figure 2). When the verification environment does not catch an artificial bug, it’s an indication that the environment would likely miss a real bug in the same area of the design as the injected fault. According to Scott Sandler, president of SpringSoft USA, “At SpringSoft we don’t make a simulator; we make tools that enhance the simulation-based verification flow—such as our debug tools. Our recent acquisition of Certess fits directly into this category.”

Figure 2. The Certitude Functional Qualification System provides an overall assessment of verification quality, exposing a verification environment’s holes and weaknesses.
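Mutation analysis is easiest to see in a small sketch. The Python below is purely illustrative, cast in software terms rather than RTL and not Certitude's implementation: artificial bugs (mutants) are injected into a toy design, the checks are rerun, and any mutant the verification environment fails to catch exposes a hole in that environment.

```python
# Illustrative mutation analysis: inject artificial bugs into the design,
# rerun the verification environment, and flag any mutant it fails to catch.

def design(a, b):
    """Toy design under test: an adder with a saturating output."""
    result = a + b
    if result > 255:
        result = 255                      # saturation logic
    return result

def mutant_bad_add(a, b):
    return a - b                          # artificial bug in the adder

def mutant_no_saturate(a, b):
    return a + b                          # artificial bug: saturation removed

def verification_env(dut):
    """Toy verification environment; returns True if every check passes."""
    return dut(2, 3) == 5 and dut(100, 100) == 200   # never exercises saturation

for name, mutant in [("adder fault", mutant_bad_add),
                     ("saturation fault", mutant_no_saturate)]:
    caught = not verification_env(mutant)  # a good environment should now fail
    print(name, "->", "caught" if caught else "MISSED: hole in the environment")
```

Running this, the adder fault is caught but the saturation fault is missed, pointing at exactly the area of the design the tests never exercise.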

The Bottom Line

While intelligent verification offers myriad benefits that will likely increase its rate of adoption, there are challenges ahead, not the least of which is that verification engineers may be reluctant to change from what they are already doing unless they are forced to. Regardless of when the change to intelligent verification comes, or whether the concept becomes commonplace in the electronics design industry, it is unlikely that verification will cease to be a top concern any time soon. As with death and taxes, it will remain a challenge for designers and design tool providers alike.


