A coverage-driven hybrid approach of static and dynamic verification for functional CDC verification closure.
By Sudeep Mondal and Sean O’Donohue
Clocking issues are one of the most common reasons for costly design re-spins, and this has been the driving factor behind the ever-increasing demand for Clock Domain Crossing (CDC) analysis tools. Today, the majority of IP and SoC teams focus on “structural CDC” analysis, which is important but not sufficient. Structural CDC analysis ensures that the design has the structures (e.g. synchronizers) in place to manage the metastability effects caused by asynchronous clocks. However, structural CDC analysis is typically driven by user constraints and assumptions, and these constraints and assumptions must themselves be validated for signoff or teams risk missing silicon bugs. Further, because designs must also adhere to pre-defined (functional) CDC protocols to ensure safe data transfer across asynchronous clock domains, so-called functional CDC verification needs a multi-pronged approach through a mix of static, formal and simulation techniques.
Dynamic verification, or simulation, uses a testbench to functionally verify that the design performs to its specification by observing outputs in response to meaningful stimuli. Static verification, on the other hand, verifies the design through structural analysis, logical evaluation and verification of implicit/explicit design properties using formal methods. The comparative benefits and costs of these two approaches are captured in the table below:
For functional CDC verification closure, a coverage-driven hybrid approach of static and dynamic verification is recommended. This flow brings out the best of static and dynamic verification and helps accelerate signoff. The approach is based on automatic generation of verifiable properties for the assumptions, constraints and waivers used during static verification, so that they can be validated during dynamic verification.
CDC Signoff Verification Flow
Figure 1: Verification flow diagram
The flow is composed of three distinct steps:
1. Static Verification Methodology: The design assumptions are encapsulated in the form of constraints, and structural analysis is performed. A complete and thorough validation effort requires structural analysis followed by formal validation. This is because structural analysis can only check that a structure is present; it cannot validate whether that structure will work correctly in all cases.
For example, in Figure 2, a control bus crossing clock domains needs to be gray-encoded to avoid coherency issues.
Figure 2: A bus crossing clock domains
In such a case, structural analysis can only detect the bus crossing and its surrounding logic; it cannot check whether the signals are gray-encoded. This is because a gray-encoder can be built from many different circuits, and structurally identifying every possible variant is not feasible for any static tool. Even if a gray-encoder is successfully identified structurally, the circuit itself may contain a bug (for example, a faulty reset that can randomly switch the output to 00, inducing failures such as a 11->00 transition in which two bits change at once).
Validating the gray-encoding circuit through simulation is only as good as the quality of the testbench. This is where formal validation plays a strong role: it can perform an exhaustive validation using formal techniques while remaining agnostic to the actual structure used in the design.
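For illustration, a gray-encoding requirement of this kind can be expressed as a property that both a formal engine and a simulator can evaluate. The sketch below is a minimal, hypothetical SystemVerilog checker (module, signal and parameter names are assumptions, not taken from the design in Figure 2); it asserts that at most one bit of the crossing bus changes per source-clock cycle, regardless of how the gray-encoder itself is implemented.

```systemverilog
// Minimal sketch of a gray-encoding checker (hypothetical names).
// It is agnostic to the gray-encoder implementation: it only observes
// the encoded bus in the source clock domain.
module gray_code_check #(parameter WIDTH = 4) (
  input logic             src_clk,
  input logic             src_rst_n,
  input logic [WIDTH-1:0] bus        // gray-encoded bus before the crossing
);
  // At most one bit may change between consecutive source-clock cycles,
  // so a 11 -> 00 transition (two bits changing) would be flagged.
  // A production checker may also need to exclude the first cycle after reset.
  property p_gray_encoded;
    @(posedge src_clk) disable iff (!src_rst_n)
      $countones(bus ^ $past(bus)) <= 1;
  endproperty

  a_gray_encoded: assert property (p_gray_encoded)
    else $error("Gray-encoding violation: more than one bit changed on the crossing bus");
endmodule
```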
Formal verification reports each property as a Pass, a Fail or a Partial proof. A passing assertion is good and doesn’t need to be debugged further. A failing assertion can be debugged by examining the counter-example in the waveform viewer. It is the partially proven properties that need further analysis and closure, since for these the formal engine has only proven the property up to a bounded depth, or has exhausted its effort, without reaching a conclusive result.
2. Design Intent IP Generation: Design intent is generated in the form of verifiable properties for the assumptions, constraints and waivers used during static verification, so that they can be checked during dynamic verification (a sketch of how such a generated checker might be attached to the RTL follows Figure 3).
The following example shows a gray-encoding check that failed in simulation, producing the waveform below:
Figure 3: Gray-encoding failure in simulation
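In such a flow, the generated design-intent properties are typically attached to the RTL without modifying it, so the same checkers can run in both formal and simulation. As a hypothetical sketch (instance, module and signal names are assumptions, not from the article), the gray-encoding checker shown earlier could be bound to the block that drives the crossing:

```systemverilog
// Hypothetical bind of the generated checker onto the RTL.
// "tx_fifo", "wr_clk", "wr_rst_n" and "wr_ptr_gray" are illustrative
// names only; a real flow would generate such bindings automatically.
bind tx_fifo gray_code_check #(.WIDTH(4)) u_gray_code_check (
  .src_clk   (wr_clk),
  .src_rst_n (wr_rst_n),
  .bus       (wr_ptr_gray)
);
```

A violation of this assertion in simulation would surface as a waveform failure of the kind illustrated in Figure 3.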
3. Coverage-Driven Verification Closure: In this step, the user runs simulation on the generated database and uses the resulting coverage report to sign off the verification. There are two approaches to verifying functional CDC properties:
Formal-driven verification coverage can be increased through a mix of incremental analysis, higher solver effort and a combination of different formal techniques. This approach has its own advantages: it is more exhaustive and has no dependency on the user’s testbench. However, it requires an upfront understanding of what the designer wants to achieve from the analysis, along with a plan to accommodate the required time and resources.
Simulation is not exhaustive, but it reduces the barrier to adoption. Moreover, it can work well at the SoC level.
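As a hypothetical illustration of the coverage side of this closure step (again, module and signal names are assumptions), cover properties can record whether simulation actually exercised the crossing that the earlier assertion protects:

```systemverilog
// Sketch of cover properties feeding a coverage report for CDC closure.
// Names are illustrative; a real flow would generate such covers
// alongside the assertions.
module cdc_crossing_cover #(parameter WIDTH = 4) (
  input logic             src_clk,
  input logic             src_rst_n,
  input logic [WIDTH-1:0] bus
);
  // The crossing bus made a legal single-bit (gray) transition.
  c_single_bit_change: cover property (
    @(posedge src_clk) disable iff (!src_rst_n)
      $countones(bus ^ $past(bus)) == 1);

  // The crossing bus held its value for a cycle (idle case).
  c_bus_stable: cover property (
    @(posedge src_clk) disable iff (!src_rst_n)
      bus == $past(bus));
endmodule
```

Covers that remain unhit point to scenarios the testbench never reached, and these gaps are natural candidates for the formal side of the hybrid flow.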
Hence, a hybrid approach of formal and simulation verification needs to be adopted for faster design verification closure. Doing this will help reduce the number of iterations required for complete design validation and help minimize the risk of costly silicon re-spins.
Sean O’Donohue is a principal application engineer at Synopsys, responsible for static and formal technologies. Prior to joining Synopsys, Sean was a corporate application engineer at Atrenta and a technical marketing lead at Springsoft. He holds a Bachelor’s in electrical engineering from Cal Poly Pomona.