Systems & Design
SPONSOR BLOG

Clock Domain Crossing Signoff Through Static-Formal-Simulation

A coverage-driven hybrid approach of static and dynamic verification for functional CDC verification closure.


By Sudeep Mondal and Sean O’Donohue

Clocking issues are one of the most common causes of costly design re-spins, and this has been the driving factor behind the ever-increasing demand for Clock Domain Crossing (CDC) analysis tools. Today, the majority of IP and SoC teams focus on "structural CDC" analysis, which is important but not sufficient. Structural CDC analysis ensures that the design has the structures (e.g., synchronizers) in place to manage the metastability effects caused by asynchronous clocks. However, structural CDC analysis is typically run under user-supplied constraints and assumptions, and these must themselves be validated for signoff, or silicon bugs may be missed. Furthermore, designs must also adhere to pre-defined (functional) CDC protocols to ensure safe data transfer across asynchronous clock domains. This so-called functional CDC verification therefore needs a multi-pronged approach that mixes static, formal, and simulation techniques.

Dynamic verification, or simulation, uses a testbench to functionally verify that the design performs to its specification by observing outputs in response to meaningful stimuli. Static verification, on the other hand, verifies the design through structural analysis, logical evaluation, and formal verification of its implicit and explicit properties. The comparative benefits and costs of the two approaches are captured in the table below:

For functional CDC verification closure, a coverage-driven hybrid approach of static and dynamic verification is recommended. This flow brings out the best of both static and dynamic verification and helps accelerate signoff. The approach is based on automatically generating verifiable properties for the assumptions, constraints, and waivers used during static verification so that they can be validated during dynamic verification.

CDC Signoff Verification Flow


Figure 1: Verification flow diagram

The flow is composed of three distinct steps:

1. Static Verification Methodology: The design assumptions are encapsulated in the form of constraints, and structural analysis is performed. A complete and thorough validation effort requires structural analysis followed by formal validation. This is required because structural analysis can only check that the expected structures exist; it cannot validate whether a structure will work correctly in all cases.

For example, in Figure 2, a control bus crossing clock domains needs to be gray-encoded to avoid coherency issues.


Figure 2: A bus crossing clock domains

In such a case, structural analysis can only detect the bus crossing and its surrounding logic; it cannot check whether the signals are actually gray-encoded. A gray-encoder can be built from many different circuits, and structurally identifying every variant is not feasible for any static tool. Even if a gray-encoder is successfully identified structurally, the circuit itself may contain a bug (for example, a faulty reset that can randomly force the output to 00, causing an illegal 11->00 transition in which two bits change at once).
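To make the property concrete: a correctly gray-encoded bus changes by at most one bit per transition, so a checker only has to confirm that consecutive sampled values differ in a single bit position. The following minimal Python sketch (an illustration only, not part of any particular tool flow) applies that rule to a recorded sequence of bus values and flags the 11->00 scenario described above:

```python
def is_single_bit_change(prev: int, curr: int) -> bool:
    """Gray encoding allows at most one bit to change per transition."""
    diff = prev ^ curr
    # A value with at most one bit set satisfies (diff & (diff - 1)) == 0.
    return (diff & (diff - 1)) == 0


def check_gray_encoding(trace):
    """Return the (index, prev, curr) transitions that violate gray encoding.

    `trace` is a sequence of sampled bus values (integers), e.g. taken from
    a simulation dump in the source-clock domain.
    """
    violations = []
    for i in range(1, len(trace)):
        if not is_single_bit_change(trace[i - 1], trace[i]):
            violations.append((i, trace[i - 1], trace[i]))
    return violations


# The faulty-reset scenario described above: a 0b11 -> 0b00 transition
# flips two bits at once and is reported as a violation.
print(check_gray_encoding([0b00, 0b01, 0b11, 0b00]))  # -> [(3, 3, 0)]
```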

Validating the gray-encoding circuit through simulation is only as good as the quality of the testbench. This is where formal validation plays a strong role: it can perform an exhaustive validation using formal techniques, keeping the validation process agnostic to the actual structure used in the design.

Formal verification can report a Pass, a Fail, or a Partial proof. A passing assertion is good and needs no further debug. A failing assertion can be debugged by examining the counter-example in the waveform viewer. It is the partially proven properties that need further analysis and closure.

2. Design Intent IP Generation: Design intent needs to be created for the following cases:

  • Assumptions – A database of all the assumptions (provided as user constraints) should be generated. Typical examples of such constraints are clock (the signal behaves like a clock), reset (the signal is used to reset the design), case analysis (the signal is static at a particular value), and quasi-static signals (used to define the behavior of configuration registers and other nets that are not expected to change frequently in functional mode). It is important to verify the correctness of these constraints in the simulation environment, since a break in their behavior can result in missed CDC bugs and potential silicon failure. For example, if a signal has been incorrectly specified as static at value 1 and it toggles to 0 in functional mode, then the static analysis results may be inaccurate. Similarly, if a signal has been specified as quasi-static but a change in it can impact a crossing, the constraint is invalid, and its use can lead to incorrect static analysis (see the constraint-check sketch after this list).
  • Waivers – Although the recommended methodology is to apply constraints, there can be unavoidable situations where the user is forced to apply a waiver instead. For example, if a signal goes into multiple blocks and only one particular block needs to be handled differently, then applying a waiver on the violation remains the only option. If such waivers are written with a reasoning (e.g., gray-encoded signals), then assertions can likewise be derived from them and validated in the simulation environment.
  • Protocols – Complex protocol checks, such as data loss in fast-to-slow crossings and data-enable sequencing checks, cannot be detected structurally. The problem needs to be modeled as a state machine and passed to the formal engines. However, formal verification may result in partial proofs, and it is imperative to verify these partially concluded properties through simulation in order to achieve faster closure (a data-loss sketch follows this list).
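To illustrate how a generated constraint check might look at simulation time, here is a small, purely illustrative Python sketch (the signal name mode_sel and the helper functions are invented for this example) that replays a trace of a signal declared static or quasi-static and reports any activity that contradicts the constraint:

```python
def validate_static_constraint(trace, expected_value):
    """Check a case-analysis-style constraint: the signal must hold
    `expected_value` for the whole functional window of the trace."""
    return [(t, v) for t, v in trace if v != expected_value]


def validate_quasi_static(trace, max_toggles=0):
    """Check a quasi-static constraint: the signal may settle once after
    configuration but should not keep toggling in functional mode.
    `trace` is a list of (time, value) samples."""
    toggles = [
        (t, prev_v, v)
        for (prev_t, prev_v), (t, v) in zip(trace, trace[1:])
        if v != prev_v
    ]
    return toggles if len(toggles) > max_toggles else []


# Example: mode_sel was constrained static at 1 but toggles during the run,
# so any structural CDC result that relied on that assumption is suspect.
mode_sel_trace = [(0, 1), (120, 1), (250, 0), (400, 1)]
print(validate_static_constraint(mode_sel_trace, expected_value=1))
print(validate_quasi_static(mode_sel_trace, max_toggles=1))
```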
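In the same spirit, the data-loss rule for a fast-to-slow crossing can be stated simply: every value launched in the fast source domain must remain stable long enough to be sampled at least once by the slow destination clock. The sketch below is a simplified, idealized model of that check (invented names, no metastability modeling), not a description of any tool's internal implementation:

```python
def check_data_loss(src_changes, dst_clk_edges, end_time):
    """Flag source-domain values that were never sampled by the slow clock.

    `src_changes`  : list of (time, value) pairs, one per change of the
                     source-domain data (fast clock domain).
    `dst_clk_edges`: sorted list of active destination-clock edge times.
    `end_time`     : end of the observation window.
    A value is lost if no destination edge falls inside its stable window.
    """
    lost = []
    for i, (t_start, value) in enumerate(src_changes):
        t_end = src_changes[i + 1][0] if i + 1 < len(src_changes) else end_time
        sampled = any(t_start <= edge < t_end for edge in dst_clk_edges)
        if not sampled:
            lost.append((t_start, value))
    return lost


# Fast data changing every 4 ns, slow clock sampling every 10 ns:
# the values launched at 4 ns and 12 ns are never captured.
src = [(0, 0xA), (4, 0xB), (8, 0xC), (12, 0xD), (16, 0xE)]
dst_edges = [0, 10, 20]
print(check_data_loss(src, dst_edges, end_time=24))  # -> [(4, 11), (12, 13)]
```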

The following example shows a gray-encoding check that failed in simulation, producing the waveform below:


Figure 3: Gray-encoding failure in simulation

3. Coverage-Driven Verification Closure: In this step, the user completes simulation on the generated database and uses the coverage report to sign off the verification. There are two approaches to verifying functional CDC properties:

  1. Formal-driven Verification
  2. Simulation-driven Verification

Formal-driven verification coverage can be increased through a mixture of incremental analysis, longer solver time, and a combination of different formal techniques. This approach has its own advantages: it is more exhaustive and has no dependency on the user testbench. However, it requires an upfront understanding of what the designer wants to achieve from the analysis and a plan to accommodate the required time and resources.

Simulation is not exhaustive, but it lowers the barrier to adoption and works well at the SoC level.
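Conceptually, the coverage-driven closure step boils down to merging the two result sets: a property counts toward signoff if it is either fully proven by formal or exercised without failure in simulation, and everything else stays on the open list. The bookkeeping below is only a schematic Python sketch of that merge, with invented property names and status labels:

```python
def merge_for_signoff(formal_results, sim_results):
    """Classify each functional CDC property for signoff.

    `formal_results`: dict mapping property name -> "proven", "failed",
                      or "partial" (inconclusive / bounded proof).
    `sim_results`   : dict mapping property name -> (times_exercised, failures).
    """
    closed, open_items = [], []
    for prop, formal_status in formal_results.items():
        exercised, failures = sim_results.get(prop, (0, 0))
        if formal_status == "failed" or failures > 0:
            open_items.append((prop, "fix design or constraint"))
        elif formal_status == "proven":
            closed.append(prop)
        elif exercised > 0:           # partial proof, but covered in simulation
            closed.append(prop)
        else:                         # partial proof and never exercised
            open_items.append((prop, "add stimulus or more formal effort"))
    return closed, open_items


formal = {"gray_enc_bus0": "proven",
          "data_loss_fifo1": "partial",
          "hs_seq_ctrl": "partial"}
sim = {"data_loss_fifo1": (42, 0)}    # exercised 42 times, no failures
print(merge_for_signoff(formal, sim))
```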

Hence, a hybrid approach of formal and simulation verification needs to be adopted for faster design verification closure. Doing this will help reduce the number of iterations required for complete design validation and help minimize the risk of costly silicon re-spins.

Sean O’Donohue is a principal application engineer at Synopsys, responsible for static and formal technologies. Prior to joining Synopsys, Sean was a corporate application engineer at Atrenta and a technical marketing lead at Springsoft. He holds a Bachelor’s in electrical engineering from Cal Poly Pomona.


