SPONSOR BLOG

Unknown Signoff

Unnecessary X propagation is costly, painful and can permit functional bugs to slip in.


In last month’s blog, Pranav Ashar, CTO at Real Intent, pointed out that the management of unknowns (X’s) in simulation has become a separate verification concern of signoff proportions. Modern power-management schemes affect how designs are reset and brought up. X management and reset analysis are interrelated: many of the X’s in simulation come from uninitialized flip-flops, and, conversely, the pitfalls of X’s in simulation compromise the ability to arrive at a clear understanding of the resettability of a design.

The SystemVerilog standard defines an X as an “unknown” value, used to represent a signal that simulation cannot resolve definitively to 1, 0, or Z. Synthesis, on the other hand, defines an X as a “don’t care,” enabling greater flexibility and optimization. Unfortunately, Verilog RTL simulation semantics often mask the propagation of an unknown value by converting it to a known value, while gate-level simulations show additional X’s that will not exist in real hardware. The result is that bugs are masked in RTL simulation, and while they do show up at the gate level, time-consuming iterations between simulation and synthesis are required to debug and resolve them. Resolving differences between gate-level and RTL simulation results is painful because synthesized logic is less familiar to the user, and X’s make correlation between the two harder. The verification engineer must first determine whether an X in gate-level simulation is genuine before figuring out whether there is a bug in the design. Unnecessary X-propagation thus proves costly, causes painful debug, and sometimes allows functional bugs to slip through to silicon.
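As a minimal illustration of this X-optimism, consider the hypothetical SystemVerilog fragment below. It is not from the paper; the module and signal names are invented for the example.

// Hypothetical example of RTL X-optimism: if 'mode' is X (say, it comes
// from an uninitialized flop), the if-condition evaluates as false in RTL
// simulation, so the else branch is taken and a "known" value is assigned.
// The unknown is silently converted to a known, and the bug can stay
// hidden until gate-level simulation.
module x_optimism_example (
  input  logic       clk,
  input  logic       mode,   // may be X after power-up
  input  logic [7:0] a, b,
  output logic [7:0] sel
);
  always_ff @(posedge clk) begin
    if (mode)
      sel <= a;              // skipped when 'mode' is X
    else
      sel <= b;              // taken when 'mode' is X, masking the unknown
  end
endmodule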

Continued increases in SoC integration and the interaction of blocks in various states of power management are exacerbating the X problem. In simulation, the X value is assigned to all memory elements by default. While hardware resets can be used to initialize registers to known values, resetting every flop or latch is not practical because of routing overhead. For synchronous resets, synthesis tools typically merge the reset term with data-path signals, thereby losing the distinction between X-free logic and X-prone logic. This in turn causes unwarranted X-propagation during the reset simulation phase. State-of-the-art low-power designs have additional sources of X’s, with the added complexity that these manifest dynamically rather than only during chip power-up.
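To make this concrete, here is a hypothetical sketch (again, not from the paper): the unreset register below powers up as X in simulation, while the synchronous reset on the second register is folded into its data-path logic by synthesis, so there is no longer a clean boundary between X-free and X-prone logic.

module reset_mix (
  input  logic       clk,
  input  logic       rst_n,   // synchronous, active-low reset
  input  logic [7:0] din,
  output logic [7:0] dout
);
  logic [7:0] stage1;

  // No reset on 'stage1': simulation initializes it to X, and it stays X
  // until real data is clocked in.
  always_ff @(posedge clk)
    stage1 <= din;

  // Synchronous reset: synthesis merges the reset condition into the data
  // path, so 'dout' can capture the X from 'stage1' as soon as reset is
  // released.
  always_ff @(posedge clk) begin
    if (!rst_n) dout <= '0;
    else        dout <= stage1;
  end
endmodule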

Lisa Piper of Real Intent presented on this topic at DVCon 2012, and in her paper she described a flow that mitigates X issues. The flow is reproduced here.

The paper describes a solution to the X-propagation problem that is part technology and part methodology. The flow brings together structural analysis, formal analysis, and simulation in a way that addresses these problems and scales. The figure above shows the use model for both the design engineer and the verification engineer. The solution is centered on static analysis for the design engineer and is primarily simulation-based for the verification engineer. The designer-centric flow is preventive in nature, while the verification flow is intended to identify and debug issues.

Lisa gave further background in a video interview at DVCon, where she talks about the various alternatives for handling X issues. You can see it here.

My guess is that we will see further developments in this new and separate signoff concern. Stay tuned.


