Is Formal Ready To Displace Simulation?

Without logic equivalence checking, Design Compiler might never have been the success that it was. Is the success of electronic system level (ESL) design dependent on sequential equivalence checking?


In part one of this roundtable, the panelists talked about the recent changes that have brought formal to the forefront of verification and discussed the challenges that UVM has brought to formal. In part two, the panel focused on the subject of coverage and the ways in which formal coverage can be combined with simulation. In this segment we start exploring the impact that sequential equivalence checking will have on design practices. Participating in the panel were Pete Hardee, director of product management for Incisive Formal at Cadence; Oz Levia, vice president of marketing and business development at Jasper Design Automation, which is to be acquired by Cadence; Pranav Ashar, CTO at Real Intent; Dave Kelf, marketing director at OneSpin Solutions; and Vigyan Singhal, CEO of Oski Technology. What follows are excerpts of that discussion.

SE: What progress has been made with sequential equivalence checking? Is this a necessary technology to have before ESL methodologies can take off?

Kelf: Yes, it is a necessary technology. High-level synthesis (HLS) performs a lot of sequential optimization, and equivalence checking has to keep pace with it. You need something that can handle registers being moved around.

Singhal: We are seeing sequential equivalence checking (SEC) being used to verify equivalence between C and RTL designs. These are mainly data-transformation types of designs, such as wireless baseband or video codecs. All of these have a reference C model. Sure, there are some restrictions, such as no unbounded loops, no dynamic memory allocation, etc., but we can do it.
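To make those restrictions concrete, here is a hedged sketch of the style of C reference model a SEC flow can typically accept: bounded loops, static allocation, bit-accurate integer arithmetic. The filter, its coefficients, and all names are invented for illustration and are not from the discussion.

```c
#include <stdint.h>

/* Illustrative reference model for a 4-tap FIR filter, written in the
 * SEC-friendly C subset described above: the loop bound is a
 * compile-time constant (no unbounded loops) and all storage is
 * static (no dynamic memory allocation).
 * Coefficients and names are hypothetical. */
#define TAPS 4
static const int16_t coeff[TAPS] = {1, 2, 2, 1};

int32_t fir_ref(const int16_t sample[TAPS])
{
    int32_t acc = 0;
    for (int i = 0; i < TAPS; i++)      /* statically unrollable loop */
        acc += (int32_t)coeff[i] * sample[i];
    return acc;
}
```

Because every loop is bounded and every object is statically sized, a tool can unroll such a model into pure combinational/sequential logic and compare it against the RTL for all inputs, rather than for sampled stimulus.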

Ashar: I don’t think you want to frame this as a reaching-for-the-moon type of problem. You have to divide these problems into smaller, easier parts in either space or time. The implementation of an SoC is a sequence of steps. SEC is applied to a single refinement step, and clock gating is an example of that. You have the before and after of that step, and you can easily check the transformation.

Levia: There are several types of problems here. The first is the C-to-RTL reduction, and it is the most complex of the problems. Then there is the optimization type, which is RTL to RTL. Even within each of these there are two or more varieties. There are automatic transformations, and then there are manual transformations or reductions. There could be the introduction of an asynchronous protocol, pipeline retiming, etc. There are so many small changes in the RTL that would give chills to the engineers if they had to go back and verify them using a verification suite based on simulation. SEC, with the appropriate mapping, can help them converge and convince themselves that the functionality of the design is identical. Because of this, we see an explosion in equivalence checking. People are driven there because of the efficiency.
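As a loose software analogy for the manual RTL-to-RTL rewrites just described (the functions below are invented for illustration, not real RTL), consider a reference datapath and a hand-optimized shift-add version of it. A SEC tool proves such a pair equivalent for every input; a simulation suite can only sample the input space, which is exactly the inefficiency being described.

```c
#include <stdint.h>

/* Hypothetical before/after pair for one optimization step:
 * scale_ref is the original datapath, scale_opt the hand-optimized
 * rewrite that replaces the multiply with shifts and an add.
 * An equivalence check would prove scale_ref(x) == scale_opt(x)
 * for all inputs x. */
uint32_t scale_ref(uint16_t x)
{
    return (uint32_t)x * 10u;
}

uint32_t scale_opt(uint16_t x)
{
    /* 10*x == 8*x + 2*x */
    return ((uint32_t)x << 3) + ((uint32_t)x << 1);
}
```

For a 16-bit input the equivalence can even be checked exhaustively in software, but SEC establishes it symbolically, so the proof effort does not blow up as the input width grows.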

Singhal: Yes, and that means they can be even more aggressive with the optimizations. Before, they may have been afraid to change things.

Ashar: A good example is how you initialize a circuit. A plain way to do it is to put an asynchronous reset on every flop. This can be checked structurally quite easily. But if you have a tool that can check a more sophisticated reset scheme, then you can save power and layout resources.

Levia: It is opening up horizons that people didn’t imagine. I am a little more skeptical when it comes to high-level synthesis. Designers will fall into two camps. They will either believe that HLS is good or, like with Design Compiler, they will not trust it to be correct. I don’t think SEC will fundamentally change how they see HLS.

Hardee: Even now, everyone still distrusts Design Compiler and uses equivalence checking.

Kelf: It is a problem-specific issue. With datapaths, lots of people are using HLS and trusting it. One issue is whether they can visualize the results. This was part of the issue with Design Compiler. As soon as an engineer got a good feeling about how what came out related to what went in, they started to work with synthesis.

Ashar: SEC enables HLS. But more important is the need to be able to take an assertion on the RTL and map it back to the C description. The same is true of coverage holes. You need to be able to map things back to a higher-level description.

SE: It is clear that the whole design will never be fed through HLS; instead it will be the integration of a large number of blocks. How do we verify these designs?

Hardee: What we haven’t talked about is connectivity, and if we look at SoC level connectivity, simulation is inefficient. Formal is very good at it. This is perhaps one of the most proliferated formal apps.

: The delivery of design IP has a problem, and that is quality. We are at the stage now where design IP can be completely verified formally. We need to get to the stage where vendors provide the IP along with a full formal proof that can be replayed at the customer site. Then, if I give you the rights to modify the RTL, the proofs can be replayed. So far, design IP companies have not invested enough in formal.

Levia: Taking formal up from unit to subsystem and up to system, connectivity is one type of application that exists everywhere, from the lowest mux connections to functional connections. Another application is register mapping. At almost every level of design there is a need to verify that memory or register accesses are done correctly. Architectural specification is another. If you have a spec for something that is 500 pages long, you need to find a way to formalize it, and then it is easy to start finding internal problems, conflicts, deadlocks, etc. Formal is a Swiss army knife. There is a lot of stuff you can do with it and almost every quarter, someone finds a new way to use this technology.

Ashar: Connectivity between blocks, and how registers are hooked up, is all about how you put things together rather than what’s inside. The low-hanging fruit here is to be able to infer all of the checking you have to do automatically. UVM can help here. The industry needs to think about how things such as protocols can be specified so that they can be mined to extract the verification obligations.

Kelf: It is interesting that each time we start by saying that the opportunities are where simulation has problems, and connectivity is one of those. As we get more blocks being interconnected, and the blocks are bounded in terms of size, then maybe we can apply formal more effectively than simulation for a greater percentage of the problem and, over time, simulation will start to get pushed out. That is the future.

Singhal: Formal has come a long way, but the methodology needs to be there, and you have to pay attention to the organization you build inside the company, the pyramid of experts, and make sure you give them the right training, methodology, and tools. It is exactly the same as racing a car: you can give drivers the best technology, but you have to train them to be able to use it well.