Does Formal Have You Covered?

While most verification standards were not defined with formal in mind, they provide a wealth of data that can be mined. Questions still remain about when you have done enough verification.


In part one of this roundtable, the panelists talked about the recent changes that have brought formal to the forefront of verification and discussed the challenges that UVM has brought to formal. In this segment we start exploring those difficulties in more detail and the progress made with integrated coverage. Participating in the panel were Pete Hardee, director of product management for Incisive Formal at Cadence; Oz Levia, vice president of marketing and business development at Jasper Design Automation, which is to be acquired by Cadence; Pranav Ashar, CTO at Real Intent; Dave Kelf, marketing director at OneSpin Solutions; and Vigyan Singhal, CEO of Oski Technology. What follows are excerpts of that discussion.


SE: Let us continue talking about assertion generation. Where do assertions come from?

Ashar: We talked about assertions being generated automatically for CDC, power and timing constraints. There is collateral information available in things other than the RTL. For example, UPF can be mined for information used to create assertions. UVM adds testbench collateral that you can mine to generate functional checks.

Hardee: But was UVM built with that in mind?

Ashar: No, but neither SDC nor UPF was built with that purpose in mind.

Levia: UPF was built specifically for the specification of power domains, IP-XACT was built with the idea of defining interfaces, and SystemVerilog was built with simulation in mind. They didn't even have synthesis in mind. There are types of properties that are not even expressible in SystemVerilog. But is this hindering the development of formal? No. People find ways around it. There are ways to extract assertions from all of these forms. UVM is not a hindrance, but it is not helping either.
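As an illustration of the kind of check Ashar and Levia describe being mined from non-RTL collateral, here is a minimal sketch of an assertion that a tool might derive from a UPF isolation strategy. The domain and signal names are hypothetical, and real tools will differ in what they actually emit.

```systemverilog
// Hypothetical UPF collateral for a camera power domain:
//   create_power_domain PD_CAM
//   set_isolation iso_cam -domain PD_CAM -isolation_signal iso_en_cam -clamp_value 0
//
// A check mined from that strategy: whenever PD_CAM is powered down,
// its isolation enable must be asserted.
module upf_iso_check (
  input logic clk,
  input logic rst_n,
  input logic pd_cam_pwr_on,  // power-switch acknowledge for PD_CAM (hypothetical)
  input logic iso_en_cam      // isolation enable named in the UPF strategy
);
  a_iso_when_off: assert property (@(posedge clk) disable iff (!rst_n)
    !pd_cam_pwr_on |-> iso_en_cam);
endmodule
```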

SE: Where are we with coverage and specifically formal coverage?

Levia: Coverage is a little thornier. We have a standard, but the support that the large simulation vendors give to it is parochial. This is unfortunate. Thankfully, there are other ways to integrate coverage results. We can talk about coverage in several different ways. One way is telling the user about the quality of the testbench: how much of the state space and the cones of influence are covered. This also applies to constraints, and there are ways we can show the user where he may have to take a closer look. Then there is coverage analysis based on full proofs. Here, formal is far more comprehensive than simulation. With a full proof you can say something definitive about the coverage of the state space. What is perhaps more interesting is how you apply information that comes from bounded proofs. There are ways we can show the user the difference between the relative bound and the theoretical bound. This can be used to help the user break a big problem into several smaller ones that can converge and contribute to coverage. Lastly, we have to work out how to integrate this with the coverage information users have coming from other sources, other companies and other tools.

Ashar: The formal community has been too focused on proving properties, and in doing so it has failed to realize that there is so much information in the intermediate results that can be fed back to the user to get a better handle on coverage. The information generated from bounded results is very important, and by not exploiting it we are missing out on some low-hanging fruit.

Kelf: So far we have really been talking about control coverage. Have we reached the right parts of the design? Very important, but we also have to ask, have enough assertions been written to cover the design properly? If a problem exists in the design, will it be discovered by the assertions? We need to add this type of coverage on top of everything else.

Hardee: At the end of the day we still have an education issue. The manager wants to know, ‘Have we done enough verification to know if my chip is going to work or not?’ Most people understand that if you complete a proof, it is absolute. But the verification managers don’t understand things like bounded results. They want data presented to them in a form that they can understand. What is my contribution from formal, what is my contribution from simulation; now what is my coverage on the chip and do I believe that we are good to go?

Kelf: That addresses the fundamental issue: how do we make formal for the masses?

Singhal: A couple of years ago it was meaningless to talk about the integration of formal coverage. Today, formal coverage has been defined and implemented in the tools so that it means exactly the same thing as coverage in simulation. When coverage shows that you are 50% done, it means the same thing in either domain.
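To make Singhal's point concrete, here is a minimal sketch, with hypothetical signal names, of a coverage target that carries the same meaning in both flows: a simulator marks the cover property as hit when a test exercises it, while a formal tool can prove the same property reachable or unreachable on the same RTL.

```systemverilog
module fifo_cov (
  input logic clk,
  input logic rst_n,
  input logic push,
  input logic pop,
  input logic full,
  input logic empty
);
  // Hit by a test in simulation, or proven reachable (or unreachable) by a
  // formal engine on the same RTL; "covered" means the same thing either way.
  c_push_when_full: cover property (@(posedge clk) disable iff (!rst_n)
    push && full);
  c_push_and_pop:   cover property (@(posedge clk) disable iff (!rst_n)
    push && pop && !empty);
endmodule
```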

SE: If I were to buy coverage tools from all of you, can I integrate your coverage results today?

Ashar: Coverage needs to be defined in the context of the problem you are solving. In an earlier talk we heard about how a user had finished simulation and then found another twenty bugs using a CDC tool.

Hardee: Measuring the bugs found is very important and is a great way of justifying the ROI of formal. But that isn't the verification manager's problem. His problem is the bugs left in the design.

Levia: People have a double standard. Nobody questions simulation coverage and its completeness, but when they talk about formal the first question on their mind is: how do you know you are complete? With simulation coverage you have no idea what it means, how far it reached, what state space it covered or what the actual coverage is. What matters is not so much that we as vendors coalesce our coverage, but that we bring our coverage information together for the user.

Hardee: We are supporting UCIS as far as it goes, but it doesn’t go far enough to be able to define the coverage metrics and assurances that verification managers are looking for. It also doesn’t go far enough to enable us to present a multiple vendor solution.

Kelf: I agree with that. UCIS needs a lot more work to be all-encompassing. UCIS was not designed for formal, but we can layer some things on top of it.

Singhal: The data that come from different vendors all have slight differences. It is difficult to get them to agree.

Levia: Are we asking for perfection? What the user wants are the leading simulation providers and the leading formal providers to give them a unified view of coverage targets and results. It is happening.

Ashar: I go back to my earlier statements. Simulation always provides you with some data. With static tools, the user can spend a lot of time specifying the problem and sometimes get nothing back. This is a gap between static and dynamic methods that needs to be fixed. Getting coverage data out of bounded results, and information about constraints, is important. The user should be given something actionable to do. If formal has said something is unreachable, you should not try to reach it in simulation. We need similar things for other situations.
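As one concrete example of the actionable hand-off Ashar mentions, a target that formal has proven unreachable can be excluded from the simulation metric instead of being left as a hole the testbench can never close. A minimal sketch, with hypothetical state names:

```systemverilog
module fsm_cov;
  typedef enum logic [2:0] {IDLE, LOAD, EXEC, DONE, ERR_LEGACY} state_e;

  logic   clk;
  state_e state;

  covergroup fsm_cg @(posedge clk);
    cp_state: coverpoint state {
      bins reachable[] = {IDLE, LOAD, EXEC, DONE};
      // ERR_LEGACY was proven unreachable by formal analysis; excluding it
      // keeps the merged coverage score honest instead of chasing it forever.
      ignore_bins formally_unreachable = {ERR_LEGACY};
    }
  endgroup

  fsm_cg cg = new();
endmodule
```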

Hardee: This brings us back full circle. We have talked about improvements in the things we can prove, in terms of the algorithms we are applying and the abstractions that can be made, and about educating the formal community on the types of problems that formal will complete on.

In the final part of this roundtable we will be looking at sequential equivalence checking.


