Formal Is Set To Overtake Simulation

Experts at the Table, part one: Formal verification used to lurk in the background, but no more. Experts from formal companies are ready to take on simulation and to pave the way forward.


There has been a significant psychological change in the area of formal verification over the past couple of years. It’s no longer considered a fringe technology, and it’s no longer considered difficult to use. In fact, it has become a necessary part of the verification process.

Semiconductor Engineering sat down with a panel of experts to find out what caused this change and what more we can expect from formal verification technologies in the future. Participating in the panel were Pete Hardee, director of product management for Incisive Formal at Cadence; Oz Levia, vice president of marketing and business development at Jasper Design Automation; Pranav Ashar, CTO at Real Intent; Dave Kelf, marketing director at OneSpin Solutions; and Vigyan Singhal, CEO of Oski Technology. What follows are excerpts of that discussion.


SE: What was the catalyst for this change?

Singhal: Capacity and methodology are the enablers. We now have tools that can scale, and people have learned how to apply formal by breaking up the problem. The community has come up with the necessary tips and tricks, and sign-off has finally become real in the past few years.

Hardee: In addition, we think that verification apps have been key. Apps enable you to solve verification problems for the masses rather than for a few experts. A large barrier to the introduction of formal technology has been assertions. Verification people, and designers in particular, either do not know how to write assertions or do not like doing so. Apps automate the creation of the assertions needed for a particular verification problem.

Kelf: We can consider what we have to be the second generation of formal. The first generation was the unwieldy, somewhat academic technology, and then we saw this rapid shift toward application-focused methodologies. Companies like Real Intent blazed the trail with Clock Domain Crossing (CDC) solutions, and this is what I call the second generation. The question now is how we move to the third generation, where hardcore assertions are required.

Ashar: Capacity has certainly improved, and in a way that is far beyond the increase in speed of computers. The second change is that people have learned where to apply formal and to use it where it can make a real difference. The best places are where the scope is narrow and there is a full understanding of the failure modes.

Kelf: And there are places where simulation just can’t scale.

Ashar: Yes, CDC is a great example of that. The failure mode is a confluence of functionality and timing, and when these two things happen asynchronously, simulation falls apart. There are an increasing number of areas like this.
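
Ashar’s CDC point is concrete enough to sketch. The fragment below is illustrative only (the module and signal names are hypothetical, not from the panel): RTL simulation has no notion of metastability, so an unsynchronized crossing simulates cleanly, while structural CDC analysis flags it; the two-flop synchronizer is the standard fix.

```systemverilog
// Illustrative sketch: the classic CDC hazard that simulation rarely
// exposes. "clk_a"/"clk_b" and the module name are hypothetical.
module cdc_example (
  input  logic clk_a,        // source domain clock (shown for context)
  input  logic clk_b,        // destination domain clock
  input  logic rst_n,
  input  logic data_a,       // generated in the clk_a domain
  output logic data_b_sync   // consumed in the clk_b domain
);
  // Unsafe: sampling an asynchronous signal directly can go metastable
  // in silicon, yet it simulates cleanly. CDC tools flag this structurally.
  //   always_ff @(posedge clk_b) data_b_sync <= data_a;

  // Safe: a two-flop synchronizer confines metastability to the first stage.
  logic meta;
  always_ff @(posedge clk_b or negedge rst_n) begin
    if (!rst_n) {data_b_sync, meta} <= '0;
    else        {data_b_sync, meta} <= {meta, data_a};
  end
endmodule
```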

Levia: In addition to capacity, there has been a lot of algorithmic improvement. There is no such thing as one formal engine; we have over a dozen different proof engines, and this enables us to scale from small, esoteric problems to large, complex ones. When we look at the problems that formal can address, we see that the scope is almost unlimited. It is not just that simulation doesn’t scale. There are problems that simulation cannot solve: dynamic power verification, security, and sequential equivalence checking, to name just a few. For these, formal is not only appropriate but required. Formal has moved from a ‘nice to have’ to a ‘must have.’ The other thing that has changed is that success begets success. Formal has a high ROI, and that affects the business model. The high selling price associated with that attracts the attention of the decision makers, which ensures formal proliferates throughout an organization and that success is achieved. We predict that formal will eventually become the dominant verification approach, with the assistance of simulation.

SE: Early successes in areas such as CDC came in part because you didn’t have to write assertions, but to get to higher levels of verification, assertions are necessary. How do we overcome this barrier?

Singhal: Assertion languages are just SystemVerilog. You can start with something very basic and graduate slowly. You can also write reference models, scoreboards, and full end-to-end proofs for things such as bus controllers, processors and video decoders. Once you have learned the basics, you see increasing amounts of success, and this makes you willing to invest more.
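
As a rough illustration of Singhal’s “start with something very basic” point, here is a minimal SystemVerilog Assertion on a hypothetical req/gnt handshake. The signal names and the four-cycle bound are assumptions for the sketch, not anything the panel specified.

```systemverilog
// A minimal sketch of a first assertion: every request must be granted
// within 1 to 4 cycles. req/gnt and the bound are hypothetical.
module handshake_checks (input logic clk, rst_n, req, gnt);
  property p_req_gets_gnt;
    @(posedge clk) disable iff (!rst_n)
      req |-> ##[1:4] gnt;   // implication: if req, then gnt within 4 cycles
  endproperty

  a_req_gets_gnt: assert property (p_req_gets_gnt)
    else $error("req was not granted within 4 cycles");
endmodule
```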

Ashar: Assertion languages are not just about writing checkers, but also constraints. There are three things that need to be addressed. The first is specification: by narrowing the scope of the problem, the checking becomes implicit in the understanding of the failure modes, so you don’t need assertions for the checkers, only for the constraints. Second, limiting the scope allows us to control the complexity of the problem. And third, it enables us to debug in an informed manner that is actionable.
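
Ashar’s distinction between checkers and constraints maps directly onto SVA’s assert and assume constructs. A minimal sketch, with hypothetical signal names: the assume constrains what the formal environment may drive, while the assert is the checker proved under those constraints.

```systemverilog
// Illustrative only: the same SVA syntax serves both roles.
// valid/ready/opcode and the legal-opcode set are hypothetical.
module proof_env (
  input logic       clk, rst_n, valid, ready,
  input logic [3:0] opcode
);
  // Constraint: the formal environment only drives legal opcodes.
  asm_legal_op: assume property (@(posedge clk) disable iff (!rst_n)
    valid |-> opcode inside {4'h0, 4'h1, 4'h2});

  // Checker: under those constraints, valid data is accepted within
  // two cycles.
  ast_accepted: assert property (@(posedge clk) disable iff (!rst_n)
    valid |-> ##[0:2] ready);
endmodule
```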

Levia: We see assertions coming from multiple sources. Perhaps the most important one today is verification IP. These often come with checkers and constraints, and now we are seeing proof packages for common protocols. These alleviate the need for people to recreate them. The next source is apps that generate their own assertions from higher-level specifications; examples are connectivity checking and X-propagation. Then we can mine simulation data for assertions. Finally, there are the assertions that people write themselves. We no longer see assertions as the limitation.

Ashar: You have to ensure that you do not generate too many low-quality assertions, which can be a danger from mining simulation data. This danger can be avoided by knowing the goal upfront.

Kelf: When does assertion writing go mainstream? The things that Oz talked about help because you can look at these and learn from them. But we also believe that we need better tools to help write assertions.

Hardee: We are talking about formal as if it is in competition with simulation, and this is not really the case. They complement each other. What people are trying to do is achieve verification closure based on a verification plan. So measuring contribution is important. Consider coverage unreachability. You can simulate to a certain point and then analyze the points that remain to be covered to see if they are reachable. This provides a huge ROI as it targets the places where directed tests would be hard to write. Register map validation is another example where assertions can be generated from other specs, in this case from IP-XACT.
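
Hardee’s coverage-unreachability flow can be sketched with a cover property. In this hypothetical fragment, a leftover coverage target is posed to a formal tool as a cover; if the tool proves it unreachable, the point can be waived from the closure metric rather than chased with directed tests. The fifo_full/wr_en names are assumptions for the sketch.

```systemverilog
// A hedged sketch of the unreachability idea: pose a leftover coverage
// target as a formal cover property. Names are hypothetical.
module unreachability_example (input logic clk, rst_n, fifo_full, wr_en);
  // Can the FIFO ever be written while full? If formal returns
  // "unreachable," no simulation test needs to target this corner.
  cov_wr_when_full: cover property (
    @(posedge clk) disable iff (!rst_n) fifo_full && wr_en);
endmodule
```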

Singhal: You have to pay attention to the entire pyramid. Not everyone has to be able to write assertions. Many people use apps that do not require assertions, but some people will be experts in assertions, and they are needed because without them the bottom layer will falter. In the past, without these people, only the easiest assertions got written, and because of this teams did not see the ROI.

Ashar: Simulation is a fallback option in any system. The more you understand a problem, the more you can apply a formal analysis. As the design steps become more understood, there are more areas that are amenable to formal solutions.

Kelf: But there is an issue between formal and simulation. If we look at the standards being created today, using assertions in UVM is a nightmare. UVM was created without formal in mind. On top of that we have UCIS, which was intended to bring coverage information together. It, too, is simulation-centric. A realization by the industry and the standards bodies that formal has a place would be useful.

Singhal: UVM is there, we have to fit into a simulation world, and it can handle many of the things that we need to do.

In part 2 of this Experts at the Table series, panelists dissect the problems with the standards and the areas in which they need to be updated.


