Formal’s Roadmap

Experts at the Table, part 1: Panelists look at the progress made by formal in the past five years and where we can expect to see new capabilities emerge.


Formal verification has come a long way in the past five years as it has focused on narrow tasks within the verification flow. Semiconductor Engineering sat down to discuss that progress, and the future of formal technologies, with Raik Brinkmann, president and CEO of OneSpin Solutions; Harry Foster, chief verification scientist at Mentor Graphics; Pete Hardee, product management director at Cadence; Manish Pandey, chief architect for new technologies at Synopsys; Anupam Bakshi, CEO of Agnisys; and Vigyan Singhal, president and CEO at Oski Technology. What follows are excerpts of that conversation.


SE: Formal is no longer considered a niche technology. How have the advances in formal been realized by semiconductor companies, and what impact has it had on their verification methodologies?

Singhal: I don’t think formal can be considered a niche technology, and it hasn’t been for quite a while now. The important thing is to understand what you want to do with formal verification. Are you trying to solve local problems, which include equivalence checking, clock domain crossing (CDC) checking, apps, assertions, and block-level verification? Depending upon what you want to do, it may be a niche technology or not. There are plenty of problems where there is well-adopted technology. For many, formal is now part of the sign-off process. However, there are some problems, such as system verification, that are still addressed with simulation.
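
As a rough illustration of what block-level formal property checking buys over simulation, here is a toy sketch in Python. The two-bit counter and all names are invented, and real engines use SAT/BDD technology rather than explicit enumeration, but the payoff is the same: a proof over every reachable state rather than a sample of them.

```python
# Illustrative only: a toy explicit-state "property checker" for a
# hypothetical 2-bit counter. Real formal engines use SAT/BDD methods,
# but the result is the same kind: a proof over all reachable states.

def next_states(state):
    """Transition relation of the toy counter (wraps from 3 back to 0)."""
    return {(state + 1) % 4}

def check(initial, prop):
    """Exhaustive reachability: prove prop() holds in every reachable state."""
    seen, frontier = set(), [initial]
    while frontier:
        s = frontier.pop()
        if s in seen:
            continue
        seen.add(s)
        if not prop(s):
            return s                      # counterexample state
        frontier.extend(next_states(s))
    return None                           # proven: no reachable violation

print(check(0, lambda s: s <= 3))         # None -- property proven, not sampled
print(check(0, lambda s: s != 2))         # 2    -- formal finds the violation
```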

Hardee: Formal has crossed the chasm and is well beyond niche at this stage. Our formal products are in use by 17 of the top 20 semiconductor companies. You don’t get there with it still being a niche technology. It has grown into something that is a “must have” for verification. The only remaining question is how much of the total verification task it can cope with. You still need different verification engines for different parts of the problem. If you are not using formal somewhere in at least IP or sub-system verification, then you are behind.

Pandey: Formal is no longer a niche. Every serious chip design house is using it. Equivalence checking was one of the first formal techniques, but we don’t call it formal anymore because it is successful. If you look at formal property verification, the methodologies for the way it is used are becoming more standardized, and there are formal IPs. One area where formal is not fully appreciated: even for problems such as connectivity, property checking, and low power, formal technologies are being used under the hood to improve the quality of results. That is sneaking in slowly and steadily. Push-button techniques are being applied, and people are very successful with them. In many cases, they don’t even know they are using formal.

Foster: One remarkable thing we have seen (ignoring equivalence checking and automatic apps that focus on certain problems) is that between 2014 and 2016 there was huge growth in formal property checking. Before 2014 it had been relatively flat. It grew about 26%. Roughly 33% of all designs now use formal property checking. If you go to more advanced designs, those over 80 million gates, you are at 45% to 50% penetration. If you partition it even further, the larger the design, the more adoption we see. What is driving this in certain domains is safety and security. 45% of all designs fall under the category of safety in some way, such as having to conform to ISO 26262, DO-254 or others. More importantly, 67% of all designs are implementing some kind of safety feature. Formal is a great technology for these classes of designs.


Bakshi: We use formal. I don’t know if it is a niche or not, but it is growing in significance. We see it being used more for connectivity and for register verification. We want to see more happening. Simulation-based verification is still needed, and you cannot get away from UVM. You can’t just press a button and have everything done with formal.

Brinkmann: It depends on the customers. In some areas, it has become deeply embedded. We have some automotive customers where it is an integral part of their IP development process and is relied upon completely for some of their IP blocks. They do complete proofs, and then they are sure that what they are developing is working in the context of all environments and different settings. In other places, it is more of a technology to find bugs.

SE: We usually don’t include equivalence checking when we talk about formal, and it is getting to be that we don’t include CDC or X-prop or some others. It would appear that a technique is given a name when it becomes a solved problem and is no longer put in the formal bucket.

Foster: Yes, it is partly a marketing thing. When you say formal, people’s eyes get glassy, so we want to hide the formal name. But quite often there is a formal engine under the hood.

Hardee: This is true for logical equivalence checking, but sequential equivalence checking is more of a property-based technology. It is probably true of CDC as well. Less so for X-prop. There is a distinctly different benefit of formal versus simulation there. In formal, if you do X-prop properly, you don’t have the pessimism or optimism problem. You can model it correctly and check it thoroughly in the RTL, and the only problem is scalability. Can I verify reset behavior for a whole SoC? Maybe that is stretching it a little, but the combination of formal and simulation can solve that problem. Fundamentally, X-prop with formal does not have the issues that exist in simulation.
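
To illustrate the optimism/pessimism point, here is a made-up example (not any tool’s actual semantics): a 2:1 mux whose select line is unknown. Simulation-style pessimism forces the output to X even when both inputs agree; a formal-style case split on the unknown value gives the exact answer.

```python
# Illustration of the X-propagation point above (invented example).
# A 2:1 mux with an unknown ('x') select line:
#  - pessimistic simulation semantics: an unknown select poisons the output;
#  - a formal check case-splits the unknown, so the output is known to be 1
#    whenever both data inputs are 1, regardless of the select value.

X = 'x'  # unknown value

def mux_sim_pessimistic(sel, a, b):
    # Simulation-style: unknown select -> unknown output, always.
    if sel == X:
        return X
    return a if sel else b

def mux_formal(sel, a, b):
    # Formal-style: enumerate both values the unknown select could take.
    if sel == X:
        results = {mux_formal(0, a, b), mux_formal(1, a, b)}
        return results.pop() if len(results) == 1 else X
    return a if sel else b

print(mux_sim_pessimistic(X, 1, 1))  # 'x' -- a false alarm (pessimism)
print(mux_formal(X, 1, 1))           # 1   -- exact: both branches agree
print(mux_formal(X, 1, 0))           # 'x' -- genuinely unknown, correctly so
```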

SE: What emerging formal technologies are almost ready to have a name applied to them?

Singhal: System-level verification. We are doing architectural formal verification these days. We are looking at safety and security and things like cache coherence. A lot of people are interested in proving the absence of system-level deadlocks, or how systems behave when context switching. These are hard and large problems that formal has traditionally stayed away from. Such a big system normally cannot be taken in whole, not only by the formal tools of today but by those of tomorrow. So you have to go back and decompose the problem and apply architectural verification in a way that you can solve it in a piecemeal fashion, then stitch the proofs together to get to the system-level requirements.
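
As a sketch of what “proving the absence of deadlock” means on one small piece, consider the invented two-agent handshake below. Real architectural proofs decompose a full system as described above; here the whole model is small enough to explore outright, and a deadlock is simply a reachable state with no way out.

```python
# Illustrative deadlock check on a made-up two-agent handshake.
# Each agent moves 'idle' -> 'req' -> 'idle', but a request is only
# served while the other agent is idle, so both requesting at once sticks.

def transitions(state):
    """Successor states of the toy protocol."""
    a, b = state
    succs = set()
    if a == 'idle':
        succs.add(('req', b))
    if a == 'req' and b == 'idle':
        succs.add(('idle', b))            # a's request served while b is idle
    if b == 'idle':
        succs.add((a, 'req'))
    if b == 'req' and a == 'idle':
        succs.add((a, 'idle'))
    return succs

def find_deadlock(initial):
    """A deadlock is any reachable state with no outgoing transition."""
    seen, frontier = set(), [initial]
    while frontier:
        s = frontier.pop()
        if s in seen:
            continue
        seen.add(s)
        succs = transitions(s)
        if not succs:
            return s                      # counterexample: a stuck state
        frontier.extend(succs)
    return None                           # proven deadlock-free

print(find_deadlock(('idle', 'idle')))    # ('req', 'req') -- both wait forever
```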

Foster: I am seeing a maturing of the industry where it is no longer people just buying a tool. They are being systematic about the way it is put in place, with dedicated resources, test plans, and the whole process aspect that was ignored in the past. This is why we are seeing rapid adoption, beyond the apps, of pure formal property checking.

Pandey: There are companies that now mandate that every block has to be formally verified. They have dedicated engineers for formal. Without formal sign-off, verification is not considered complete. A lot of methodologies have to be developed to define how things should be verified at the system level. Take something like connectivity: it sounds like a simple problem, but on a design with a few hundred million gates it is not trivial. It takes formal combined with efficient design representations and heuristics. Using formal, we can prune the problem.
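
At its core, connectivity checking reduces to proving that each specified source-to-destination path exists (often under enabling conditions) in an enormous netlist; pruning the search to each destination’s fan-in cone is one of the heuristics that keeps it tractable. A minimal structural sketch, with an entirely made-up netlist and signal names:

```python
# Illustrative connectivity check over a made-up flattened netlist.
# Real tools prove connections under enabling conditions and prune to each
# destination's fan-in cone; this sketch just walks the wiring graph.

netlist = {                               # driver -> the loads it fans out to
    'pad_reset': {'sync0.d'},
    'sync0.d':   {'sync0.q'},             # through a synchronizer stage
    'sync0.q':   {'core0.rst_n', 'core1.rst_n'},
    'dbg_reset': set(),                   # hypothetical dangling source
}

spec = [                                  # (source, expected destination)
    ('pad_reset', 'core0.rst_n'),
    ('pad_reset', 'core1.rst_n'),
    ('dbg_reset', 'core1.rst_n'),         # a wiring bug: never connected
]

def connected(src, dst):
    """Is there a structural path from src to dst in the netlist?"""
    frontier, seen = [src], set()
    while frontier:
        node = frontier.pop()
        if node == dst:
            return True
        if node in seen:
            continue
        seen.add(node)
        frontier.extend(netlist.get(node, ()))
    return False

for src, dst in spec:
    print(f"{src} -> {dst}: {'PASS' if connected(src, dst) else 'FAIL'}")
```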

Brinkmann: In the area of safety, there is still a lot of work to be done with formal. Fault propagation analysis is another area, and you will hear a lot about this in the future: the ability to identify faults that cannot propagate, and to prove that, so that you can get your diagnostic coverage up to the required levels. In functional-safety verification you have to meet certain coverage criteria, and formal is the only way to prove the absence of some things, so you get a lot of benefit.

Hardee: It is like we are going from one extreme to the other. Connectivity is a task you use formal for because it doesn’t make sense to do it in simulation. At the other extreme, you can use formal for things that are extremely difficult to do in simulation. To classify faults correctly according to ISO 26262, you need to show absolutely that a fault can never propagate to an output, or alternatively that it is always caught by the checkers. That is extremely difficult to do in simulation. The best you can do with simulation is show a test case in which the fault propagates to an output, or flag that you don’t yet have a test case or checker that can pick it up. Formal can answer those questions more absolutely.
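
A tiny worked example of that classification (the circuit and fault list are invented): because the block is small, exhaustive input enumeration can stand in for a formal proof and decide absolutely whether a stuck-at fault is ever observable at the output.

```python
# Illustrative fault classification on a made-up combinational block.
# Exhaustive enumeration plays the role of the formal proof here; on real
# designs, that exhaustiveness is exactly what the formal engine provides.

from itertools import product

def good(a, b, c):
    return a & b                          # note: input c is masked by design

def faulty(a, b, c, signal, stuck):
    vals = {'a': a, 'b': b, 'c': c}
    vals[signal] = stuck                  # inject the stuck-at fault
    return good(vals['a'], vals['b'], vals['c'])

faults = [('a', 0), ('b', 1), ('c', 1)]   # (signal, stuck-at value)

for signal, stuck in faults:
    # Can ANY input combination make the fault visible at the output?
    observable = any(
        good(a, b, c) != faulty(a, b, c, signal, stuck)
        for a, b, c in product((0, 1), repeat=3)
    )
    # Proven-safe faults can be excluded from the diagnostic-coverage target.
    print((signal, stuck),
          'dangerous' if observable else 'safe: can never propagate')
```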



