The New Face Of Formal

Experts at the table, part 1: What has caused the increase in the adoption rate of formal technology, and will that continue?


Semiconductor Engineering sat down to discuss the recent growth in adoption of formal technologies and tools with Lawrence Loh, product engineering group director at Cadence; Praveen Tiwari, senior manager of R&D in the verification group at Synopsys; Harry Foster, chief scientist at Mentor Graphics; Normando Montecillo, associate technical director at Broadcom; and Pranav Ashar, chief technology officer at Real Intent. What follows are excerpts from that conversation.

SE: Formal verification has seen increasing adoption over the past couple of years. Is this the peak, or is there more to come?

Ashar: Research in formal has been happening for decades; the underlying ideas were first proposed in the ’50s. Over the past 10 years, the processes in the design and implementation of complex SoCs have stabilized and become better understood. As a result, the places in the process where mistakes are made have become crystallized. This gave formal techniques something to focus on, and it turns out that the places where failures are commonplace and catastrophic are also amenable to formal techniques. They are narrowly scoped and the failure mode is well defined, and this has enabled formal to become anchored. A great example is clock domain crossing (CDC). An SoC is a sea of interfaces, and a typical interface is asynchronous, which creates a layer of complexity. But that complexity does not grow with the size of the SoC.
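To make the CDC example concrete, here is a minimal sketch of the structure such tools target (all signal and module names are hypothetical): a single-bit signal crossing into a destination clock domain through a two-flop synchronizer, plus the kind of stability check a CDC tool can prove formally.

```systemverilog
// Hypothetical two-flop synchronizer: the canonical structure a CDC
// tool looks for on single-bit crossings into the clk_b domain.
module sync_2ff (
  input  logic clk_b,    // destination clock domain
  input  logic rst_b_n,  // destination-domain reset, active low
  input  logic d_a,      // signal launched from the source clock domain
  output logic q_b       // synchronized copy, safe to use in clk_b
);
  logic meta;            // first stage; may go metastable

  always_ff @(posedge clk_b or negedge rst_b_n) begin
    if (!rst_b_n) {q_b, meta} <= 2'b00;
    else          {q_b, meta} <= {meta, d_a};
  end

  // Protocol obligation a CDC tool can check formally: the source must
  // hold its value for more than one destination-clock cycle, or a
  // pulse may be lost. Expressed here as an SVA assertion on the input.
  a_hold: assert property (@(posedge clk_b) disable iff (!rst_b_n)
                           !$stable(d_a) |=> $stable(d_a));
endmodule
```

Note how this matches the point above: the check is narrowly scoped, its failure mode (a lost or metastable value) is well defined, and neither depends on how large the surrounding SoC is.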


Foster: Formal applications are the second fastest growing segment of EDA. Emulation is the fastest. Formal is seeing a 27% increase in adoption. There are two primary types of formal. The first is formal property checking, where the user has the burden of sitting down and writing the assertions. The other is narrowly focused problems, where we can turn the technology onto the problem and relieve that burden on the user. The second type is the piece that has changed and is causing the growth. Pure formal property checking is only seeing growth in the 1% to 2% range.
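As a rough illustration of that authoring burden (the interface and property names here are hypothetical), classical property checking means hand-writing assertions and environment assumptions such as these SVA properties for a request/acknowledge handshake:

```systemverilog
// Hypothetical request/acknowledge handshake properties, written by
// hand as the classical property-checking flow requires.
module handshake_props (
  input logic clk, rst_n,
  input logic req, ack
);
  default clocking cb @(posedge clk); endclocking
  default disable iff (!rst_n);

  // Assertion: every request is acknowledged within 8 cycles.
  a_req_acked: assert property (req |-> ##[1:8] ack);

  // Assertion: a request stays asserted until it is acknowledged.
  a_req_held: assert property (req && !ack |=> req);

  // Assumption: the environment never acknowledges without a request.
  m_no_spurious_ack: assume property (ack |-> req);
endmodule
```

Writing these, and the constraints that make them provable, is exactly the expert effort the apps remove for their narrower problems.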

Loh: I would like to add a third item to the list of formal types. In the past, customers used to say that they had a lot of verification to do before they even got to formal, but not anymore. There is classical property verification, and while its growth is slower, it is one area where people can document the value. The return on investment (ROI) can be calculated. Beyond that, I see two categories of apps. One is apps that replace things that can be done today, but where the formal versions are a lot faster and better. People now have sign-off criteria for these types of apps. Examples are connectivity checking and control/status registers. The third category is the hard problems: CDC can check things that regular structural checking cannot, and low power is another example. These are hard to cover any other way.
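Connectivity is a good example of why these apps admit crisp sign-off criteria: under the hood, the tool is proving many simple point-to-point equivalences. A minimal sketch of one such check, with all names and paths invented for illustration:

```systemverilog
// Hypothetical generated connectivity check, bound into the design.
module conn_check (
  input logic clk,
  input logic pad_en,       // pad enable at the top level
  input logic pad_uart_rx,  // value at the top-level pad
  input logic blk_uart_rx   // the block pin the pad is supposed to reach
);
  // Prove the pad input reaches the block pin unmodified whenever the
  // pad is enabled. A connectivity app generates thousands of these
  // from a spreadsheet or IP-XACT description of the intended wiring.
  c_uart_rx: assert property (@(posedge clk)
      pad_en |-> (blk_uart_rx == pad_uart_rx));
endmodule

// Attaching the checker to a (hypothetical) top level:
// bind soc_top conn_check u_conn_check (
//   .clk(clk), .pad_en(pad_en),
//   .pad_uart_rx(pad_uart_rx), .blk_uart_rx(u_uart.rx));
```

Because each obligation is this small, a tool can prove or refute all of them exhaustively, which is what makes a sign-off criterion possible.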

Montecillo: One thing that is critical for us is coverage metrics. The opening of the coverage database is very important. To be able to measure where we are, and to combine formal coverage with simulation coverage into a single metric, is extremely important. In the past there was always an argument that it was difficult to see the value of formal, but now it becomes a lot easier. With model checking, while only a few percent of people are doing it, it is still a key area that we would like to grow. The ROI of model checking is big. The tools have been evolving, making it much easier for people with only a little formal background to use them. This is a big advance from just five years ago. Debug capability has also been growing rapidly. We are also seeing tool convergence between simulation and formal. An example is the waveform viewer that looks similar to those we have used in the past.

Tiwari: We need to look at the horizontals and the verticals. By horizontal I mean that formal has required an expert attempting to fulfill a goal. How does that fit into the overall verification flow? What is the confidence? How do we measure that it is playing a part in the overall verification cycle? Consider low power or CDC. Those checks used to be done in simulation as well, and they have been replaced by formal. That allows you to get to closure and through the verification cycle faster than other methods. Finding the coverage metric for each of these and tying it back to the original metrics is very important. When we look at formal, it has required a lot of expertise, and this makes it a niche domain. Staffing a project and managing it within an aggressive set of timelines is not easy. Also, how do you sign off on formal? People are looking for mechanisms that consistently achieve a goal in a specified timeframe and allow you to sign off on it.

Ashar: I would like to challenge that a little, in terms of formal being a niche. It used to be, and end-to-end formal is still difficult. But we no longer see formal as a tool. It’s an additional weapon in your armory to solve a problem. We address those problems.

Foster: Security as well. It cuts across everything, but with a clear focus.

Ashar: We address the problem using formal, simulation or deep semantic analysis. They can all be used in the right places. We no longer see ourselves as a formal company, but as a verification solutions company. In terms of simulation versus formal, in the old days when medicine was in its infancy, if there was any ailment, the treatment was bloodletting. Simulation is like that. ‘There is something wrong with my chip. I need to simulate it to find out what is happening.’ As we learn more about why things go wrong, the techniques have been more about how to narrow down the problem and find the root cause.

Foster: I agree, but I want to clarify. You said it takes a high level of expertise, and I have never found a successful organization that didn’t grow the required expertise when attempting to use end-to-end formal. It can be acquired, but it is not something you can hand to someone. This is no different than saying to someone, ‘Go do constrained random.’ You have to learn some necessary techniques. The other thing is that we are getting to the point in the industry where we talk about formal and yet many people using these tools don’t even have a clue that formal is happening in there. We are using formal technology to solve specific problems and they don’t even realize that.

SE: The industry has stopped calling something formal whenever it starts to get adopted.

Foster: Our industry did itself a disservice back in the ’90s when we introduced a lot of formal. We said that formal would replace simulation and it was oversold and under-delivered.

Ashar: This happens in a lot of spheres. You oversell in order to get investment. In one of our products we use a QBF (quantified Boolean formula) engine to solve a very hard problem. A few years ago you wouldn’t have thought about using that in a product, but things have come to a point where we can use these algorithms to make things more efficient. It is under the hood, and the user does not need to know what is running there.

Tiwari: You raise an interesting point. Waveforms from formal tools have become very close to what we have traditionally seen in simulation. People understand them and write specs in terms of them. That intuition has been built over the years. With formal apps we are bringing formal closer to this ‘normal’ world, and by that I mean it is not a niche. Formal property verification still remains a niche, requiring a significant amount of expertise in terms of design knowledge, modeling techniques, and abstractions. Modeling of properties really requires help from the tools before users can be successful with signoff. Low power, CDC, and connectivity apps have made the user’s life simple.

Foster: Solving relevant problems.

Ashar: We have heard from customers that the penetration of apps means they can focus their simulation efforts on system-level issues. That is an important benefit and provides a separation of concerns. A nice thing about simulation is that it always returns something to you and provides some insight into the design. Formal, in the past, was used as an end-to-end, push-button solution that was pass/fail. That is a recipe for failure because of the theoretical limits on how far you can go. But with all of the work that has happened over the past few years, we can provide something even if formal is not complete. When the result is bounded, it provides information about coverage, information about whether the framework in which you were verifying was over-constrained, and can even provide insight into the design. We are getting to the point where every run of formal provides something back that is actionable. This makes it as acceptable as simulation. A lot of work has also been invested in finding out what logic was exercised in a run so that it can be reported as coverage.

Foster: There is another thing going on here. We are getting to the point where people think of the technologies as complementary. We use formal to help close coverage in simulation. We use information from simulation to help us get over some of the state explosion problems in formal. They help each other and it is no longer us versus them.

Tiwari: Maybe it happened over the years, but when we talk about simulation, it is inherently incomplete. We never set a target of achieving 100% coverage in simulation; there is a practical 100% and a theoretical 100%. With formal, we went down a hole where we thought it was full-formal signoff or nothing. Where is the practical 100% for formal? Coverage is a good example, where bounded results enable you to get some amount of coverage, and capturing bounded coverage in terms of actionable stimulus adds to your existing simulation, even at the system level. The cross-pollination between the two worlds enables you to achieve something even bigger in the verification process at the system level.

Montecillo: That is where we spend a lot of energy today: telling the team that there are a lot of benefits to formal that enhance what simulation can achieve. We never preach that simulation will be replaced by formal. Whatever formal can do for a group is good. Writing assertions, and the way they can be visualized, enables counterexamples to be found. Even without knowing how to write constraints or assumptions, designers can provide some behavioral syntax and get a trace. That is powerful for a designer. It gets their foot in the door with formal, and then they start writing assertions. There has always been a debate over whether designers should write assertions. We feel the designer is the best person to write them, but they need the motivation. With visualization, and being able to see what an assertion does, they have the motivation, and the benefits start to snowball. It is not all-or-nothing. Even a small investment in formal returns a huge benefit.
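A sketch of that low-cost entry point (the signal names are hypothetical): instead of writing constraints, a designer drops in a cover property and asks the formal tool for a witness trace showing how the behavior can be reached.

```systemverilog
module fifo_explore (
  input logic clk, rst_n,
  input logic fifo_full, fifo_empty
);
  // Hypothetical designer-written cover: "show me a trace where the
  // FIFO goes full and later drains." No constraints are needed; the
  // formal tool returns a witness waveform if the behavior is
  // reachable, which is the visualization entry point described above.
  c_fill_then_drain: cover property (@(posedge clk) disable iff (!rst_n)
      fifo_full ##[1:$] fifo_empty);
endmodule
```

If the tool finds the scenario, the designer gets a waveform to inspect; if it proves the scenario unreachable, that is often a bug report in itself.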


