Formal’s Roadmap

Experts at the Table, part 3: The breadth of adoption of formal, technology breakthroughs and the challenges created by machine learning.


Formal verification has come a long way in the past five years as it focused on narrow tasks within the verification flow. Semiconductor Engineering sat down to discuss that progress, and the future of formal technologies, with Raik Brinkmann, president and CEO of OneSpin Solutions; Harry Foster, chief verification scientist at Mentor Graphics; Pete Hardee, product management director at Cadence; Manish Pandey, chief architect for new technologies at Synopsys; Anupam Bakshi, CEO of Agnisys; and Vigyan Singhal, president and CEO at Oski Technology. What follows are excerpts of that conversation. Part one can be found here. Part two is here.


SE: When Mentor started the verification surveys, SystemVerilog and UVM were going great guns even though 80% of the industry was still doing directed tests. We do tend to concentrate on the most advanced users. How broad is the adoption of formal? Is it still just the top users?

Foster: I partitioned the survey data by design size: less than 5M gates, 5M to 80M gates, and greater than 80M gates. It wasn't a perfect 1/3, 1/3, 1/3 split, but the interesting thing was that the smaller the design, the more likely it was to have a respin. I filtered out the analog issues. Then I looked at adoption of the various techniques, such as code coverage, and the larger the design, the greater the adoption of these techniques. With the really advanced designs, we are at over 50% adoption. Let me talk about some companies that are not doing formal property checking. They are still likely to be using some formal applications. These companies tend to be buying a lot of IP and doing integration, but they are still using formal for connectivity. Companies that are developing internal IP tend to be the ones adopting formal.

Pandey: Every new design that is started today, not just a technology remap, is planning to use formal. They are likely to do property verification at the block level and use other formal apps. There is still some degree of pain associated with doing property verification, but the pain threshold is much higher now: the efficiency is better and the value is higher. People see more value and are willing to be more patient and tolerant.

Singhal: People are getting beaten up these days when they do not use formal, which is a reversal of the past.

Foster: Look at the FPGA space. Only about 20% of FPGAs actually work when delivered to their customers. In certain markets, such as automotive and aerospace, we can't revalidate without opening the whole box, so the cost is tremendous. We are starting to see adoption of formal in FPGA design, which is something I never thought would happen. We are close to 20% adoption of formal property checking in FPGA designs now.

Brinkmann: We see a lot of adoption there for equivalence checking as well. We do sequential equivalence checking for synthesis into the FPGA, and that is why they buy it. It is not cheap. Earlier, people would say it is an FPGA, just rerun it, but that does not work anymore. You cannot open it up and change the FPGA without tremendous cost.
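At its core, an equivalence check builds a "miter": the two implementations are fed the same inputs and a solver searches for any input on which they disagree. Here is a minimal sketch of that idea using the Z3 SMT solver's Python API; it is an illustration only, with made-up toy expressions standing in for the RTL that commercial sequivalence checkers actually ingest.

```python
# A toy combinational equivalence check using the Z3 SMT solver
# (pip install z3-solver). Real sequential equivalence checkers work on
# full RTL; this only illustrates the miter idea.
from z3 import BitVec, Solver, sat

WIDTH = 16
x = BitVec("x", WIDTH)

# "Golden" model: multiply by 3.
golden = x * 3
# "Revised" model after a synthesis-style rewrite: shift-and-add.
revised = (x << 1) + x

s = Solver()
# Miter: ask for any input where the two implementations differ.
s.add(golden != revised)

if s.check() == sat:
    print("NOT equivalent, counterexample:", s.model())
else:
    print("Equivalent for all 2^%d inputs" % WIDTH)
```

An "unsat" answer here is a proof over all 65,536 inputs, which is the key difference from rerunning a handful of simulation vectors.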

Foster: When is the problem more painful than the solution? We have crossed that point and the solution has become less painful.

SE: A few years ago there was a breakthrough in formal algorithms that gave everyone a significant performance boost. What happens when you can't keep providing gains like that? How will formal avoid these issues?

Foster: Narrow applications mean that more problems become tractable.

Singhal: If you look at the entire space of computer science, including graphics, computer vision, programming languages, and databases, the top prize is the Turing Award. This is our equivalent of the Nobel Prize. In the last 20 years, three of the award winners have been in the space of formal verification. The fact that 15% of the top people in computer science are working on formal, while less than 0.1% of programmers are concentrated there, shows that the field is very hard, but in spite of that the breakthroughs are huge, and they have translated into a lot of value for end users. The breakthroughs are so great that they are allowing us to tackle harder problems. The pace of evolution has been dramatic.

Pandey: From the engines point of view, we are endlessly pushing the boundary to provide dramatic improvements. But a user wants to read in a design and wants results that are consistent with simulation, so we have to think about all aspects of verification. We have to think about coverage. We need a unified view. These are important issues for adoption. You cannot allow formal verification to be an island. It takes a lot of hard engineering work to make sure we can ingest the same design, or the same design database, and that we can debug within the same framework. Both the engines and the interfaces have to be seamless for formal to be successful.

Brinkmann: We are relying a lot on academia to provide the breakthroughs. One thing we must do is give academia the resources to do it; we have to support their efforts and make it sexy to work there. We see a lot of computer science professors say that it is not as interesting anymore, because EDA is not seen as interesting and there is not much money to be made there. We have to invest in the basic research and continue making it attractive to bright people. That is the way to keep moving forward.

Pandey: I was reading about artificial intelligence (AI) and in fact formal arose from AI. So we have to foster some of that in academia.

Foster: There is not a single solution that will solve everything. It is a continuum in terms of formal and how it relates to simulation. There are still things I need to verify, such as performance, where formal is less effective. It is much better, perhaps, to use data-mining techniques at those levels. I hate to just focus on tools. We have to think about the whole problem. How do we decide what to use for which problem? It matters whether I have the right candidates and the correct resources.

Hardee: There is a lot of scope left for formal before it hits further limits. We have made a lot of breakthroughs in terms of capacity. Common compilation and elaboration is vital, and that is probably where the capacity limit is. As far as the engines are concerned, formal is fairly easy to parallelize compared with simulation. We have engine orchestration that enables properties to be proven in parallel, and this makes efficient use of the compute farm. We have various metrics which show, as far as utilization is concerned, a 2X benefit for formal over simulation. There is plenty of room left to grow there, as well.
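The orchestration point is easy to sketch: each property is an independent proof obligation, so proof jobs can be farmed out to separate workers. The toy Python sketch below uses a process pool with small Z3 checks standing in for proof engines; the property set and the prove() helper are invented for illustration and are not any vendor's API.

```python
# A minimal sketch of engine orchestration: independent properties are
# proven in parallel across worker processes, the way a formal tool farms
# jobs out to a compute cluster. Toy Z3 checks (pip install z3-solver)
# stand in for real proof engines.
from concurrent.futures import ProcessPoolExecutor
from z3 import BitVec, Not, Solver, unsat

def prove(job):
    """Worker: each process builds and checks its own toy property."""
    name, width = job
    x, y = BitVec("x", width), BitVec("y", width)
    properties = {
        "add_commutes":  x + y == y + x,
        "xor_self_zero": (x ^ x) == 0,
        "shl1_is_mul2":  (x << 1) == x * 2,
    }
    s = Solver()
    s.add(Not(properties[name]))   # search for a violating input
    return name, ("proven" if s.check() == unsat else "falsified")

if __name__ == "__main__":
    jobs = [("add_commutes", 32), ("xor_self_zero", 32), ("shl1_is_mul2", 32)]
    # Independent proofs run concurrently, keeping the farm busy.
    with ProcessPoolExecutor() as pool:
        for name, verdict in pool.map(prove, jobs):
            print(f"{name}: {verdict}")
```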

SE: Neural networks are seeing a lot of interest these days. These create challenges and opportunities for verification. How do you verify what has been machine learned? Does formal have any opportunities from neural technologies?

Singhal: It is a tremendous area, and it has been significantly underserved by commercial tools. The challenge is that as formal gets more adoption, you will have to run regressions incrementally as designs change. If you can learn what formal did on the same designs yesterday, last week, or two weeks ago, when they were 95% the same as the design today, you can apply the results from the past. If you are running exactly the same design, people still run the regressions overnight even if nothing changed. There is so much time wasted and so much learning that could be done. IT departments are concerned with the cost of compute, and the farms are growing out of hand. If I can manage that better by learning, be it for formal or for simulated designs, I can predict what needs to be done.
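A minimal sketch of the incremental idea Singhal describes: key each property's result by a fingerprint of the design sources it depends on, and rerun the proof only when that fingerprint changes. The helper names here (run_formal_engine, the cache layout) are hypothetical stand-ins, not a real tool's interface.

```python
# Incremental formal regression sketch: skip proofs whose inputs have not
# changed since the last run. Assumes each property's result is fully
# determined by the sources it depends on (its cone of influence).
import hashlib, json, pathlib

CACHE = pathlib.Path("formal_cache.json")

def design_hash(source_files):
    """Fingerprint the RTL files a property depends on."""
    h = hashlib.sha256()
    for f in sorted(source_files):
        h.update(pathlib.Path(f).read_bytes())
    return h.hexdigest()

def run_regression(properties, run_formal_engine):
    """properties: {prop_name: [source files]};
    run_formal_engine: caller-supplied, stands in for the expensive proof."""
    cache = json.loads(CACHE.read_text()) if CACHE.exists() else {}
    for prop, sources in properties.items():
        key = prop + ":" + design_hash(sources)
        if key in cache:
            print(f"{prop}: unchanged, reused -> {cache[key]}")
            continue
        cache[key] = run_formal_engine(prop, sources)  # only when needed
        print(f"{prop}: proven fresh -> {cache[key]}")
    CACHE.write_text(json.dumps(cache))
```

Real learning-based approaches go further, predicting which engines and bounds to try, but even this hash-and-skip step avoids the overnight rerun of an unchanged design.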

Foster: It is a big data analytics problem, and one of the more interesting areas of EDA.

Brinkmann: How can we benefit from machine learning, and what formal can contribute to the verification of neural networks, are two very difficult questions. It is a research area today. The first question you have to ask is, “What do you want to know about it? What does correct mean?” Automotive is really big into sensor fusion and computer vision on chip and they need these techniques to make things work and so the question comes up, “How do you verify it?”

Bakshi: Isn’t this a software issue and not a hardware issue?

Brinkmann: You have to bring the computing to the sensor, and that is not done in software; it is done in hardware for power reasons. People take the neural networks onto chips and implement them in silicon. That is how you get really low power and the density you need. The sensors produce so much data that you can't get it off chip quickly enough or analyze it in a data center, because of latency and power. So it all happens in the sensor, and this is where it becomes silicon. That means it is an EDA and hardware verification issue.

Hardee: We are seeing formal applied in the traditional places: to control logic, to protocols, and to data transport, but not to data transformation. We are pushing back the boundaries, and we are seeing formal being useful in data transformation as well. So as performance has increased, we are able to take on more of the problem.

Brinkmann: Formal is quite good in these settings. If you analyze a pipelined filter function using fixed-point arithmetic and want to know the conditions that would cause an overflow, it is very hard to come up with the vectors to do that. But you can run formal fairly quickly to come up with a sequence of inputs that lets you explore these corner cases. That is an area where you can use formal in conjunction with simulation very effectively.
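As a toy illustration of the overflow hunt Brinkmann describes, the question can be posed directly to an SMT solver: does any input sequence drive a fixed-point accumulator outside its range? The 3-tap filter, its coefficients, and the bit widths below are invented for the example, sketched with Z3's Python API rather than a commercial formal tool.

```python
# Bounded search for a fixed-point overflow (pip install z3-solver).
# The filter and widths are made up; the point is that the solver finds
# the offending input sequence instead of a human guessing vectors.
from z3 import BitVec, SignExt, Solver, Or, sat

TAPS = [100, 200, 100]   # hypothetical coefficients, big enough to overflow
N = 8                    # bound on the length of the input sequence
xs = [BitVec(f"x_{t}", 8) for t in range(N)]   # 8-bit signed samples

overflow_at = []
for t in range(len(TAPS) - 1, N):
    # Exact 32-bit value of y[t] = sum(TAPS[k] * x[t-k]).
    exact = sum(SignExt(24, xs[t - k]) * TAPS[k] for k in range(len(TAPS)))
    # Overflow if it leaves the 16-bit signed range of the accumulator.
    overflow_at.append(Or(exact > 32767, exact < -32768))

s = Solver()
s.add(Or(*overflow_at))  # does ANY step overflow?
if s.check() == sat:
    m = s.model()
    print("Overflowing input sequence:", [m[x] for x in xs])
else:
    print("No overflow possible within", N, "steps")
```

The counterexample sequence the solver returns can then be replayed in simulation, which is exactly the formal-plus-simulation combination described above.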

Related Stories
Formal’s Roadmap
Part 1: Panelists look at the progress made by formal in the past five years and where we can expect to see new capabilities emerge.
Formal’s Roadmap
Part 2: The need for a formal specification, coverage and progress toward formal verification for analog.
Plugging Holes In Machine Learning
Part 2: Short- and long-term solutions to make sure machines behave as expected.


