After years of struggling to find a foothold, formal verification is being pushed into the mainstream by a number of technological and business advancements.
By Ann Steffora Mutschler
Formal verification technology, also known as formal property checking, has existed since the early 1990s. Still, it is only in the past five years that it has made big strides in its capacity to handle larger pieces of a design, leveraging advances in computing as well as improvements to the algorithms themselves.
These factors, along with new approaches to implementing the technology—hiding some of the assertions from engineers and automating them—have pushed formal verification well into the mainstream. In the past, EDA vendors attempted to force engineers to learn assertions. They now tailor the tools to the engineers’ way of doing things.
“You had to ship a Ph.D. with the tool,” recalled Harry Foster, chief scientist for verification at Mentor Graphics and an early developer of formal property checking at Hewlett-Packard. “What gradually happened was we moved from formal property checking requiring a formal expert to these automatic applications, which under the hood was formal. Quite often the user doesn’t even know it’s formal.”
Today, formal applications can automatically extract properties or assertions and then apply the formal engines, making the process pushbutton. That shift has unfolded over the past five years.
Technically speaking, formal verification involves three steps: specification, the analysis technology, and debug, which helps process the results and get to the source of the problem.
“When you think about these three steps basically the sum and total of what formal verification is, is part process and part technology—it’s not just about the algorithms,” explained Pranav Ashar, CTO of Real Intent. “There is also a process side to it. For formal verification to be meaningful the verification specification needs to be ready upfront and it needs to be precise.”
The analysis technology—the core algorithms—then takes the combined verification specification and design specification and creates a model to systematically go through the entire functional space and prove the properties written in the specification. The technology differs from simulation in that it is systematic. It keeps track of what it has done before, so the next time it does something it is in a different part of the functional space. As a result, it systematically tries to cover everything.
It also has the notion of closure, so it knows when it has completely covered all aspects of the design functionality. This is an enormous task, and the algorithms include many efficiencies to make the complete analysis feasible.
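To make the ideas of systematic coverage and closure concrete, here is a minimal explicit-state sketch in Python. It is not drawn from any of the tools discussed in this article; production formal engines work symbolically on vastly larger state spaces, and the function names and the toy counter model below are invented purely for illustration.

```python
from collections import deque

def check_safety(initial_states, next_states, property_holds):
    """Exhaustively explore a finite-state model and prove a safety property.

    initial_states: iterable of hashable states
    next_states:    function state -> iterable of successor states
    property_holds: function state -> bool (the property to prove)

    Returns (True, None) if the property holds in every reachable state,
    or (False, bad_state) with a reachable counterexample state.
    """
    visited = set(initial_states)      # states already analyzed
    frontier = deque(visited)          # states still to expand
    while frontier:                    # closure: the loop ends when no new states remain
        state = frontier.popleft()
        if not property_holds(state):
            return False, state        # counterexample found
        for nxt in next_states(state):
            if nxt not in visited:     # never revisit behavior already covered
                visited.add(nxt)
                frontier.append(nxt)
    return True, None                  # every reachable state was checked

# Toy example: a counter that must never reach the value 7.
ok, cex = check_safety(
    initial_states=[0],
    next_states=lambda s: [(s + 1) % 7],  # this design happens to skip 7
    property_holds=lambda s: s != 7,
)
print(ok, cex)  # True None: the property is proven, not just sampled
```

The point of the sketch is the termination condition: the loop ends only when no unexplored states remain, which is what lets a formal tool claim a proof rather than a sample of behavior.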
“The historical challenge has been writing these assertions and these properties upfront in a precise manner,” said Ashar. “It’s always been a challenge because for formal verification and analysis to do something meaningful, the specification has to be precise. Otherwise it is going to give you a noisy result.”
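As a hypothetical illustration of what “precise” means here, an informal requirement such as “the arbiter behaves correctly” gives an engine nothing to prove, while a statement about specific signals does. The signal names below are invented, and real properties are normally written in an assertion language such as SVA rather than Python; this simply restates a requirement as a checkable predicate in the style of the sketch above.

```python
# Vague: "the arbiter behaves correctly."  There is nothing an engine can prove or refute.
# Precise: "grant is never asserted unless a request was pending in the previous cycle."
# Modeled here on states of the form (req_prev, grant), purely for illustration.

def grant_only_after_request(state):
    req_prev, grant = state
    return (not grant) or req_prev  # grant implies req_prev

# A predicate this precise can be handed to an exhaustive checker like the one sketched
# above; a loose specification cannot, which is why it yields noisy results.
```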
Clearly, this is not a technology for beginners. The customers leading formal verification deployments have all adopted the strategy of a ‘formal expert team.’ They usually start with one expert and expand the team over time, and the team’s primary job is formal verification.
“They become expert with the tools and the technology and also the application areas,” said Dan Benua, principal corporate applications engineer at Synopsys. “These people are hard to find and they are very focused and specialized. Typically the formal team will partner with a design team or a design verification team. The people who actually write the design generally are not formal experts. They don’t want to become formal experts or they can’t afford to become formal experts because they just don’t have the time. However, the companies that have built these formal teams have managed to justify them on an ROI basis. The people who do this conduct a lot of careful analysis of where the bugs are being found, what techniques are finding the bugs, how much effort is invested versus how much time it takes to reach their verification goals. And they can point to quantifiable benefits that they get from these formal teams.”
For the past 10-plus years, the formal verification segment of the EDA industry has been beating its head against the problem of how to make the market grow and how to get more users beyond this handful of experts.
“The strategy that everybody has settled on is this app concept where we have specialized use of formal verification that solves a specific problem for the user without forcing them to write assertions or have the expertise on formal algorithms,” said Benua. “In terms of the number of formal users or growth opportunities for the industry, these applications are really where it is at. They are not replacing the formal experts. The formal experts are still there and the formal experts are still trying to figure out how to broaden their utility beyond the number of people they have on their team. The app concept is really what we hope is going to broaden the use of formal. The flip side of it is that the areas it addresses are very narrow. There are small pieces of the verification problem, like connectivity checking, that all the vendors and customers are talking about.”
This approach appears to be working. A large industry study conducted last fall by the Wilson Research Group on functional verification showed a 68% increase in adoption of formal property checking in the past five years. In 2007, only about 19% of the industry claimed to be doing formal property checking. Today that number is about 32%.
“But it’s even better than that,” Mentor’s Foster said. “If you look at designs that are larger than 20 million gates you will find it even higher—about 43% adoption, which makes sense if you think about very large designs. Quite often they have large teams that have a lot of resources, a lot of money, and they will try everything they can to get that thing right.”
Oz Levia, vice president of marketing and business development at Jasper Design Automation, agreed. “What we’re finding is that the app strategy is very successful. There is an increasing number of customers who are choosing to adopt formal solutions piecemeal. They start with connectivity and they adopt X-propagation, etc.”
The other benefit of the app strategy is that because the verification question for each application is constrained, a lot more automation can be added. Property generation from the spec can be automated, which many feel removes a major barrier to adoption. “This works because it segments the problem and allows us to add automation,” Levia added.
A shift that is happening now, according to Foster, is addressing SoC integration with formal verification technology. “When you get into SoC integration where you have lots of IP, bringing them all together for the first time is where the problem is emerging today. This is an area where these formal applications can be used. One of the challenges when you integrate all these IPs is do you have it connected correctly between the IPs? In the past people used to write a whole boatload of directed tests of simulation to try to verify this and it would take forever. This is a good example where you can apply formal technology…and it can uncover these problems in a matter of minutes or hours compared to weeks.”
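To see what such a connectivity-checking application automates, here is a hypothetical sketch: the intended connections come straight from a spec table, one property per connection is generated automatically, and each is checked against the integrated design. The spec format, pin names, and dictionary netlist below are invented for illustration; a real app proves each connection through whatever logic sits between the pins, under reset and clock-gating conditions, using formal engines rather than a lookup.

```python
# Intended top-level connections from the integration spec: (source pin, destination pin).
connection_spec = [
    ("cpu.axi_m0",  "noc.s0"),
    ("noc.m1",      "ddr_ctrl.axi_s"),
    ("gic.irq_out", "cpu.irq_in"),
]

# Toy "netlist": which destination pin each source pin actually drives after integration.
netlist = {
    "cpu.axi_m0":  "noc.s0",
    "noc.m1":      "ddr_ctrl.axi_s",
    "gic.irq_out": "dma.irq_in",   # a hookup bug the generated check should catch
}

def generate_property(src, dst):
    """Auto-generate one connectivity property: src must drive dst."""
    return lambda design: design.get(src) == dst

# No hand-written assertions: one property per spec row, each checked against the design.
for src, dst in connection_spec:
    prop = generate_property(src, dst)
    if not prop(netlist):
        print(f"CONNECTIVITY FAIL: {src} should drive {dst}, actually drives {netlist.get(src)}")
```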
Further, he said, SoCs today contain a large mix of hardware and software, so there is interest in applying formal techniques to help ensure that software can properly talk to the hardware and the various IPs. On the SoC integration side, software is definitely an important area for research because it is a tougher problem in many respects.
Foster also believes there are opportunities to apply formal technology to virtual prototyping, but some technical challenges need to be overcome first.
“Just like gate-level simulation ran out of gas to be replaced by RTL simulation, we are at a very similar point in RTL simulation now where we are starting to stress it because the designs are so big, they have so much IP. That’s part of the reason you see this huge growth in emulation today—you simulate as much as you can but at some point you’re going to have to go to emulation, particularly if you have any software. And all of these SoCs have software,” Foster added.
Other areas for applying formal verification include security path verification, low-power verification, verification of architectural specifications, verification of system-level deadlock for complete SoCs, and even linking formal verification to system-level specifications, Levia concluded.