Verification In Crisis

How can formal verification help?


Why is it still so hard to ensure high-quality sign-off without leaving bugs behind in silicon? The answer, according to my colleagues at DVCon, is highly nuanced. The industry has been improving overall, but so has the complexity of designs.

For ASICs, 74% of the designs surveyed in the recent Wilson Research Group/Siemens EDA report have one or more processor cores, 52% have two or more, and 15% have eight or more, a trend we also see in our own experience of deploying formal verification. Remarkably, 32% of the ASICs have AI cores and 30% have RISC-V processors, 58% of projects are building safety-critical chips, and a staggering 60%+ of ASIC projects require ISO 26262 compliance. Automotive has been a fast-growing sector for ASICs. Security verification has also seen a big jump, with 71% of FPGA and 75% of ASIC projects addressing security and safety concurrently. 72% of ASICs reportedly manage power actively, making power management verification an important challenge.

The report shows that 66% of ASIC and 70% of FPGA-based projects are behind schedule. A staggering 76% of ASICs require one or more respins. At the same time, the ratio of verification to design engineers has reached 1:1. A previous report from 2020 indicated that 68% of FPGA projects were behind schedule, while for processor design houses the ratio of verification to design head count is 5:1. Compared to 2007, verification engineer head count has grown 145%, while design engineer head count has grown 50%. So, despite verification head count growing nearly three times faster than design head count, why are 66% of ASIC and 70% of FPGA projects behind schedule? And why do 76% of ASICs require one or more respins?

Only 13% of ASIC designs with 10M to 1B gates succeed the first time, a range that covers virtually anything you would call a laptop, PC, server, AI chip, or smartwatch. The most staggering finding is that the largest cause of ASIC respins is logic/functional flaws, which account for 62% of respins. Had formal methods been adopted on time with the correct methodology, these respins could have been avoided entirely.

Verification adoption trends

It is no secret that UVM and directed testing dominate the verification landscape.

While assertion usage in dynamic simulation has increased, assertion-based formal verification has yet to become mainstream. The lack of widespread formal verification adoption is a challenge we must address, and the report's statistics should be an eye-opener for all of us.

Verification beyond doubt using formal methods

We need to find corner-case bugs earlier, prove they don't exist, and ensure that the silicon in our planes and cars is safe and secure. We cannot afford an Ariane 5 explosion, a GoFetch, a Meltdown-type security scenario, or an FDIV bug, now that silicon pervades so much of our lives.

Formal verification is the only way to build a proof of bug absence. I believe that formal is becoming mainstream now, but I’ll turn to the Wilson survey to support this. I’m glad that Siemens EDA has continued sponsoring this study, initially started by Mentor Graphics, and that Harry Foster once again has posted an excellent series of blog posts summarizing the key results.

The finding of most interest to me is that more than a third of ASIC projects now use formal property checking as part of their verification flow (see figure 1). What needs to be captured is the impact of this adoption on bug hunting and bug avoidance. It would be interesting to find out whether customers would be willing to disclose this in a more open survey.

Fig. 1: Formal verification adoption on ASICs.

This may sound like a small percentage, but many relatively small ASIC designs are still being done in older process technology nodes.

One trend clear from the study is that the adoption of all advanced verification techniques increases with design size. About half of all ASIC projects larger than 10M gates use property checking. Formal achieving half the adoption of ubiquitous simulation is a significant accomplishment!

Adoption of automatic formal techniques is somewhat lower than that of property checking. There are several important applications, including clock/reset domain crossing (CDC/RDC) and connectivity checking, where users don't have to write any assertions to get the benefit of formal proof. Some tools categorize these checks as advanced linting, even though there's a formal engine under the hood, so the gap may simply reflect how users report their usage.
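To illustrate what such a tool does under the hood (this sketch is not tied to any particular product, and all module, instance, and signal names here are hypothetical), a connectivity check is conceptually a tool-generated assertion that a top-level pin always mirrors its intended source. Written by hand in SVA, it might look like:

```systemverilog
// Hypothetical connectivity check: the SoC-level pin irq_pad must
// mirror the interrupt output of the embedded core on every cycle.
// Automatic formal tools generate checks like this from a connectivity
// specification; the user never writes the assertion.
module connectivity_check (
  input logic clk,
  input logic core_irq,   // source: interrupt output inside the core
  input logic soc_irq     // destination: top-level SoC pin
);
  assert property (@(posedge clk) soc_irq == core_irq);
endmodule

// Bound into the design non-intrusively, without modifying the RTL
bind soc_top connectivity_check u_conn_chk (
  .clk      (clk),
  .core_irq (u_core.irq_out),
  .soc_irq  (irq_pad)
);
```

The formal engine then proves the equality holds for all reachable states, rather than for the handful of cycles a directed test would exercise.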

Property checking requires writing properties, so I was pleased to see that almost 70% of ASIC projects (see figure 2) have adopted assertions, with SystemVerilog Assertions (SVA) being the most widely used format. The study shows more projects using assertions than constrained-random stimulus generation, which lies at the heart of the Universal Verification Methodology (UVM) and every other modern testbench approach.
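As a flavor of what "writing properties" means in practice, here is a minimal SVA sketch (signal names `clk`, `rst_n`, `req`, and `gnt` are hypothetical) expressing a common handshake intent: every request must be granted within a bounded number of cycles.

```systemverilog
// Minimal SVA sketch: every request must receive a grant
// within 1 to 4 clock cycles after it is asserted.
module handshake_sva (
  input logic clk,
  input logic rst_n,
  input logic req,
  input logic gnt
);
  property p_req_gets_gnt;
    @(posedge clk) disable iff (!rst_n)
      req |-> ##[1:4] gnt;
  endproperty

  // The same assertion can be checked in simulation against the
  // stimulus that happens to run, or handed to a formal tool to be
  // proven for all possible input sequences.
  a_req_gets_gnt: assert property (p_req_gets_gnt);
endmodule
```

The dual use is the point: a property like this is a simulation checker and a formal proof target at once, which is why assertion adoption matters so much for formal.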

Fig. 2: SVA adoption in dynamic simulation for ASICs.

Since only half of the projects using assertions also use formal verification, many assertions are still running only in simulation. Not every assertion is suited to formal analysis, but these design and verification teams have already done the hard part: translating design intent into assertions. They should give formal property checking a try.

FPGA developers have yet to catch up with their ASIC counterparts in adopting advanced verification methods, including formal techniques. Historically, FPGA designs have been smaller and less complex than ASICs, and FPGAs can be reprogrammed in the bring-up lab, and even in the field, when bugs are found. This has generally meant less emphasis on pre-silicon verification and a greater reliance on lab testing.

Fig. 3: Adoption of dynamic simulation techniques in FPGA.

This is changing dramatically. The largest FPGAs can be system-on-chip (SoC) designs with multiple processors and specialized accelerators. Finding the source of a bug in the bring-up lab can take days or even weeks, with long reprogramming loops to try out possible fixes. The result is that FPGA teams increasingly use the same verification tools and methodologies as ASIC developers.

The Wilson study shows that more than 40% of FPGA projects have adopted assertions (see figure 3), and, as with ASICs, assertions are used more than constrained-random testbenches. More than 15% of FPGA projects use automatic formal tools (see figure 4), and almost 25% use formal property checking. Interestingly, the ratio of assertions to formal is smaller for FPGAs than for ASICs, meaning less use of simulation-only assertions.

Fig. 4: Adoption of formal techniques in FPGA.

The driver for the increased use of formal verification is clear: getting chips right in first silicon keeps getting harder. The study reports that 76% of ASICs require one or more respins (re-fabrications) before they are ready for production use.

Every spin adds millions of dollars to the project cost and months to the schedule, and that’s not even counting the tens or hundreds of millions of dollars in lost revenue if the end product is late to market compared to the competition.

The study also finds that 84% of FPGA designs have non-trivial bugs that escape into production use. Reprogramming or replacing these devices in the field, where it is even possible, is a very expensive undertaking. More than half of all FPGA designs in production have logic or functional bugs, all of which better application of formal verification could catch. Similarly, more than half of ASIC respins are due to logic or functional bugs.

Formal verification: A growing necessity

Having watched the formal market closely for so long, I see the increased attention paid to this technology as another telling metric. All three major EDA vendors have successful formal products and active R&D efforts supporting a growing user base with increasing needs and requirements. At Axiomise, we see steady, high demand for formal experts to supplement internal teams and establish methodologies and best practices.

I’ve had the pleasure of travelling around the world to discuss formal methodologies and help users succeed with commercial tools. I’ve seen first-hand how powerful formal techniques can be and how excited users get when they catch bugs that would have been just about impossible to find otherwise. When they achieve proof of design correctness, they have the confidence to tape out without worrying about re-spins due to functional bugs.

For all these reasons, it's exciting to see how far formal verification has come in capacity and usability, and how much adoption has grown. If you haven't yet tried formal verification, I urge you to do so. Automatic formal tools are push-button after minimal setup, and if you already have assertions for simulation, it's easy to try them in a property-checking tool. I predict that you'll be pleased with the results and will report your own experience in the next Wilson study. Overall, I believe the crisis in verification would be mitigated in many cases if formal verification were adopted early and consistently throughout the full design verification flow.
