Adapting Formal

Formal technology has long been accepted as a powerful verification tool. It’s moving beyond that now.


With growing pressure to make designs efficient, from a power perspective as well as from an overall design view, finding what can be removed from a design is an increasingly important step. As discussed in the article published today, “What Can Be Cut From A Design?”, sequential analysis based on formal verification technology is gaining traction.

I specifically asked Mentor Graphics’ director of engineering, Abhishek Ranjan, how the formal verification technology was adapted for sequential analysis.

He explained that initially, it was Calypto that started with sequential formal verification. [Mentor Graphics acquired the company in 2015.] “The company was built on the foundation of being able to formally verify the functional changes. Prior to this, most of the equivalence checkers/formal checkers were just combinational — they would treat the flop boundaries as sacrosanct and would verify the changes made within the flop boundaries.” He believes Calypto was the only company with technology to verify sequential changes such as retiming, re-pipelining, sequential clock gating, and flop cloning.
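To see why combinational checkers break down here, consider retiming. The Python sketch below is illustrative only (not Calypto's technology, and the function names are hypothetical): two models compute the same sequential function, but one moves an adder to the other side of a flop. Comparing the logic cone-by-cone between flop boundaries would report a mismatch, while comparing cycle-by-cycle behavior shows the designs are equivalent.

```python
import random

def original(xs):
    """y[t] = (x[t-2] + 1) * 2: add-then-double sits entirely after the first flop."""
    r = 0           # flop 1: x delayed by one cycle
    r2 = 0          # flop 2: result of the combinational cone (r + 1) * 2
    out = []
    for x in xs:
        out.append(r2)
        r2 = (r + 1) * 2   # cone between the flops: add, then double
        r = x
    return out

def retimed(xs):
    """Same sequential function, but the '+1' is moved before the first flop."""
    r = 1           # flop 1 now stores x + 1; reset value adjusted to match
    r2 = 0
    out = []
    for x in xs:
        out.append(r2)
        r2 = r * 2         # cone between the flops is now just the double
        r = x + 1
    return out

xs = [random.randrange(16) for _ in range(1000)]
assert original(xs) == retimed(xs)   # cycle-accurate outputs match
# A purely combinational checker comparing the cones between flop boundaries
# would flag '(r + 1) * 2' vs 'r * 2' as different functions of their inputs.
```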

Ranjan explained that as customer designs came in for verification, Mentor’s verification team began seeing that many of the changes being made were for power. Design teams were trying to reason across flop boundaries to figure out where additional gating conditions could be placed on flops or memories to reduce power, and they wanted verification support for that. While the tool was capable of verifying those changes, there was no automated process in place.

“We figured that since a lot of people were doing it, it made sense to automate the process, and the idea of automating this led to the sequential analysis: the discovery that the way you formally verify it can also be used to generate these gating conditions. Then, we went into a discovery path for sequential analysis where redundant activity could be analyzed,” he said.
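As an illustration of the kind of gating condition Ranjan describes, here is a minimal, hypothetical Python sketch (not Mentor's tool or flow): a flop is gated with a stability condition, skipping the write when the incoming data equals the stored value, and the gated and ungated models are compared over random stimulus. A formal tool would prove the equivalence for all input sequences rather than sampling them, and would derive the gating condition automatically.

```python
import random

def ungated(xs):
    """Flop clocked every cycle: q tracks x with one cycle of delay."""
    q, out = 0, []
    for x in xs:
        out.append(q)
        q = x                        # write every cycle, even when x == q
    return out

def gated(xs):
    """Same flop with a stability gating condition: only clock it on a change."""
    q, writes, out = 0, 0, []
    for x in xs:
        out.append(q)
        if x != q:                   # gating condition: new data differs from stored
            q, writes = x, writes + 1
    return out, writes

xs = [random.randrange(4) for _ in range(10_000)]
gated_out, writes = gated(xs)
assert ungated(xs) == gated_out      # outputs match cycle for cycle
print(f"{writes}/{len(xs)} cycles actually needed a clock edge")
```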

The real clincher, Ranjan added, is that as soon as you figure out that one type of redundancy can be analyzed, the scope of applying it to other kinds of redundancies becomes unlimited. If you can reason about any redundant toggle, whether it happens on a clock, flop, memory, or datapath, you can pretty much figure it out for any other design object.
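In the same spirit, redundant activity on any design object can be profiled. The hypothetical sketch below scans a simulated capture trace of one flop and counts cycles where the flop was clocked but stored an unchanged value; sequential analysis does this reasoning formally, proving a toggle redundant for all reachable states rather than observing a single trace.

```python
def redundant_writes(trace):
    """Count cycles where a flop was clocked but captured an unchanged value."""
    prev = None
    redundant = 0
    for value in trace:
        if prev is not None and value == prev:
            redundant += 1
        prev = value
    return redundant

# A toy capture trace for one flop: most clock edges store the same value again.
print(redundant_writes([0, 0, 1, 1, 1, 2, 2]))   # -> 4: four of the six
# captures after the first stored an unchanged value and could be gated away
```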




2 comments

DaveKelf says:

Hi Ann. Interesting discussion, but I feel compelled to point out an inaccuracy in the story. The use of sequential checking in equivalence checking has been around for a while, well before Calypto. OneSpin has had this technology for over ten years now and has used it for all kinds of equivalence checking applications. In fairness, the Cadence Jasper guys and even Synopsys have had it for a long time. OneSpin also has a SystemC front-end to its tools. Mentor, I believe, was last to market with this, hardly the first! In these days of Trump I think it’s important to get the facts right! Thanks, Dave

Ann Steffora Mutschler says:

You are absolutely right, Dave. I hadn’t intended to make it sound like Calypto was the only one! Thank you!
