Defining Verification

Verification is the process of systematically comparing two models to locate the differences between them. It always requires two models.
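
To make that definition concrete, here is a minimal, purely illustrative sketch in Python: a specification-level reference model and a more detailed implementation model are driven with the same stimulus, and any differences between them are recorded. The function names and the deliberately planted bug are hypothetical and not taken from any real flow.

```python
import random

def ref_model(a: int, b: int) -> int:
    """Specification-level model: saturating 8-bit add."""
    return min(a + b, 255)

def impl_model(a: int, b: int) -> int:
    """Implementation-level model: 8-bit add that wraps (a planted bug)."""
    return (a + b) & 0xFF

def verify(num_tests: int = 1000) -> list:
    """Drive both models with the same stimulus and record any differences."""
    mismatches = []
    for _ in range(num_tests):
        a, b = random.randint(0, 255), random.randint(0, 255)
        expected, actual = ref_model(a, b), impl_model(a, b)
        if expected != actual:
            mismatches.append((a, b, expected, actual))
    return mismatches

if __name__ == "__main__":
    diffs = verify()
    print(f"{len(diffs)} mismatches found")
    for a, b, exp, act in diffs[:5]:
        print(f"  a={a} b={b}: reference={exp} implementation={act}")
```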


There was a time when the notion of rigorous verification was seen as unnecessary and even wasteful. I can remember working on flight control computers early in my career. We did no functional verification and created no models. We prototyped the hardware and ran some engineering tests on it, primarily to verify the system structurally. We did not test the functionality of the system; that was provided by software. The software team did some amount of testing of their code, but the system soon went off for flight tests. Not only did those tests find problems with the system, they were also the first time that the various sub-systems of the aircraft had been integrated together. We even had to install an illegal reset switch so the test pilots could restart some of the computers before they attempted a landing.

Ironically, back then the industry cared far more about safety, expressed through massive amounts of redundancy, than it did about functional correctness.

This made no sense to me and was one of the reasons why I got into the EDA industry and started to pioneer several aspects of simulation, emulation and acceleration, trace analysis, hardware/software co-verification, and verification languages. Thankfully, the avionics industry has progressed a lot from that time.

Over time, the notion of a verification engineer came into existence. At first, they were seen as less worthy than design engineers and were relegated to manually creating tests. They were the people who broke things, not the people who created things. This began to change with the introduction of constrained random test-pattern generation. Verification engineers had their own sophisticated languages, they developed methodologies and metrics to track progress, and in many respects they advanced the industry more than the design community, which remained wedded to its increasingly outdated languages and methodologies.
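
For readers who have not seen it, the sketch below gives a rough flavour of constrained-random stimulus generation. It uses plain Python rather than a dedicated verification language such as e or SystemVerilog, and every field name, constraint, and address range is a hypothetical example rather than part of any real environment.

```python
import random

def random_packet() -> dict:
    """Generate one bus transaction whose fields satisfy simple constraints."""
    # Constraint: burst lengths are powers of two between 1 and 16.
    burst_len = random.choice([1, 2, 4, 8, 16])
    # Constraint: addresses are word-aligned and avoid a reserved region (hypothetical).
    while True:
        addr = random.randrange(0, 0x1_0000, 4)
        if not (0x8000 <= addr < 0x9000):
            break
    # Constraint: writes are weighted 3:1 over reads.
    kind = random.choices(["write", "read"], weights=[3, 1])[0]
    return {"addr": addr, "burst_len": burst_len, "kind": kind}

if __name__ == "__main__":
    for pkt in (random_packet() for _ in range(5)):
        print(pkt)
```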

I have long been a supporter of the Accellera Portable Stimulus language, even though I would have done anything to get its name changed – and I tried. It has the potential to allow verification to make another huge step into the future and to raise the abstraction even further above design languages. It has learned from many of the mistakes of previous languages.

One very important aspect is that it enables incremental development and refinement. This means you do not need a completed model for it to be useful. A model can start off by including only the most important aspects of a system, staying away from anything that might imply a particular implementation. As architectural decisions are made, additional details or capabilities can be added to the model, none of which invalidates anything that was done with the partial model. Brilliant!
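
As a loose analogy, and not actual Portable Stimulus syntax, the Python sketch below shows the principle: a test written against a partial, implementation-free model keeps working unchanged when a refined model that captures an architectural decision is substituted. All class and method names are hypothetical.

```python
from abc import ABC, abstractmethod

class DataMove(ABC):
    """Partial model: captures only the intent 'move a block of data'."""
    @abstractmethod
    def run(self, size: int) -> str: ...

class AbstractMove(DataMove):
    def run(self, size: int) -> str:
        return f"moved {size} bytes"            # no implementation implied

class DmaMove(DataMove):
    """Later refinement: an architectural decision to use a DMA engine."""
    def run(self, size: int) -> str:
        descriptors = (size + 1023) // 1024     # added detail: 1 KB descriptors
        return f"moved {size} bytes using {descriptors} DMA descriptors"

def scenario(move: DataMove) -> str:
    """A test written against the partial model; it never changes."""
    return move.run(4096)

if __name__ == "__main__":
    print(scenario(AbstractMove()))   # works with the partial model
    print(scenario(DmaMove()))        # still works after refinement
```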

It can also be developed top-down or bottom-up, or a mixture of the two. Transformational! We now have true verification reuse, not only within a project but across projects. This will dramatically improve the efficiency and effectiveness of the verification process. If verification efficiency improves and the time spent on it comes back down, all eyes will return to improving the design flow, which badly needs it.

To my dismay, in a recent roundtable that I conducted at DVCon, the first part of which is published here, several panelists suggested that this new model should perhaps be the one that drives the implementation process, with synthesis tools operating directly on it to produce an implementation. While I do not disagree that this might be possible, I am aghast that many attributes of the language do not target design and would thus be wasted.

But what then of verification? If that model becomes the design model, how is it verified? We appear to be back at the beginning, where it is assumed that the specification has become so high level that we can look at it, comprehend it, and be certain that it contains no errors. Clearly it is time to test fly that model! Even if that were true today, it would enable systems of higher complexity to be considered, and we would very soon arrive back at the point where a systematic verification process becomes necessary.

Perhaps I am just getting too old and clutching onto the ways of the past, but to me verification was going in the right direction and still needs more work. I am not sure I am ready to accept that everything can be correct by construction and that specifications never contain bugs.


