Verification Grows Up

Experts at the Table. Part 2: Experts discuss reusability problems in verification and the steep learning curve facing new engineers.


Semiconductor Engineering sat down with a group of verification experts to see how much progress has been made in solving issues associated with the profession. Panelists included Mike Baird, president of Willamette HDL; Jin Zhang, VP of marketing and customer relations for Oski Technology; and Lauro Rizzatti, a marketing consultant and previously the general manager of EVE-USA. In part one, the experts talked about the changes seen so far and those happening now in the verification process. What follows are excerpts of that conversation.

SE: It has been said that if you change only 15% or 20% of a design, you have to redo all of the verification. This implies that verification is not reusable.

Rizzatti: Time to market has always been an issue, and missing the market by two or three months means that you have lost a lot of revenue. Up to 70% of the design cycle is taken by verification, and you have to do the job.

Baird: We are all using the same technology, and one of the differentiators is getting to market faster. Reuse has been a big issue. You have things like UVM, which came with the big promise of reuse, but it really hasn't lived up to it. We keep trying methodology as an answer, but there is a price you have to pay for all of the reuse capability in verification, and most people are not willing to pay it on their first time to market. Even if they do, there is turnover in engineers and most of the knowledge is lost. Companies that take the time to build block-level reuse into their testbench still question the value if there is turnover, because the new engineers may not understand the legacy code. Practical matters keep shooting it down.

Rizzatti: Is UVM adopted today and in the mainstream? At DVCon India, for the 10th anniversary of UVM, several people stated that UVM is mainstream. I wonder.

Baird: It has been the flavor of the month for a while and the first wave of adoption has happened. But you are dealing with an object-oriented software development environment and getting beyond that first wave is more problematic. Without reuse you do not get the full benefits from it. So while people have adopted it and are using it, I see it being used in a simplistic way. You have something with which you can build everything from a mansion to a shack. There are many more shacks than mansions. You don’t have to be at that level to get the full benefits but you need to be beyond the shack. It is a complex problem.
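To make the "shack versus mansion" contrast concrete, here is a minimal sketch of roughly the simplest legal UVM environment; every name is invented for illustration and is not from the panel. A "mansion" layers reusable agents, sequences, and scoreboards beneath this same skeleton.

```systemverilog
// Roughly the simplest legal UVM environment; all names are invented.
import uvm_pkg::*;
`include "uvm_macros.svh"

class simple_env extends uvm_env;
  `uvm_component_utils(simple_env)
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
endclass

class smoke_test extends uvm_test;
  `uvm_component_utils(smoke_test)
  simple_env env;
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    env = simple_env::type_id::create("env", this);
  endfunction
endclass

module tb_top;
  initial run_test("smoke_test");  // UVM entry point selects the test
endmodule
```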

Zhang: We are perhaps biased because all of our customers are doing leading-edge development. One of the reasons they do is reuse. A formal verification testbench contains a lot of properties and assertions. The next generation may have some changes, but the functionality probably does not change for many of them. You may only have to change a single assertion. This is a lot less effort for maintenance. Formal also allows verification to start early, and that gives them a better chance of hitting their schedule.
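Zhang's reuse argument is easiest to see in assertion code. Below is a minimal sketch with hypothetical signal names: most of these properties would typically carry over to the next generation untouched, while a spec change might require editing only one.

```systemverilog
// Hypothetical FIFO checker; signal names are illustrative only.
module fifo_checks (
  input logic clk, rst_n,
  input logic push, pop, full, empty
);
  // Interface-level properties like these typically survive a redesign.
  a_no_push_when_full : assert property (@(posedge clk) disable iff (!rst_n)
                                         full |-> !push);
  a_no_pop_when_empty : assert property (@(posedge clk) disable iff (!rst_n)
                                         empty |-> !pop);
  // A behavioral detail like this is the kind of single assertion that
  // might need updating in the next generation.
  a_push_clears_empty : assert property (@(posedge clk) disable iff (!rst_n)
                                         $past(push && !pop && !full) |-> !empty);
endmodule
```

Such a checker is usually attached to the RTL with a bind statement, so the same properties serve both formal analysis and simulation.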

Baird: So what is holding people back from doing formal?

Zhang: It is the learning curve. It takes two or three years to become an expert.

Baird: That is a long and steep learning curve.

SE: What is the learning curve for UVM?

Baird: It depends upon your background, but for a design engineer who is moving into verification, there is object-oriented software development, which is a learning curve independent of what you are doing. The other thing is that UVM, and some of the choices that were made as it grew, are complex. They did not take the simple approach. To be powerful you have to have flexibility, but the cost of that is complexity. This relationship always holds. If you want it to be simpler, then it is less flexible and thus less capable. SystemVerilog is of middling complexity, and the UVM class library went for full flexibility and thus the complexity. You can get up the curve quicker if you are not concerned with reuse. To get enough experience, you really have to go through one or two whole projects, and that takes a couple of years.
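One concrete instance of the flexibility Baird describes is the UVM factory: a single override in a test can swap every driver in the testbench for a variant without editing the environment. A hedged sketch, with all class names hypothetical:

```systemverilog
import uvm_pkg::*;
`include "uvm_macros.svh"

class base_driver extends uvm_driver #(uvm_sequence_item);
  `uvm_component_utils(base_driver)
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
endclass

class errinj_driver extends base_driver;  // variant that injects errors
  `uvm_component_utils(errinj_driver)
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
endclass

class errinj_test extends uvm_test;
  `uvm_component_utils(errinj_test)
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    // One line re-targets every base_driver creation in the testbench.
    set_type_override_by_type(base_driver::get_type(),
                              errinj_driver::get_type());
  endfunction
endclass
```

That one-line override is the payoff of the complexity; none of it works until the factory, phasing, and component hierarchy are understood, which is exactly the learning cost described above.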

SE: It sounds as if formal is no longer the more difficult technology?

Zhang: The notions of complexity with formal are based on the past, when the technology was not mature enough and the methodology was not there. Today it is still complex, but it is something that can be learned. It is no harder to learn than UVM. You just have to be willing to work hard and be patient.

Baird: Yes, but there is still the perception that it is some kind of black magic. Simulation is something that has its roots in the '80s, and we know how to do that…

Zhang: We fight that perception all the time.

SE: How does UVM fit with emulation?

Rizzatti: At DVCon there were three or four papers that talked about UVM for emulation. According to one of those presentations, it is possible to build a UVM environment that serves simulation without compromise and can also run in emulation without change.

Baird: I have been involved with Mentor Graphics and their emulation effort with UVM. They have the notions of emulation friendly and emulation ready. Ready means all of the synthesizable pieces have been rounded up and put into the emulator, and the rest runs very abstractly in the simulator. Friendly means the methodology can accommodate it, but I may have to tweak a few things. We teach people to always start with things that are emulation friendly, even if they are not sure they will use emulation.
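One common way this split is realized is a dual-top arrangement: pin-level, synthesizable bus-functional code stays with the design in the emulator, while the class-based UVM layer drives it through a transaction-level task. A simplified sketch under that assumption, with all names hypothetical (real transactor interfaces impose stricter coding rules):

```systemverilog
import uvm_pkg::*;
`include "uvm_macros.svh"

// Synthesizable side: stays with the design inside the emulator.
interface bus_bfm (input logic clk);
  logic        valid;
  logic [31:0] data;
  // Transaction-level entry point; only this call crosses the boundary.
  task automatic send(input logic [31:0] d);
    @(posedge clk);
    valid <= 1'b1;
    data  <= d;
    @(posedge clk);
    valid <= 1'b0;
  endtask
endinterface

// Untimed side: the driver never touches pins directly, so the same
// class code can run against a simulator or an emulation transactor.
class bus_driver extends uvm_driver #(uvm_sequence_item);
  `uvm_component_utils(bus_driver)
  virtual bus_bfm vif;
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
  task run_phase(uvm_phase phase);
    forever begin
      seq_item_port.get_next_item(req);
      vif.send(32'hA5A5_0001);  // placeholder payload for the sketch
      seq_item_port.item_done();
    end
  endtask
endclass
```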

SE: Are tools being designed to help with the creation of UVM testbenches?

Baird: Yes. A lot of UVM is a framework, and there are lots of pieces that sit inside that framework. What can be automated is the creation of the framework, and tools are being developed for this. They layer something on top that allows you to specify what you need, and then they connect the necessary pieces together for you. Then you have to go in manually and put your behavior in.
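The division of labor Baird describes shows up clearly in such generated code: the tool emits the framework, and the engineer fills in the behavior. A hypothetical example of the kind of stub these tools leave behind:

```systemverilog
import uvm_pkg::*;
`include "uvm_macros.svh"

class gen_monitor extends uvm_monitor;
  `uvm_component_utils(gen_monitor)
  // Generated plumbing: ports, construction, hierarchy wiring.
  uvm_analysis_port #(uvm_sequence_item) ap;
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    ap = new("ap", this);
  endfunction
  task run_phase(uvm_phase phase);
    // Hand-written part: sample the interface, build a transaction,
    // then publish it with ap.write(tr).
  endtask
endclass
```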

Zhang: In the formal space, there have been companies that tried to synthesize assertions by mining the simulation database and writing them out automatically. I don't think they have been very successful. There is only so much a tool can do; intelligence is required that cannot be made push-button. People hope that with formal you can just push the button, but many times it doesn't finish and you have to do analysis.

SE: What is the next big thing for verification?

Zhang: Some people think that formal will become the default. There are two types of formal: property checking and model checking. We have had success with property checking and I think we will get there with model checking in the next ten years. Formal is being used more for block level verification and this is beginning to push simulation up to the sub-system level.

Rizzatti: If formal is pushing simulation upward to take care of bigger blocks, and emulation is coming in from above, then simulation gets squeezed even more.

Baird: We tend to think of simulation as RTL, but we have had paradigm shifts from transistors to gates and then to RTL. There have been attempts to shift it higher, but these have not been successful in the broader sense. We have higher-level tools that take care of the DSP space, but they have not spread broadly. Even emulation will not push simulation out, because you can't put abstract descriptions into an emulator. We have been trying to find the next abstraction for 15 years now.



2 comments

Tudor Timi says:

What’s the difference between model checking and property checking? The Wikipedia article says that they’re the same thing: https://en.wikipedia.org/wiki/Model_checking

Brian Bailey says:

I will check with Ms Zhang. It is possible that she meant property checking and theorem proving, but some people also distinguish between liveness and safety checking. If the latter, then a better explanation of the differences would have been beneficial.
