Hybrid Verification: The Only Way Forward

Experts at the table, part 4: Creating verification environments; what will change in 2015; gauging optimism for the future.


Semiconductor Engineering sat down to discuss the state of the industry for functional verification. The inability of RTL simulation to keep up with verification needs is causing rapid change in the industry. Taking part in the discussion are Harry Foster, chief scientist at Mentor Graphics; Janick Bergeron, fellow at Synopsys; Ken Knowlson, principal engineer at Intel; Bernard Murphy, chief technology officer at Atrenta; and Mike Stellfox, fellow at Cadence. In Part One, the panelists discussed the industry's challenges and the emerging verification platforms. Part Two discussed some of the changing needs for verification and why a paradigm shift is required. Part Three examined new verification needs in the area of security and the importance of randomization. What follows are excerpts of that conversation.


SE: Is the creation of verification environments for virtual prototypes going to be more or less difficult than for RTL?

Bergeron: Getting people to step back and create the necessary level of freedom so that the tool can do exploration is difficult. Even to this day, some people have a hard time doing proper constrained-random verification. Some are only now switching away from directed testing, even in 2014.

Knowlson: You can use these for some of your security coverage, because you can look at how the design is supposed to be used and then analyze an attack in certain areas. It doesn't help you with something completely different.

Murphy: Yes, it is a starting point. But you need some kind of formal verification, because when you look at sneak paths from secure zones into non-secure zones, these are not easy to exercise. If you compound that by saying it is not a sneak path but a timing side-channel attack, it gets really funky. You are not going to find it by looking for a path of some kind; it requires long, convoluted statistical analysis. Those are difficult to find.

SE: What is going to change over the course of the next year?

Foster: We have observed a shift toward formal in terms of acceptance by the industry. In fact, it is the second fastest-growing segment. If we remove equivalence checking, it is growing at a compound annual rate of about 32%. Emulation is the only segment higher, at 40%. A lot of this is being driven by the adoption of formal apps, and we will see a continuation of that because apps make formal easy and eliminate the need for an expert. We will continue to see recognition that we need constrained random when developing new IP, but that is not going to scale to the next level. We need new solutions at the SoC and integration level. Emulation is also becoming more cost-effective because it is now a data service.

Bergeron: One year is a very short cycle for a methodology change in our industry. There will not be any significant changes. Things evolve over time.

Knowlson: I see this emulation problem as a scale problem. We are heavily into centralized emulation farms, but even that isn't enough. I would like to see the ability to do some validation or verification on a faster, cheaper platform that can handle the cluster level, where a cluster contains multiple sub-system IPs. It is cost-effective to look in that direction and offload.

Murphy: In terms of mass trends, yes, things move too slowly. I think there may be some hints of coverage metrics for integration and security. These will be hints, not standards.

Stellfox: We have seen a very fast trend toward software-driven verification, and also a trend of looking at verification beyond just the IP level. Most customers are organized in silos, where the virtual-platform people do everything involving platforms and never talk to the emulation team, and the emulation team is somewhat closed off. I see the need for a convergence where people recognize that they can leverage tests and modeling environments from each other, and are thus able to combine them in more effective ways. At the heart of that are software-driven verification and an integration flow that is starting to evolve. By next year, I think we will be hearing more about hybrid emulation and FPGA combinations.

Foster: There will never be a substitute for thinking when we talk about verification.

SE: Are you an optimist or pessimist when it comes to verification?

Foster: In the late '80s, everyone was saying: how are we going to do this? Gate-level simulation isn't scaling…

Bergeron: This industry is getting old. Where are the new people? Where are the new companies? Young people are not interested.

Foster: This is a serious problem. We have a hard time hanging onto the top people, and this is true of many formal companies, because Google wants them. Not for formal, but these are bright algorithm people and they are in demand.