A growing divide between the design team, which is stuck using old technology, and rapidly advancing verification technologies is starting to create problems.
One of the great things about attending DVCon, or any other conference for that matter, is the networking. You get to see so many people who are eager to learn, to talk and to share ideas. When this happens, you tend to hear a lot of statements that have to rattle around in your mind for a while before you can start to make sense of them and see if any coherent themes emerge.
By themes, I am not talking about superficial themes such as UVM was hot this year, or emulation is on a tear. Many of these themes are just an indication of the products that EDA vendors are pushing or a newly released standard that attracts attention within the user community. In many cases, it is something users now have to learn and evaluate to see if it has any place in their methodologies.
Some of the ideas tend to be singularities, while others that at first seem isolated tend to form more complex interconnections and structures. One of the trends that struck me this year is the growing divide between design and verification. A decade ago, you would have been right to think that this was about geography. At that time, it was thought that verification could be pushed off-shore to places with cheap and plentiful labor. That proved to be a rather disastrous trend that has since reverted to some degree.
The divide I am seeing now is a lot more subtle and complex. The clues started during the formal leaders gathering that Semiconductor Engineering was permitted to attend. You can read about that here. The comment made was, “The designers today are overburdened. The designers design.” Even with all of the IP that is now available—and the fraction of the design that is unique becoming very small—one would think that the burden on designers had been lifted somewhat.
While not all designs are the same, the industry is talking about most of the design complexity being in the selection, configuration and integration of IP blocks. There are few tools to aid in this task, but it does not appear to be an area in which the industry is crying out for help, and companies that have tried to tackle this problem have not been highly successful. I will be writing about that topic in a couple of weeks.
At the same time, verification does not get the same benefits from reuse as design does. The verification task has increased significantly, although not as much as Wally Rhines indicated in his keynote, which treated the growth as a simple exponential. Verification demands are increasing rapidly for two reasons. First, we have not yet arrived at the point where IP can be trusted. That means a certain amount of re-verification has to happen for each and every IP block. And second, there is an increasing amount of verification that has to be performed at the system level. This increases exponentially with the amount of shared state space, not the total amount of state space. State that is hidden within an IP block has no impact at the system level and does not contribute to system-level verification complexity.
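A rough way to formalize that last point (the framing here is mine, not the article's): if IP block $i$ exposes $s_i$ bits of state to the rest of the system and hides $h_i$ bits behind its interface, the space that system-level verification must contend with scales as

$$2^{\sum_i s_i} \quad\text{rather than}\quad 2^{\sum_i (s_i + h_i)},$$

so state kept internal to a well-verified block drops out of the system-level problem entirely.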
The second major statement came from a roundtable, the first part of which is published here. Arturo Salz, a scientist within the verification group at Synopsys, said that verification has never dropped a technique. “[Verification] keeps moving and it moves much faster than design technologies. They are still stuck using RTL.” The rate at which new technologies are adopted is very different between the two communities.
That creates a problem because efficient verification often requires an understanding of the design. Knowing about the pipelines and the design architecture can make it a lot easier to target the corner case bugs that may be lurking in the design. Several experts from the formal gathering indicated that designers are less able to help because they cannot master the new technologies being adopted by the verification team.
Assertions are one example. One participant stated that verification engineers are three times more productive than designers in the creation of properties and assertions. Another said that when designers try to do this, only a select few are successful.
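For readers outside the verification community, this is the kind of artifact being discussed. A minimal SystemVerilog Assertions sketch (the signals `clk`, `rst_n`, `req`, and `gnt`, and the timing window, are all hypothetical, chosen only for illustration):

```systemverilog
// Hypothetical handshake check: every request must be granted
// within 1 to 4 clock cycles; the check is suspended during reset.
property p_req_gets_gnt;
  @(posedge clk) disable iff (!rst_n)
    req |-> ##[1:4] gnt;
endproperty

assert property (p_req_gets_gnt)
  else $error("req was not granted within 4 cycles");
```

Writing even a simple temporal property like this requires fluency with implication operators and clocking semantics that many designers, by the accounts above, have not had time to acquire.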
This is a problem, and if the technological divide between design and verification continues to grow, we might as well spread the teams apart geographically. If the two teams cannot effectively talk, then both teams will ultimately suffer.
There is at least one attempt I know of to remedy this. A few people are calling for the designer and the verification engineer to sit together and work on the problems as a team (Buddy Programming). In this way, the verification engineer becomes intimately involved in the design and may even be able to prevent design decisions that would make verification more difficult. However, I have seen few people outside of the software industry attempt this.