Pointing Fingers In Verification

As the verification gap grows due to increased complexity, a panel discussion with verification leaders turns rather raucous.


With most EDA tools, the buying decision comes down to improved quality of results or increased productivity. Will a new synthesis or clock optimization tool enable designers to do more, faster, and are those gains worth the price? The equation is fairly simple.

When it comes to verification tools, things are more complex. You can still make productivity gains, or purchase an additional tool that will find more or different errors, but no tool can ensure bug-free silicon. All any of them can do is reduce the risk of a respin. Can you do too much verification? If it is not the right type of verification, then the effort is wasted, and there are several indications that the verification process in use today is very wasteful.

Janick Bergeron, a Synopsys fellow, likens verification to an insurance policy: “You hope you never have to use it.”

Bill Grundmann, a Xilinx fellow, sees verification as a career for the paranoid. “You never want to throw anything away that found a bug in the past. There is nothing which says you are done. What is the metric for good enough?”

Jim Caravella, vice president of engineering at NXP Semiconductors, takes a more pragmatic view. “Every chip has bugs, but does it really impact the system or the customer?” This sparked a discussion about who created, or is responsible for, the much-discussed verification gap, and not everyone sees the issue in quite the same way.

The notion of a verification gap is that verification complexity increases faster than design complexity as designs grow. It also implies that productivity gains in verification tools and methodologies are not keeping pace with the demands being placed on them. The result is that teams face an increasing risk of chip failure unless they keep pouring more resources into closing the gap.

Caravella reminds us that, “with infinite resources and time, it still does not mean your system is bug free.” Grundmann adds that building larger teams can make the problem worse because “with such large teams there is a problem with information handoff between people.”

Mike Stellfox, a Cadence fellow, agrees and broadens the scope to include software, which must be brought into verification to ensure that use cases are adequately covered. “You need early and often communications between the teams.”

Tools or users?
Is the problem with the tools, or with how they are used? The discussion soon turned to pointing the finger at the users. Bergeron claims “we have virtual prototypes but the industry does not have the discipline to use them.” Harry Foster, chief scientist for verification at Mentor Graphics, believes that “we rely on superstars too much.” These are the few people within each group who manage to perform magic and avert crises.

The lack of a single verification tool adds to verification complexity. Stellfox says that “successful companies manage to integrate many flows together.” Bergeron adds that “you have to pick the right tools at the right time. There is no one right tool for all tasks. Coverage is the thing that brings them all together.”

Grundmann does not see tools as the total solution. “People are too willing to jump to a tool rather than do a full assessment of the situation.” Stellfox adds that capturing a plan upfront is the key to being able to successfully adopt tools and that many people use existing tools incorrectly. “Many people adopt UVM and then do directed testing,” he says.
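As a purely illustrative sketch, not from the panel and using invented class names, the pattern Stellfox describes can be shown in SystemVerilog. The first sequence below uses the constrained-random machinery UVM is built around; the second hard-codes every field, which is directed testing dressed in UVM infrastructure.

// Hypothetical sketch: both sequences drive the same (invented) bus_item
// transaction, but only the first exercises UVM's constrained-random flow.
import uvm_pkg::*;
`include "uvm_macros.svh"

class bus_item extends uvm_sequence_item;
  rand bit [31:0] addr;
  rand bit [31:0] data;
  constraint legal_addr { addr inside {[32'h0:32'hFFFF]}; }
  `uvm_object_utils(bus_item)
  function new(string name = "bus_item");
    super.new(name);
  endfunction
endclass

// Constrained-random stimulus: the solver explores the legal space and
// functional coverage tells you when you are done.
class random_bus_seq extends uvm_sequence #(bus_item);
  `uvm_object_utils(random_bus_seq)
  function new(string name = "random_bus_seq");
    super.new(name);
  endfunction
  task body();
    repeat (100) begin
      req = bus_item::type_id::create("req");
      start_item(req);
      if (!req.randomize()) `uvm_error("RAND", "randomization failed")
      finish_item(req);
    end
  endtask
endclass

// Directed testing dressed up in UVM: every field is fixed by hand.
class directed_bus_seq extends uvm_sequence #(bus_item);
  `uvm_object_utils(directed_bus_seq)
  function new(string name = "directed_bus_seq");
    super.new(name);
  endfunction
  task body();
    req = bus_item::type_id::create("req");
    start_item(req);
    req.addr = 32'h0010;       // hard-coded stimulus
    req.data = 32'hDEAD_BEEF;
    finish_item(req);
  endtask
endclass

The second sequence still compiles and runs in a UVM testbench, but it forfeits the randomization and coverage closure that justify the methodology's overhead in the first place.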

Caravella accuses the industry of artificially creating a gap and then finding a solution to fill it. Other users were quick to lash back at the EDA industry. A Qualcomm representative in the audience asked why there wasn’t a good effort being directed towards SoC verification. Stellfox responded, “UVM was targeted at IP and bottom-up verification. In today’s SoCs there is not a lot of new IP and we need new flows that are optimized around these.”

Bergeron points out that “a standard shows that we are getting maturity. It is too early for SoC.” The Qualcomm representative strikes back, claiming that the SoC verification problem has existed for a long time and that EDA companies have been slow to address it. Foster responds, “we acknowledge the need and we are working on it.”

Mentor recently proposed a new standardization effort for graph-based verification aimed at SoC verification. Stellfox agrees that the area is important and says that Cadence has a significant focus on developing tools, flows and methodologies for it.

John Swan, an SoC technologist, asked how we can shift more verification earlier in the flow, for example by using virtual prototypes. While the panelists said that leading-edge customers are doing this today, they had little insight into how to move it into the mainstream. Foster believes high-level synthesis (HLS) may be the key, while Bergeron counters that RTL design is not painful enough to make people look at this solution.

Caravella brought a sense of clarity back to the discussion, saying, “It comes down to ROI. If the gap is not big enough then solutions will not emerge.” Bergeron, still thinking in terms of moving the design abstraction up, says “making a change is a shock to the system. The last time was painful and we had to sacrifice a generation of designers.” He was referring to the migration from the gate level to RTL.

Another Qualcomm representative said “EDA created the gap. We needed a horse and got a camel. How will you prevent it from happening today? UVM was available 10 years ago and you have not made much progress.”

“Yes, there was a view that SystemVerilog was the answer to everything and we spent too much time and effort on the standards effort,” said Stellfox. “But this was driven by customers. Standards provide interoperability but they also stop innovation. Standards make things 100 times slower. We need more innovation in the SoC space rather than a standards play.”

Caravella piled it on, adding, “EDA guys should listen to customers and not their marketing teams. Stop recreating three-letter acronyms.”

Will the industry learn from its mistakes or will the development and standardization of solutions that are focused on SoC verification suffer from the same problems? The only way to ensure that requirements are met is for all of the stakeholders to participate in the standards meetings. Without this first step, the industry may again go off course and everyone will pay the price.



Comments

Chip Design Grunt says:

The solution is simple…as are all solutions to any problem. People inherently create complexity where none existed before, then point to automation as a fix that never really becomes the magic bullet it is perceived to be by many in the industry. Chip design and tools have increased in complexity as process nodes have gotten smaller and features and transistor counts have gotten bigger, yet the overall methodology and process of designing a chip hasn’t changed a lot. So why not go back to the basics, see what can be gleaned from there, and stop relying on the “push button, build chip!” mentality? Remember that tools are helpers for accomplishing tasks; they don’t come up with the solution and implement the fix!

I have found, if you have a sound methodology and a group that understands and commits to the final outcome, then you tend to run into fewer issues.

John Swan says:

I agree with David Black’s comment. Design and code reviews I have been involved with have been very helpful for the team. It takes commitment and some time, but if well implemented the ROI is worth it. I have suggested performing reviews to various design managers I have met; the reaction has been “a good idea,” but it was something they weren’t doing.

David Black says:

I find it amusing that many of these companies are in such a rush to create product that they skip some key fundamentals. For instance, honest, thorough design reviews (before, during and after) are frequently missing or given a very brief nod. This despite evidence that thorough design reviews often reveal bugs before they even enter the design. I’m referring to reviews of the specifications as well as honest code reviews. This problem exists throughout the industry in both hardware and software teams.

To add to the problem, we accept IP into our designs that itself lacks thoroughness of design reviews.

Hank Walker says:

If one views the hardware description as a program, then the team should have “code” reviews in the same way that software development teams should use code reviews. It has been known for decades that the best way to find bugs is to have someone else read your code, thus the idea of pair programming, as used in Extreme Programming.

Paul Marriott says:

Several important points were missed in this discussion.

1) The Gap is essential to progress – without a gap, we have reached stasis.
2) The size of The Gap is predicated on Moore’s Law. That the rate of Moore’s Law has remained constant for so long is because the inevitable gap has created the progress required to keep Moore’s Law in force. We design tomorrow’s chips on today’s machines. If there wasn’t a gap, then tomorrow’s chips would be the same as today’s.
3) Every methodology we’ve used so far has enabled the progress rate we’ve seen.
4) EDA companies are no different than any other in that they have to create fear, uncertainty and doubt in order to sell their products. If we didn’t fear progress stalling, then it probably would. But that fear also creates innovation and maintains the inevitable gap.
5) The panelists all agreed it was impossible to design bug-free chips. Yet every one of them present had flown on airplanes that had been designed with processes to ensure there are no fatal bugs, and every one of the panelists trusted those planes. Consumer products have bugs because the lifecycle is so short no one cares. It’s not worth the cost to make them bug-free.

Hemendra Talesara says:

Well said, Paul. Gaps make it all worthwhile. This is where we are all adding value, whether one is creating a new product in the market or just doing verification.

If you don’t live in the gap you don’t live at all.
May the “gap” be with you.

🙂

Brian Bailey says:

I agree with everything you said, Paul, but would like to add a different perspective, or at least pose a question. If the verification gap were narrowed, would it allow bigger and better chips to be produced? The size of the gap may be the current limiter on the system, and by coming up with better verification tools we could boost the whole industry. That may then open up other gaps, but we have been talking about the verification gap for so long and nothing seems to be happening to reduce its size.

The size of chips is determined purely by physical yield, I think, not by the complexity of the design itself. As to what drives feature sizes, that, of course, is partly determined by Moore’s Law, since the processing required to make the masks (and all the fancy transforms required to reverse the effects of diffraction) is based on current-generation compute capabilities. I still think the rate of Moore’s Law is driven by the gap size. I suppose this is another formulation of the Strong Anthropic Principle.

I think there are actually bigger gaps in software, since it is still difficult to exploit computational parallelism except for a rather narrow class of problems: RTL simulation is a tough nut to crack, as witnessed by the fact that true multi-threaded simulators are not on the market (except in the coarse-grained case of waveform dumping running on one thread, the GUI in another, and the simulation kernel in yet another).

