Verification Facing Unique Inflection Point

Joining the dots between DVCon’s keynote by Wally Rhines, a conversation with Accellera’s Shishpal Rawat and a Cadence lunch panel.

The Design and Verification Conference and Exhibition (DVCon) attracted more than 1,100 people to San Jose last week, slightly fewer than last year. While much of the focus, and most of the glory, goes to design within semiconductor companies, it is verification where most of the advances are happening, and thus where DVCon concentrates its attention. The rate of change in verification, and the productivity improvements required, are increasing at a faster pace than ever before.

Design productivity has seen a huge boost from reuse, which enables large blocks of functionality to be incorporated into a design with little need to understand what those blocks contain. The number of blocks in a design continues to grow, and the proportion of chip area filled by IP keeps increasing. The design challenge often becomes the selection, configuration and integration of IP, plus a small quantity of new and unique content that provides the differentiation.

There is only one problem with this methodology: verification does not get the same benefits from reuse. While it should not be necessary to re-verify the IP blocks, those blocks add state. In many cases they add shared state, which means the verification problem grows multiplicatively rather than additively. To add to the complexity, many of the new verification problems do not directly relate to functionality, but include aspects such as performance, power, safety and security – none of which can be verified at the block level. It is the increase in system-level verification challenges that is driving the industry to come up with a growing number of technical solutions, and this was the focus of DVCon 2016.
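To get a feel for the scale involved, consider a back-of-the-envelope sketch. The per-block state counts below are purely hypothetical, but the arithmetic is the point: blocks that operate independently add their state counts, while blocks that share state multiply them.

/* Back-of-the-envelope sketch; the per-block state counts are invented.
 * If integrated blocks share state, the reachable system state space
 * approaches the product of the per-block counts, not the sum. */
#include <stdio.h>

int main(void) {
    double block_states[] = {1e3, 1e4, 1e2, 1e5};   /* hypothetical */
    int n = sizeof(block_states) / sizeof(block_states[0]);

    double sum = 0.0, product = 1.0;
    for (int i = 0; i < n; i++) {
        sum += block_states[i];      /* cost if blocks were independent */
        product *= block_states[i];  /* cost when state is shared       */
    }

    printf("independent blocks (sum):  %.0f states\n", sum);
    printf("coupled blocks (product):  %.0f states\n", product);
    return 0;
}

Four modest blocks yield roughly 10^5 states to cover if they are independent, and 10^14 if they are fully coupled – which is why reused, pre-verified IP still leaves a system-level verification problem.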

Keynote sets the context
Wally Rhines, chairman and CEO of Mentor Graphics, gave the keynote this year, titled Design Verification Challenges: Past, Present and Future. Rhines discussed design productivity growth. He said there would be no end to the decrease in cost per transistor and that EDA would continue to track the growth of the larger semiconductor industry.

“If we look at the growth of design engineers, it has been about 3.7% per year compounded in recent years,” he said. “What has seen rapid growth is the number of people who identify as verification engineers, which has grown almost 13%.”

This should set everyone’s alarm bells ringing as an indicator that something is very wrong with the verification methodology in place today.

[Figure 1]

Rhines discussed the reasons why this growth had occurred, primarily focusing on the increasing state space of designs. He then dived into the history of the verification industry starting in 1973 with the introduction of CANCER and SPICE. The next phase of development started in the early 1980s when language-driven design started to emerge.

Rhines talked about the verification methodologies that grew out of this, but this is basically where all of the verification productivity problems started. SystemVerilog and UVM have become the methods of choice for the industry, and they have brought about above-average growth in verification tool revenues. Given Rhines’ earlier comments, that trajectory is unsustainable.

As the industry moves into the system era, Rhines noted, a variety of other issues come into play. “We have gone beyond functional verification into other domains. Clock domains, performance, power and more recently safety, security and the usage of application software as the test stimulus for verification.”

“At the top level, the greatest opportunities exist,” he said, as he looked at some of the technical advances that have been made in verification engines. As an example, he discussed power and the small improvements that are possible after you have a layout. That compares to an order of magnitude improvement when changes are considered at the system level.

The last part of the talk focused on the future and the way in which verification silos will come together. “What we need is an environment where the verification process is extracted from the underlying verification engines,” said Rhines. “That is, you separate the what from the how.” He introduced the work going on within the Portable Stimulus Working Group (PSWG) of Accellera.

Standards progress
Within most industries, standards develop to address issues such as safety or security. They are created when the need arises, usually because something bad has happened one too many times. But within EDA, and in some technology areas such as the development of interfaces, a new technology will not fully develop until the standards are in place, so standards are developed proactively within committees. Most EDA standards are developed within Accellera, and when they mature they are handed over to the IEEE for ratification and continued development. Accellera continues to make the standards it originated available at no cost to help ensure quick proliferation.

[Figure 2: Accellera standards handed to the IEEE – IEEE 1666 (SystemC), IEEE 1685 (IP-XACT), IEEE 1800 (SystemVerilog/UVM), IEEE 1801 (UPF)]

Shishpal Rawat is the chairman of the board at Accellera as well as holding down a full-time job at Intel. “Accellera’s roadmap is interesting,” says Rawat. “Usually customers want a particular standard, and so requirements are gathered and then something is put in place. We start with a proposed working group that introduces the need for a standard in a particular area. Once it gets board approval a working group is developed. That is what we did with the PSWG.”

The name of this standard is deceptive. The standard is likely to define a way to express the possible dataflow paths through a chip, such that application-level scenarios can be defined. Tools can then work on this description to generate software that runs on the processors within the chip to verify functionality. Because the output is software, it is portable: the same “test” could run on any execution engine, including a virtual prototype, simulator, emulator, or even the final manufactured chip. But what is generated is much more than just stimulus. In addition, the dataflow graph implicitly contains a model.
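No language has been defined yet, so any concrete syntax is speculation, but a rough sketch conveys the idea. In the hypothetical C test below – every register name is invented for illustration – the only platform dependency is a thin read/write layer. Swap that layer for one that talks to a simulator, an emulator, or real silicon, and the same generated test retargets without change.

/* Hypothetical sketch only; the PSWG had not defined a language when this
 * was written. A tool walks one legal path through the scenario dataflow
 * graph and emits a C test whose only platform dependency is a thin
 * register read/write layer. Here that layer is stubbed with an array so
 * the sketch is self-contained and runnable. */
#include <stdint.h>
#include <stdio.h>

/* Platform layer: replace with simulator/emulator/silicon access. */
static uint32_t regs[256];
static void     reg_write(uint32_t a, uint32_t v) { regs[a] = v; }
static uint32_t reg_read(uint32_t a)              { return regs[a]; }

/* Invented register map for a DMA-feeds-codec scenario. */
#define DMA_SRC   0x10
#define DMA_DST   0x11
#define DMA_GO    0x12
#define CODEC_IN  0x20

int main(void) {
    reg_write(DMA_SRC, 0x8000);               /* action: configure DMA  */
    reg_write(DMA_DST, 0x9000);
    reg_write(DMA_GO,  1);                    /* action: start transfer */

    /* The graph encodes the legal dataflow: the codec must consume the
     * buffer the DMA produced, so the generator wires in that link.    */
    reg_write(CODEC_IN, reg_read(DMA_DST));

    /* check: the graph doubles as a model of what the codec should see */
    if (reg_read(CODEC_IN) != 0x9000) {
        printf("FAIL: codec fed from the wrong buffer\n");
        return 1;
    }
    printf("PASS\n");
    return 0;
}

The point is not the toy register writes; it is that the scenario description, not the engine, becomes the reusable asset.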

“There is a lot of energy going into Portable Stimulus,” says Rawat. “We want to have a mechanism to generate stimulus in a standard way, so that no matter what stage of the design you are in these generators can get you into the tool of your choice. It is portability across tools and across abstractions. It has become a massive effort that they are trying to manage by phasing it.”

[Figure 3]

The goal is to decide on an approach by May 2016 and to have a first draft by the beginning of 2017.

Industry discussion
With multiple proposals on the table for consideration as the basis of the first draft, the proponents are finding ways to promote their approaches so that the industry can weigh in with its views. One such event was a lunch panel sponsored by Cadence. The panel was moderated by Frank Schirrmeister, group director for product marketing at Cadence, and the panelists were Sharon Rosenberg, senior solutions architect at Cadence; Tom Fitzpatrick, vice chair of the Accellera Portable Stimulus Specification Working Group and a verification technologist at Mentor Graphics; Alex Starr, an AMD fellow; and Karthick Gururaj, principal architect at Vayavya.

Schirrmeister talked about the amount of reuse that should be possible in a verification flow and the increasing levels of interaction between various components of the system, including the many layers of software. “There are three levels of reuse—horizontal reuse across execution engines, vertical reuse from IP through sub-system to chip, and then across use cases.” During the course of the panel, Schirrmeister was made aware of a fourth axis of reuse: porting use cases across implementation architectures, such as migrating from a single-core to a multi-core architecture.

Mentor and Cadence are combining their experience in this area to define one of the options that Accellera is considering. The other option is coming from Breker. Fitzpatrick noted that “there are a number of solutions being created in the industry and everyone has their own way of dealing with the problem. Standardization can help.”

Starr noted that coverage is important. “This is not traditional functional coverage. It is more about system-level coverage. We do a lot of system-level workloads, but there are challenges such as production software only becoming available toward the end of the schedule. We need a solution where we can get close-to-the-metal and test out all of the interactions between hardware and software early in the cycle.”
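What might such system-level coverage look like? One plausible model – sketched below with invented dimensions, and not a description of AMD’s actual methodology – treats coverage as a cross product of scenario parameters rather than as signal-level coverage points.

/* Illustrative sketch with invented dimensions; not AMD's methodology.
 * System-level coverage here means tracking which combinations of
 * scenario parameters (initiator, target, power state) a workload or
 * generated test has actually exercised. */
#include <stdio.h>

enum { N_INIT = 3, N_TGT = 4, N_PWR = 2 };    /* hypothetical dimensions */
static int hit[N_INIT][N_TGT][N_PWR];

static void record(int init, int tgt, int pwr) { hit[init][tgt][pwr] = 1; }

int main(void) {
    /* Each test reports the scenario combinations it touched. */
    record(0, 1, 0);
    record(2, 3, 1);
    record(1, 1, 0);

    int covered = 0, total = N_INIT * N_TGT * N_PWR;
    for (int i = 0; i < N_INIT; i++)
        for (int t = 0; t < N_TGT; t++)
            for (int p = 0; p < N_PWR; p++)
                covered += hit[i][t][p];

    printf("system scenario coverage: %d/%d (%.1f%%)\n",
           covered, total, 100.0 * covered / total);
    return 0;
}

The holes in such a matrix are exactly the hardware/software interactions Starr wants to reach before production software arrives.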

Schirrmeister prefers calling it software-driven verification. This is a significantly better description, although it, too, falls short of the full value that such a specification can provide.

“When you have processors in your system, you may as well take advantage of them and use them to exercise the system,” says Fitzpatrick. “Having a way where everyone can talk about the same thing and specifying it in a way that can be re-used and shared is critical.”

Conclusion
This is the first time that a verification solution has actually been engineered from the ground up, intended to solve the growing problems of system-level verification. Solutions in the past have been Band-Aids placed on top of Band-Aids, and those solutions have wasted a lot of time and money.

If you care about bringing verification time and costs under control, then this is an effort that you must get involved with. Look past the portable stimulus name, and think of this as the foundation on which the next 20 years of verification will be based. That is how important it is that industry gets involved now. It is like elections—don’t complain about the results if you didn’t vote.



