
Getting A Standard Right The First Time

Is the Accellera Portable Stimulus Standard ready to be released? What are the pros and cons?


The development of standards is a tricky balancing act, especially when venturing into nascent areas. The Portable Stimulus Standard (PSS), being developed within Accellera, is one of those.

This could be the most important standard since Verilog and VHDL. And if there ever was something that deserved the title of disruptive, this is it.

It is the first standard that raises the abstraction of the verification process and, at the same time, redefines what verification should be. We have existed, and admittedly done quite well, with a verification methodology built on a very shaky foundation: one that concentrated on stimulus rather than checking, and on closure based on observation rather than intent.

PSS defines a model of intent. It is the closest thing we’ve ever had to a requirements document for a system under development. From that model, self-checking test cases can be generated by tools targeting anything from a virtual prototype, through simulation, emulation, and FPGA prototypes, to actual silicon. It also brings the notions of hierarchy and reuse into the verification world, so that models can be developed once and used in larger systems without modification, just as they are in the design flow.
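To make the idea of an intent model a little more concrete, here is a minimal conceptual sketch in plain Python. It is not PSS syntax, and every name in it (Action, IntentModel, generate_test) is purely illustrative; it simply shows how a declarative description of actions, and of the data they produce and consume, can be turned into a legal schedule and then into a self-checking sequence of steps.

```python
# Conceptual sketch only -- not PSS syntax. Illustrates how a declarative
# "intent model" (actions plus their data dependencies) can drive generation
# of a self-checking test, independent of any particular execution platform.

import random


class Action:
    def __init__(self, name, produces=None, consumes=None):
        self.name = name
        self.produces = produces or set()   # data objects this action creates
        self.consumes = consumes or set()   # data objects this action requires


class IntentModel:
    def __init__(self, actions):
        self.actions = actions

    def legal_schedules(self):
        """Yield every ordering in which each consumed object was produced earlier."""
        def extend(schedule, available, remaining):
            if not remaining:
                yield list(schedule)
                return
            for a in list(remaining):
                if a.consumes <= available:
                    yield from extend(schedule + [a],
                                      available | a.produces,
                                      [r for r in remaining if r is not a])
        yield from extend([], set(), list(self.actions))


def generate_test(model, seed=0):
    """Pick one legal schedule and emit a self-checking test as a list of steps."""
    random.seed(seed)
    schedule = random.choice(list(model.legal_schedules()))
    steps = []
    for act in schedule:
        steps.append(f"run {act.name}")
        for obj in act.produces:
            # Checks are derived from the declared intent, not from the stimulus.
            steps.append(f"check {obj} is valid")
    return steps


if __name__ == "__main__":
    dma_write = Action("dma_write", produces={"buffer"})
    crc_calc = Action("crc_calc", consumes={"buffer"}, produces={"crc"})
    readback = Action("readback", consumes={"buffer", "crc"})
    for step in generate_test(IntentModel([dma_write, crc_calc, readback]), seed=1):
        print(step)
```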

It will bring the equivalent of synthesis to the verification flow. Just think of the impact that RTL synthesis had on the hardware design flow; that is the kind of impact PSS will have on verification. Constrained random was not synthesis; it was the deployment of monkeys to write Shakespeare, guided by models produced by verification engineers. In the early days it increased their productivity and helped to provide better quality, but those gains have diminished as design complexity has increased.

But it is much more than that. PSS brings dynamic verification and formal verification closer together than they have ever been in the past. A roundtable at DVCon brought both sides together for some of their first discussions. It will transform areas such as coverage and will have far-reaching implications for many aspects of the development flow. Just one example was highlighted in an article that looks at prototyping.

These are areas that the committee has not yet looked into. It has concentrated on simulation and emulation, the two areas that would have the biggest impact today and would provide the fastest return on investment.

It is a reasonable argument that a standard should be developed and released as quickly as possible, because vendors are limited by their tools being based on proprietary languages. The user community wants to see the standard in place before they are willing to invest much time and effort in it.

That creates a dilemma. Without a standard in place, adoption will be scant, and without sufficient adoption, the potential range of tools will not be fully developed. But what if the current incarnation of the standard limits its eventual capabilities? An industry often gets only one chance, and after that enough inertia builds up that change becomes twice as difficult.

One user, Mark Glasser, principal engineer at NVIDIA, recently wrote a blog titled “Can Portable Stimulus be saved?” In it, he outlines some of the potential problems he sees that could limit the usefulness of the standard. Other users have voiced similar concerns. But few companies have experts with the requisite time and knowledge to help guide the development of these standards.

The Accellera committee has wisely extended the open review period for the Early Adopter version of the standard. Comments are now due by the end of October, but this still creates a challenge for users. No tools exist today that would allow them to actually try anything out. The only experience available comes from proprietary languages targeting a couple of use models, which is a tiny fraction of the standard’s eventual scope.

The industry needs its luminaries to get involved in this standard, and time is running out. If the right standard is not put in place, the whole industry will pay the price over the next couple of decades. We don’t get opportunities like this very often. Let’s not waste this one.



3 comments

Kev says:

I’ve avoided this standard completely. SystemVerilog, Verilog-AMS and VHDL are all “portable” standards for describing hardware-related stuff, so it’s hard to see why you would want another.

If people have money to burn on committees, they would be better off spending it on getting an open-source version of the HDLs, so we can get down to a single universal HDL that is truly portable.

Brian Bailey says:

Hi Kev. Those standards raised the abstraction for describing hardware, but we have never had a language that effectively did the same for the testbench. Verification is becoming increasingly inefficient, and something needs to happen to change that. Hopefully this standard will put verification back on the right path.

Kev says:

UVM would seem to cover it, except SV doesn’t do AMS. There’s also another effort at IEEE for analog test – https://ieee-sa.imeetcentral.com/p/ZgAAAAAAjSk_cwAAAAAACOiZ

Design and verification would be a lot more efficient if we ditched RTL and worked at the asynchronous-FSM level, but support for that got blocked in the SV committees, the same as it gets blocked for AMS.

As far as I can tell, the guys that do SV are happier if the standards effort is somewhere else that they can ignore. Most of the UPF stuff should have been done under an SV-AMS umbrella, but it wasn’t, and consequently it’s an ugly tack-on that doesn’t really work properly.

It’s best to have stuff working and drop it on Accellera or the IEEE as a fait accompli; otherwise it will probably get screwed up (and be unusable).
