What’s Next For UVM?

The ‘U’ in UVM was meant to stand for ‘Universal,’ but the notion of universality needs to be updated if the methodology is to stay relevant.

The infrastructure for much of the chip verification being done today is looking dated and limited in scope. Design has migrated to new methodologies, standards, and tools introduced to deal with heterogeneous integration, more customization, and increased complexity.

Verification methodologies started appearing soon after the release of SystemVerilog. Initially, each vendor's methodology was intended to steer customers toward the subset of the language that its tools supported. To entice users, the class libraries provided some pre-built pieces of the testbench. Over time, the class libraries became larger and more complete, and at the same time the vendors managed to support all of the SystemVerilog language. That left the industry with a multitude of class libraries and associated methodologies – URM, RVM, VMM, AVM, eRM, OVM and, lastly, the amalgamation of all of them: UVM, the Universal Verification Methodology.

But UVM is far from universal. First, all of the methodologies were conceived when all verification was performed using a simulator. While emulation was beginning to make inroads, it was seen only as a faster simulator with limitations. All of the other verification engines now in common usage, including formal verification, were seen as fringe tools.

In addition, systems have changed significantly over the past decade. These verification methodologies were designed to verify a block that was expected to be bolted onto a processor sub-system. They were not intended to verify the processor sub-system itself, only the additional logic. Today, the processor is so entwined with the sub-system, and a system is composed of so many such sub-systems, that UVM is not equipped to handle the verification task for an entire SoC.

At the other end of the spectrum, mixed-signal design is becoming more prevalent, and the analog portion of the system has been held back by older languages based on VHDL and Verilog. Only now are analog teams beginning to look seriously at SystemVerilog, and with that comes the problem that none of the verification methodologies gave serious thought to how to verify the analog portions of a design.

It may be time to rethink UVM, and that is becoming an active discussion within the community. At the Design Automation Conference this year, Accellera, the standards body that has been the driving force behind SystemVerilog and UVM, held a breakfast panel asking where UVM will be in five years. Those ideas and user interactions are included here, along with thoughts from other industry leaders.

A dose of reality
Sometimes things just get too complex. “SystemVerilog is huge, and so multiple libraries and methodologies emerged to provide guidelines and guardrails on how to use it effectively,” points out Tom Anderson, vice president of marketing for Breker. “The result has been adoption of UVM by many verification teams who previously had been unable or unwilling to embrace SystemVerilog and its advanced technologies.”

Warren Stapleton, senior fellow at AMD, said the discussion is only happening because of UVM's success. “We no longer have to have discussions about how people adopt it or what is wrong with it. It is the result of a lot of collaboration, and engineers in companies such as ours have worked at a very deep level to get it to where it is. Now we need to look at how we make it more universal.”

Making UVM more universal will come with a lot of new challenges. “We did a good job of bringing a methodology to a global verification audience that supported fundamental constrained random algorithms,” says Dennis Brophy, director of strategic business development at Mentor Graphics. “It stopped there. We have learned that systems turn into systems of systems and get larger and more complex, and it may be that there is a breaking point at which underpinning all of your verification with constrained random techniques is not going to be sufficient.”
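To make that concrete, below is a minimal sketch of the constrained-random style UVM is built around. The transaction fields, constraints, and value ranges are purely illustrative, and the code assumes the usual import of uvm_pkg and inclusion of uvm_macros.svh.

// Hypothetical bus transaction; every field and constraint is illustrative only.
class packet_item extends uvm_sequence_item;
  rand bit [31:0] addr;
  rand bit [7:0]  payload[];

  // The "constrained" part: the solver only picks values that satisfy these rules.
  constraint legal_addr_c  { addr inside {[32'h0000_1000 : 32'h0000_1FFF]}; }
  constraint payload_len_c { payload.size() inside {[1:64]}; }

  `uvm_object_utils_begin(packet_item)
    `uvm_field_int(addr, UVM_ALL_ON)
    `uvm_field_array_int(payload, UVM_ALL_ON)
  `uvm_object_utils_end

  function new(string name = "packet_item");
    super.new(name);
  endfunction
endclass

// The "random" part: a sequence randomizes items in a loop and sends them to a driver.
class packet_seq extends uvm_sequence #(packet_item);
  `uvm_object_utils(packet_seq)

  function new(string name = "packet_seq");
    super.new(name);
  endfunction

  task body();
    packet_item item;
    repeat (20) begin
      item = packet_item::type_id::create("item");
      start_item(item);
      if (!item.randomize())
        `uvm_error("SEQ", "randomization failed")
      finish_item(item);
    end
  endtask
endclass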

Brophy points out that UVM has seen the fastest adoption rate of any EDA standard, taking only about five years to reach significant market penetration. Harry Foster, chief verification scientist at Mentor, who manages the Wilson verification studies, says that puts UVM at about 75% adoption, although the trend appears to be flattening.

It also has had a huge impact on verification in general. “UVM has been a catalyst for raising the level of discussion and has provided the tools to enable good engineers to do verification well,” says Jonathon Bromley, verification consultant at Verilab. “It has encouraged them to raise their game.”

Changing landscape
While verification technologies have been seen as advancing faster than design, verification is still struggling to keep up. “There are some limitations as we start looking across platforms and writing stimulus for post-silicon, emulation, FPGA prototyping, and virtual platforms as well as simulation,” says Faris Khundakjie, senior technical lead at Intel and chair of the Accellera Portable Stimulus Working Group (PSWG). “That is where we start to see challenges. Admittedly, UVM was created before these platforms existed, and UVM is centered around simulation.”

Anderson points to another weakness felt by many in the industry. “UVM defines verification components that are reusable between projects, but provides little help for how to combine testbenches from IP blocks into testbenches for subsystems and systems. A lot of rewriting is required today. UVM defines only how to handle the design’s inputs and outputs. It does not provide any guidance on how to deal with embedded code running on the processors, or how to synchronize that code with testbenches.”

Stapleton is in full agreement with this limitation. “While UVM is fantastic for IP-level verification and has some built-in features that allow it to be aggregated up into larger environments, it starts to lose steam when you look at true system-wide verification efforts.”
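A hedged sketch of where such efforts usually start is shown below: an existing block-level environment (blk_env, a hypothetical name) instantiated inside a subsystem environment. The structural reuse itself is straightforward; the rewriting Anderson and Stapleton describe comes from everything this sketch glosses over, such as re-plumbing virtual interfaces, register models, and system-level sequences.

class subsys_env extends uvm_env;
  `uvm_component_utils(subsys_env)

  blk_env m_blk0;  // block-level environments, reused structurally as-is
  blk_env m_blk1;

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    m_blk0 = blk_env::type_id::create("m_blk0", this);
    m_blk1 = blk_env::type_id::create("m_blk1", this);
    // Interfaces that are now internal to the subsystem are driven by the design itself,
    // so the corresponding agents are switched to passive monitoring. This assumes the
    // block agents honor the standard "is_active" setting, as uvm_agent does.
    uvm_config_db#(uvm_bitstream_t)::set(this, "m_blk1.*", "is_active", UVM_PASSIVE);
  endfunction
endclass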

IEEE
Part of the reason for the slowdown in UVM development is one big distraction. “We spent six or seven years working within Accellera and finally, as of last year, went to the IEEE level and got IEEE P1800.2 formed,” says Tom Alsop, principal engineer at Intel. “It was a lot of work to get that started. Prior to the contribution of that spec, we also spent a lot of time making it more ready for industry adoption.”

The importance of IEEE has to do with international adoption. “An example is Japan, where they follow a lot of IEC standards,” says Brophy. “The move to the IEEE will boost its global adoption and support. Further, it will drive stability to promote greater interoperability.”

The challenge for a standards group is people power. “It is the same people who are working to close on the requirements for IEEE standardization who are also working in the Accellera UVM committee to push the technology forward,” says Adam Sherer, group director of marketing at Cadence. “So UVM is the same as other standards, where a few active individuals drive the standard and there is a limit to the amount of work they can do. Right now their focus is the IEEE.”

As part of the IEEE standardization, compatibility with other standards is also being considered. “We also came in line with existing IEEE standards and looked at IP-XACT, TLM and existing standards to see how we could better align with those efforts,” says Alsop.

Sherer points out other advantages of further standardization. “For Cadence that means consistency in tools, and it enables performance optimization around a known library. It helps us to help our customers with education, training, building tools, etc. As a product manager there is something that says don’t do anything more, but I know that chips are getting more complex and will involve new languages, new engines, and increased amounts of mixed-signal content. These are necessary to move into industries that have not yet adopted this type of methodology, including mission-critical spaces such as medical, aerospace, and automotive.”

An emerging set of needs
Perhaps the big question for the industry is whether UVM should be extended, or if alternative standards should complement UVM. “Over the next few years we will see some recognition that portable stimulus is really about system-level verification and UVM is about block-level verification,” says Mark Glasser, principal verification engineer at Nvidia. “Those two things are going to have to work together in some fashion. SystemC may have some role in there as well, especially when people are building transaction-level models.”

It is not just a question of language. “At different stages you are thinking about things very differently and many of these are very different from what UVM was created for,” says Intel’s Khundakjie. “It is undeniable that even in simulation, we need to integrate different models, some in SystemC, and I may want to share a checker that I am using in my firmware with my UVM environment. It is written in C and I don’t want to be told to rewrite it in SystemVerilog. We are in denial about this type of thing.”
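Some of that reuse is already possible through the SystemVerilog DPI, even if UVM itself offers no guidance on it. As a hedged sketch only, assume the firmware checker sits behind a thin, hypothetical C wrapper called check_packet(); a scoreboard could then delegate its checking to the C code, reusing the illustrative packet_item from the earlier sketch.

// Assumes a thin C wrapper around the firmware checker with a matching prototype,
// e.g. int check_packet(const svOpenArrayHandle data, unsigned int len);
import "DPI-C" function int check_packet(input byte unsigned data[], input int unsigned len);

class fw_scoreboard extends uvm_scoreboard;
  `uvm_component_utils(fw_scoreboard)

  uvm_analysis_imp #(packet_item, fw_scoreboard) analysis_export;

  function new(string name, uvm_component parent);
    super.new(name, parent);
    analysis_export = new("analysis_export", this);
  endfunction

  // Called for every item a monitor observes; the actual checking stays in C.
  function void write(packet_item t);
    byte unsigned bytes[] = new[t.payload.size()](t.payload);
    if (check_packet(bytes, bytes.size()) != 0)
      `uvm_error("FW_CHK", $sformatf("C checker rejected packet at addr 0x%0h", t.addr))
  endfunction
endclass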

Merging of some technologies will be more difficult than others. “We are going to see formal and simulation integrated more closely together, requiring combined models driven from verification plans and planning tools,” says Dave Kelf, vice president of marketing at OneSpin Solutions. “A form of assertions that fits with the UVM use models and can be used to drive formal assertion-based verification will be required in order to create a cohesive verification environment.”
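What Kelf describes can be approximated today with plain SystemVerilog assertions, as in the hedged sketch below: a small property module that a UVM testbench keeps running in simulation and that a formal tool can target directly. The arbiter signals (req, gnt) and the bind target are illustrative only.

module arb_props(input logic clk, rst_n, req, gnt);
  // Every request must be granted within 1 to 8 cycles.
  property p_req_gets_gnt;
    @(posedge clk) disable iff (!rst_n)
      req |-> ##[1:8] gnt;
  endproperty

  a_req_gets_gnt: assert property (p_req_gets_gnt)
    else $error("req was not granted within 8 cycles");

  // For formal, the same file can also carry input constraints.
  asm_no_req_in_reset: assume property (@(posedge clk) !rst_n |-> !req);
endmodule

// Attached to the RTL with bind, so neither the design nor the UVM testbench changes:
// bind arbiter arb_props u_arb_props(.clk(clk), .rst_n(rst_n), .req(req), .gnt(gnt));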

Another driver is to make UVM itself multi-lingual. “We do expect that language use, as we move to bigger and bigger SoCs, will be more diverse than just SystemVerilog,” says Sherer. “When we look at simulation as an example, simulation performance is a balance of the engines that run RTL and the engines that run the testbench. As we look at SoCs and the challenge of speed and capacity that you need, you will get diversity beyond SystemVerilog. We have increasing amounts of mixed-signal in the design, so how do you simulate that? SystemVerilog could be stimulus for it, but real number modeling would become a factor. How does that fit into a UVM world that is all digital? SystemC is part of our future and there will be more SystemC in design work.”
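Real number modeling, which Sherer mentions, essentially means writing the analog behavior as discrete-time arithmetic over real values so that it can run in an ordinary event-driven digital simulation. A minimal, purely illustrative sketch of an ideal 8-bit DAC:

module simple_dac #(parameter real VREF = 1.2) (
  input  logic       clk,
  input  logic [7:0] code,
  output real        vout   // the analog value is carried as a real number, no analog solver needed
);
  // Ideal behavior only: output voltage proportional to the input code, updated per clock.
  always_ff @(posedge clk)
    vout <= (real'(code) / 255.0) * VREF;
endmodule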

Then we have the PSWG, which is itself building a test and verification environment. How does that interact with UVM? “There is a level of convergence that we will need to deal with,” says Sherer. “The UVM working groups are going to need to work through that. It is a natural part of the future.”

One methodology or many?
At the same time, the user community is getting restless. One member of the audience at the DAC panel said, “We have too many methodologies within the methodology. How do we deal with things like phases and sequences? There are too many ways to do similar things. This confuses people. We are not getting consolidation in methodologies.”

Other audience members were concerned with the process for users in the field. They want to be able to influence the standard, to have more say in the direction of its development, and to be able to contribute without being Accellera members. According to Cliff Cummings, a SystemVerilog and UVM trainer, “some of the panelists alluded to the new ways in which users are using UVM. It has been said that the best known methods in UVM don’t exist. We are still making them up. Is there a way for users to introduce new ways of doing things to the Accellera committee?”

The principal reaction from the panelists was that users can either talk to their vendors, who will represent them in the discussions, or make modifications to the open-source implementation, which can be fed back for consideration. The preferred way, however, is for them to become Accellera members or individual technical contributors to the working group.

But not all changes are good. “Making changes for the purpose of making changes is not that productive,” says Glasser. “We will take a look to see if using interfaces makes sense (a newer SystemVerilog capability not used within UVM). We always have to consider how changes affect compatibility, and sometimes we make a decision based on the penetration of a new feature.”

Verilab’s Bromley weighs in: “Sometimes you have to do something that breaks backward compatibility, and then the tool vendors have to put switches in to cover the backward compatibility. So it is a tough call.”

Bromley also points out that “the UVM committee spent a lot of time deciding what goes into the base class library and what goes into the user guide. There has been less focus on the user guide. This is really a methodology issue. How should the committee handle those types of recommendations? These are modeling issues.”

Another audience member asked Accellera to take the initiative and create a UVM users group that focuses on the experiences of users, gathers the best and most promising practices, and updates the standard when the time comes.

It is important to have a balance. “Both users and vendors are critical for the development and evolution of effective standards,” points out Breker’s Anderson. “The users understand the problems they are trying to solve and are on guard against solutions proposed just because they might be convenient for the vendors to implement. On the flip side, the vendors serve as a reality check that proposed standards can be implemented, even if (as with SystemVerilog) it is a big undertaking. Because they talk to many different users, vendors can also help look at bigger picture issues that may not be a concern for an individual user.”

Conclusion
Over the next five years UVM could go in many directions, or it could stand still and allow other standards to cover the ground it was not designed to cover. One thing that is clear is that something is required in the near future, because many users are grappling with its deficiencies today, and too many times standards have proven to be too little, too late by the time they are finally implemented. At the same time, a rushed standard benefits nobody. The only clear solution is for more people to get involved in the creation of the next generation of verification standards because, whatever it looks like, we will have to live with it for the next couple of decades.
