Open Standards For Verification?

Pressure builds to provide common ways to use verification results for analysis and test purposes.

The increasing use of verification data for analyzing and testing complex designs is raising the stakes for more standardized or interoperable database formats.

While interoperability between databases in chip design is not a new idea, it has a renewed sense of urgency. It takes more time and money to verify increasingly complex chips, and more of that data needs to be used earlier in the flow by a range of multi-vendor tools. But given the range of ratified and proposed industry standards, such as UVM, IP-XACT, SystemRDL, Portable Stimulus, and the Debug Data API, there may be a way to leverage existing technology to resolve some of the verification community’s biggest headaches.

Frank Schirrmeister, senior group director for product management in the System and Verification Group at Cadence, said that an OpenAccess type of format in verification increasingly makes sense for such areas as assembly, debug and coverage. The question now is what pieces should be interoperable, and how much effort will be required to make that happen.

Assembly
In the assembly area, the IP-XACT standard is used for hardware. In addition, there is a need to streamline register definitions for hardware/software interfaces. Companies such as Agnisys, Semifore, and Vayavya Labs all deal with register definitions, enabling software and exposing hardware information to the software side of the world.

“It is true that more standardization in verification data exchange is important,” stressed Anupam Bakshi, CEO of Agnisys. Further, he noted that UVM has helped the industry immensely in increasing the productivity of verification teams. “Within UVM, there is the register layer that has streamlined how memory-mapped registers are verified. Yet there are different dialects of UVM being used, each promoted by one of the major EDA providers. The problem with standards is the balancing act of fixing certain ways of doing things without stifling the creativity and innovation that help new EDA companies grow. We would like to see a standard that will help automation of verification environments. There could be one for VIP APIs so that customers can plug and play the VIPs from a variety of vendors.”

The Accellera Unified Coverage Interoperability Standard (UCIS) is already being used to exchange coverage data, and Bakshi observed that it is proving to be useful.

Debug data
The most contested area today is debug, which is understandable considering it is one of the most lucrative areas for EDA vendors. As a result, it’s also the area under the most pressure for change from competitors and customers, and one of the most complex in which to achieve interoperability.

David Kelf, vice president of marketing at OneSpin Solutions, said chipmakers have resisted vendor lock-in here, continuing to mix and match tools from different vendors. They also are becoming more vocal in standards groups such as Accellera, where end-user participation is growing.

The clearest progress is on an API front end to the binary debug databases, so that tools can interoperate and users have a common way to read all the data. Mentor Graphics and Cadence said a year ago that they were starting what is currently called the Debug Data API effort (http://semiengineering.com/mentor-cadence-join-forces/).

Dennis Brophy, director of strategic business development at Mentor Graphics, said that over the past few decades a lot of the work of making databases and tools friendly with others has been accomplished using APIs. “VHDL has an API. Verilog has one. And those have served the evolving markets. ASIC vendors needed to be able to program systems, and each did it in their unique way. It wasn’t cookie cutter to any extent. And these PLIs, VPIs, APIs have all played an important role. What they do is they allow innovation at both ends of the API. When you make a call for implementation at the API it is backed by its own innovation, and the consumers of the API that create their own applications are left to innovate, as well, so there’s relatively low friction in terms of how the market might want to adopt or push or evolve technology. APIs have served us well there.”

Interest in a similar approach for verification is rising, because it protects the investment that vendors have made in their own successful technology. But it’s also a hard sell to market leaders in this area.

Brophy noted that when the Debug Data API work was started, those involved did go through the process of discussing whether there ought to be a full read-write capability. “If you look at the OpenAccess design database, it has a lot of read-write capability in it, yet the know-how to write a database well comes with quite a bit of original invention that might be protected by trade secrets. That gives advantage to an innovator on one side of the API that is going to be difficult to share. If you ask the Big 3 EDA companies to open the kimono to their source code and let us understand it, those of us that might like to merge all three companies into one to make that happen might be run out of our offices.”

Just how open this process needs to be isn’t always clear. “We did go down that path of wondering whether the ability to read something requires knowing how it was written,” Brophy said. “The answer to that is yes, you have to know how it was written. However, you don’t have to make how it was written — the code that actually wrote it — available. What you need to do is have the producer of the information supply a read function for the consumer of that information. That ended up as the thesis and the basis for what we’ve been doing.”

Following the announcement last year, there were also discussions about how comprehensive the dataset ought to be, he added. “Could we start with very simple 1s and 0s — transitions that come out of a design — and keep it extremely simple? Then there was a discussion of whether we needed state and strength in there, since VHDL and Verilog look at things a bit differently. Even if you look at some modern practices for PCB, where we are attempting to allow a graceful exit rather than just falling short because the files are too large to pass around, can we in that regard support some of the extended VCD concepts and notions we introduced on the VHDL side? We came to the conclusion that yes, it could. When we did that we also realized that we could either do some original invention and keep all of this completely 100% new, or we might be able to leverage a lot of what is already there.”

The specification borrows heavily from the IEEE standards that define a programming interface, the Verilog Procedural Interface (VPI), because all of the calls are there to retrieve data and figure out what the design looks like. So why reinvent everything when it is probably already implemented in all of the tools producing verification results today?

“When you ask what’s missing, the VPI was architected to have a live simulator presence when data was being collected,” Brophy said. “What we’re looking to do is to use published simulation results, and we need to be able to perform the same functions, but perform those functions when the simulator is no longer present and only the dataset remains. In essence, we need to be able to open and close a file, and those functions are not supported by the VPI, which assumes a simulator. There are a few other things that have to be done as well, so we’ve outlined all of that in a spec that we think makes it very easy for everyone to practice as we begin to bring these implementations to market.”
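
To make that producer/consumer split concrete, here is a minimal sketch in Python of what a producer-supplied, read-only reader might look like. The class and method names (DebugDataReader, open, signals, value_changes) are illustrative assumptions, not the actual Debug Data API or the VPI; the point is simply that each vendor implements the read calls over its own format, supplies the file open/close lifecycle a live-simulator interface lacks, and never exposes the write path.

from abc import ABC, abstractmethod
from typing import Iterator, Tuple

class DebugDataReader(ABC):
    # Read-only view of a published verification result set. Each database
    # producer ships its own implementation; consumers code only against
    # this interface and never see how the data was written.

    @abstractmethod
    def open(self, path: str) -> None:
        # Open a result file with no simulator running (the lifecycle call
        # that a live-simulator interface such as the VPI does not provide).
        ...

    @abstractmethod
    def close(self) -> None:
        # Release the dataset.
        ...

    @abstractmethod
    def signals(self) -> Iterator[str]:
        # Walk hierarchical signal names, much like VPI-style iteration.
        ...

    @abstractmethod
    def value_changes(self, signal: str) -> Iterator[Tuple[int, str]]:
        # Yield (time, value) pairs, e.g. (1200, "1") or (1500, "x").
        ...

def dump_signal(reader: DebugDataReader, path: str, signal: str) -> None:
    # A consumer-side debug utility that works with any vendor's reader.
    reader.open(path)
    try:
        for time, value in reader.value_changes(signal):
            print(f"{time:>10}  {signal} = {value}")
    finally:
        reader.close()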

Of course, much of this work grows out of early work done on the SystemVerilog standard, from its first version in 2005, and a lot of that was built on contributions from Novas and Synopsys, he added. “The DNA of this thinking, and the DNA of where we are today, is motivated by a lot of effort that has already gone into thinking about what should be there. The unfortunate thing was we all recognized no one was using the API, and in the 2009 version of the SystemVerilog standard we elected to drop it. Now the question is: ‘Do we come back and put it back in there, or do we make it more broadly applicable, not just to the Verilog or SystemVerilog languages, but fully comprehending the VHDL language, SystemC and AMS?’ Should a working group be tied to one language, or should we be trans-language friendly with all the languages? The current thinking is that everyone is welcome.”

Growing pains

In trying to understand how to make standards work together, it might be helpful to keep in mind that when languages first start out, there are growing pains as the kinks get worked out of the technology. Adam Sherer, verification product management director at Cadence, said that in the early days of SystemVerilog there were different language interpretations and different extensions, and it wasn’t uncommon to see incompatibilities in code from one version to the next.

A combination of ongoing work in the IEEE and technologies like UVM normalized that. In addition, each of the vendors gave a little, and each learned to accept a bit of the other guy’s interpretation of the standard.

“From a language perspective, most of that is behind us now because the languages have stabilized,” Sherer said. “One area where we have some potential for interoperability challenges is with debug data. The [Debug Data API] effort is primarily there to enable common read from a database, because the databases need to be tuned per simulator, or for a tool (like formal verification), and everybody has a database tuned to meet the performance and data output of the engine. What we are looking for is a common way to read it because we each have different ways of interpreting, visualizing and debugging that data.”

But there is an interoperability issue because the APIs on each database are proprietary, which is why there is a push for a Debug Data API.

Sherer explained that as a result of their own engineering work, users discover a need or a use for data that the vendors haven’t yet provided. “They find some way of interpreting it, post- or pre-processing it, so they’ve always needed API access. We tend to provide those. The interesting thing comes when we start mixing the engines together and trying to get common data out, because the differences are not so much in the compiler front ends — honestly, that’s the easy part. The more difficult part is the actual generation of data. Take code coverage, for example. What do lines of code mean in hardware versus simulation? Is that a different interpretation of it? You have an engine that is two-state logic only on the hardware side, and four-state logic on the simulator side.”

How to combine that data isn’t always clear, he said. “If you’ve covered an X value in one engine, there is no X value in the other engine, so how do you merge? That becomes proprietary knowledge in how we go about merging that information, because it is complex and it tends to be engine-specific. And then with formal verification and simulation, merging results from those two again requires deep collaborative engineering from the engine teams, and collaboration with customers. What we end up combining into a common view needs to be something the customer is going to agree to, understand, and make use of, so these tend to be iterative. Are there future standards efforts that pop out? Possibly. We don’t engineer away from that, but the challenge is, first, identifying solutions for some pretty complex engineering tasks.”
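
The shape of that problem can be shown with a small sketch in Python. Everything here is an illustrative assumption (the bin names, the merge_coverpoints helper, the fold_unknowns policy switch) rather than any vendor’s actual merge algorithm; it only shows why combining two-state and four-state results is a policy decision, not simple addition.

from collections import defaultdict
from typing import Dict

# Hit counts per bin for a single coverage point. A four-state simulator may
# report bins for "0", "1", "x" and "z"; a two-state hardware engine only
# ever reports "0" and "1". Bin names are made up for illustration, not any
# tool's actual coverage schema.
CoverPoint = Dict[str, int]

def merge_coverpoints(sim: CoverPoint, hw: CoverPoint,
                      fold_unknowns: bool = False) -> CoverPoint:
    # Merge two engines' hit counts for the same coverage point. The hard
    # part is policy, not arithmetic: a hit on the "x" bin exists only in
    # the four-state data, so it is either kept as its own bin or, with
    # fold_unknowns=True, dropped because the two-state engine can never
    # confirm or refute it.
    merged: CoverPoint = defaultdict(int)
    for source in (sim, hw):
        for bin_name, hits in source.items():
            if fold_unknowns and bin_name in ("x", "z"):
                continue  # unknowns are not comparable across engines
            merged[bin_name] += hits
    return dict(merged)

simulation = {"0": 40, "1": 55, "x": 5}    # four-state simulator results
emulation = {"0": 900, "1": 1100}          # two-state hardware engine results
print(merge_coverpoints(simulation, emulation))
print(merge_coverpoints(simulation, emulation, fold_unknowns=True))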

Where software touches hardware
Verifying the hardware/software interface of an SoC is becoming a bigger problem because of the increasing amount of software content. This is an area where standardization would be especially welcome. “Certainly the verification engineers, as part of their job, need to verify that the hardware and software interface is correct, and they need to use it to test everything else,” said Richard Weber, CEO of Semifore.

Weber said there are two aspects to the hardware/software interface that impact verification engineers. “As a foundation you have to be able to program the chip to put it into a scenario you can test. As a result, there is a body of work as far as the registers and how they work, and there are several industry standards associated with that right now including IP-XACT, SystemRDL and UVM.”

Verification engineers would benefit from an open standard along the lines of OpenAccess, with an API to extract information about the design and perhaps direct verification automatically based on that information. But the issue Weber sees in the hardware/software space is that everything is still a little bit green. For instance, the data models for the IP-XACT, SystemRDL and UVM standards are different enough that they don’t interact smoothly, and that is a problem that needs to be solved.

“Certainly the industry would benefit from that. Is there a way in verification that there is a common database? UVM, at least for the register space for the hardware/software verification, has a database of information that the design and engineering teams are querying in the middle of their verification runs to direct their tests to do certain things. That’s happening today. We would like it to happen better going forward, but the standards are moving slowly and currently are a little scattered. This gaggle of standards is all in the same Accellera family, and if Accellera committees could agree with one another that would help,” he said.
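
For illustration, here is a minimal sketch in Python of the kind of common, vendor-neutral register model such an agreement might converge on. The class names and fields (RegisterBlock, Register, Field, the access strings) are assumptions made up for this example, not the IP-XACT, SystemRDL or UVM data models themselves; the idea is that front ends for each of those standards could populate one in-memory description, and back ends could generate a UVM register model or firmware headers from it.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Field:
    name: str
    lsb: int        # position of the least significant bit
    width: int      # field width in bits
    access: str     # software access, e.g. "rw", "ro", "w1c"

@dataclass
class Register:
    name: str
    offset: int     # byte offset within the block's address map
    fields: List[Field] = field(default_factory=list)

@dataclass
class RegisterBlock:
    name: str
    base_address: int
    registers: List[Register] = field(default_factory=list)

# One description of a hypothetical UART block. Separate front ends could
# build this same structure from IP-XACT XML or SystemRDL source; separate
# back ends could emit a UVM register model or C headers from it.
uart_regs = RegisterBlock(
    name="uart_regs",
    base_address=0x4000_0000,
    registers=[
        Register("CTRL", 0x0, [Field("EN", 0, 1, "rw"),
                               Field("IRQ_EN", 1, 1, "rw")]),
        Register("STATUS", 0x4, [Field("BUSY", 0, 1, "ro")]),
    ],
)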

As it stands today, there is an ongoing discussion as to whether the Debug Data API will become a ‘dot’ standard on SystemVerilog or a development within Accellera that later becomes one. A proof of concept is still being worked on, and from there it is believed that interested users will push for a formal standards effort. In the meantime, the vendors involved have prepared a demonstration.


