Open-Source Verification

Sorting out what is meant by open-source verification is not easy, but it leaves the door open to new approaches


Ask different people what open-source verification means and you will get a host of different answers. They range from the verification of open-source hardware, to providing an open-source verification infrastructure, to providing open-source stream generators or reference models, to open-source simulators and formal verification engines.

Verification is about reducing risk. “Verification is required to answer the question, ‘Do you trust the piece of hardware you received?'” says Neil Hand, director of marketing for design verification technology at Mentor, a Siemens Business. “This is independent of it being open-source hardware, whether it is something you built yourself, whether it’s a commercial piece of IP. Now, when you also rely on open-source verification, the question becomes, ‘How well do you trust that verification environment?'”

But this is not just a concern for open-source hardware. “With open-source hardware intended for an integrated circuit (IC), the same issues are faced as with commercially licensed IP or in-house design work,” says Roddy Urquhart, senior marketing director at Codasip. “When you are manufacturing an IC, you face considerable costs from any error that results in silicon failure or a re-spin, adding cost and delaying time to market. Rigorous verification is required for the design, whether it is open-source or not.”

The open-source community has been disadvantaged by the fact that the mainstream IP industry has never fully tackled this problem. The existing methodology relies on trust between partners. “When you buy IP, you usually get a very simple verification environment,” says Olivera Stojanovic, senior verification manager for Vtool. “This enables you to run a few demo tests or check configurations. You do not usually get the entire verification environment. With complex IPs, they don’t want to provide you with the verification environment, which is too complicated and may reveal insights they would rather keep to themselves.”

The IP industry also has not made it easy to integrate verification deliverables into a system verification environment. “Whenever we integrate IP, we usually uncover quite a lot of challenges integrating them smoothly with the rest of our verification environment,” adds Darko Tomusilovic, verification director for Vtool. “Despite them all using a UVM methodology, we still don’t have a clear, well-defined methodology about how to integrate them. For example, they may use different notions of synchronizing the testbench and the software.”

When you bring processors into the picture, you also have to specify if you are talking about conformance testing to a specified ISA definition or the verification of an implementation. Given that the RISC-V specification is designed to be extensible, and allows for many architectural extensions during implementation, making verification frameworks equally extensible and portable can be challenging.

The semiconductor industry has tended to rely more on standard languages and interfaces. “I have been dealing with chiplet design, and the same issues can be found here,” says Chris Ortiz, principal application engineer at Ansys. “Standards are being defined by groups such as CDX or ODSA and I find that verification seems to be the least understood and the most debated. How do you go in and verify that these pieces of hardware are working?”

While open-source has been successful in the software community, its uptake in the hardware world has been a lot slower. “The difference is that the software world has worked harder to make abstractions clearer and cleaner,” says Tim Saxe, CTO for Quicklogic. “Part of the reason they can do that is they’re better resourced than the hardware world. The hardware guys are always trying to save resources, and that damages their ability to make clean abstractions.”

There are a lot of challenges that need to be resolved. “It does feel like the open-source market may have run slightly ahead of itself on the verification side of things,” says Colin McKellar, vice president of verification platforms for Imagination Technologies. “With increasing complexity in some devices, and with the push of the software community to be more of an equal partner in the world of chip development, the mantra of open-source and openness is starting to push into certain areas. There are going to be areas where open-source doesn’t make sense, and then there are going to be people who say that open-source communities will fail for a variety of reasons, and they may or may not have the complete picture or may not be talking honestly.”

Can we do it with open-source tools today? “Open-source hardware has to be verified according to its target use cases,” says Sven Beyer, product manager for design verification at OneSpin Solutions. “If you want to use open-source IP in a university project, you don’t need a lot of confidence in functional correctness. But if you want to use a RISC-V core in a high-volume chip, high-quality, state-of-the-art pre-silicon verification is a basic requirement. Having a full open-source verification flow would be great, but this is not possible today. You need commercial simulators, formal tools, VIPs, and EDA partners providing high-quality support.”

Many eyes make for better verification
When talking about security, the more eyes that see something, the more secure it is likely to be. “We want to be able to go out into the market and purchase SoCs, where we have the same level of transparency and visibility that we do with our internal proprietary silicon,” says Dominic Rizzo, project lead at Google and project director for OpenTitan. “We’re looking for a high-quality implementation that offers us a specific set of security properties. We get more benefit by making it open than by making it closed. We’re trying to do something correct – absolutely correct – and then show everyone that it is correct.”

The OpenHW Group is applying that same mentality to verification. “There’s a whole community of people that are looking around, poking the rock, lifting up the carpet and looking at what we have done,” says Mike Thompson, director of engineering for the Verification Task Group of the OpenHW Group. “I get e-mails from distant corners of the globe asking why I chose to do a certain thing in the test bench. It forces me to think about the rationale for doing what it is we’re doing. Those questions make the verification better every time.”

It takes time. “I heard, in the context of open-source technologies like QEMU, that verification is actually considered better because you can do fixes yourself, and by virtue of it being open-source, you have more people pounding on it,” says Frank Schirrmeister, senior group director, solutions marketing at Cadence. “I’ve heard that argument, but I must admit I have not seen real data about quality measured by metrics like bugs over time, or errata, to support it. It would be an interesting experiment to look at.”

Community is beginning to form, though. “OpenHW customers are trying to get quality silicon and they’re prepared to share and contribute to the testbench,” says Simon Davidmann, CEO for Imperas Software. “It’s all about building a high-quality RISC-V core using a test bench that people can clone and run themselves to get that confidence. From a dynamic simulation point of view, they are making the testbench and everything available. There are other companies in the formal space like OneSpin and Axiomise who are trying to help. Some of them aren’t open-source, but they’re working to improve the quality of the core.”

It is the framework that binds things. “It is possible to leverage an open-source verification infrastructure that allows for both open-source and commercial tools from multiple vendors,” says OneSpin’s Beyer. “OpenHW CORE-V users that integrate, or customize, a core will have a smooth process to leverage the best EDA tools and achieve rigorous verification for their chip. Formal is a key part of this flow, as cores and other control-intensive IPs have lots of corner cases, and even a minor modification can create countless new corner cases that are impossible to foresee or hit even with the best constrained-random testbenches.”

It doesn’t all have to be open-source. “All of the stimulus is open-source,” says Rick O’Connor, president and CEO of the OpenHW Group. “You need to run it on your own commercial SystemVerilog simulator. We are building the same quality of SystemVerilog UVM-based verification infrastructure that any large SoC company would build on its own. The benefit that they’re seeing is they don’t have to do it on their own. It’s being developed in a way that we’re integrating the best and brightest approaches across a number of different contributors. No single organization will have the breadth of coverage and capability in its verification testbench that we have.”

Imagination’s McKellar agrees. “There will be components put out there that will get super well tested because so many experts are looking at them. They will become some of the best verified pieces of work you’ll ever get. The more people, or the more ways, that something is verified, the more thoroughly it is verified. So, if you can get more people to test and review and provide feedback, then you will have higher quality.”

Compliance versus verification
Showing you are compliant with an ISA specification is very different from having a fully verified core. “How do people feel confident when they buy from a reputable company?” asks Imperas’ Davidmann. “Arm’s argument is that they have been to silicon 5,000 times. There are probably about 50 or 60 open-source RISC-V cores available today. The first thing you’ve got to ask yourself is whether what you’re being delivered really is a RISC-V. You should validate that the core is what you’re expecting, and the vendor should give you their simulator, the RTL, and a pretty comprehensive test and compliance suite.”

That is a work in progress. “RISC-V intends to provide functional compliance tests that include tests to verify emulators, simulators, soft cores and silicon,” says Mark Himelstein, CTO of RISC-V International. “The initial tests will include a test generator and those tests can be run against the formal model in SAIL for golden results. We are working on plans to provide these tests for a majority of the unprivileged base ISA and extend it over time to include complex instruction sequences, privileged specs and extensions.”
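
Conceptually, the comparison step in such a compliance flow is simple: run the generated tests on both the reference model and the implementation, and then diff the results. The Python sketch below illustrates only that step. It assumes both the reference model (such as the SAIL model) and the device under test dump a plain-text signature file of architectural results; the file names, format, and function names are illustrative, not those of any official RISC-V test framework.

```python
# Minimal sketch of the golden-model comparison step in a compliance flow.
# Assumes both the reference model and the core under test dump a plain-text
# signature file, one architectural result per line. The file names and
# format here are illustrative, not the official framework's.
from pathlib import Path


def load_signature(path: str) -> list[str]:
    """Read a signature file, skipping blank lines and '#' comments."""
    lines = Path(path).read_text().splitlines()
    return [ln.strip() for ln in lines if ln.strip() and not ln.startswith("#")]


def compare_signatures(golden_path: str, dut_path: str) -> bool:
    """Report the first mismatch between the golden and DUT signatures."""
    golden = load_signature(golden_path)
    dut = load_signature(dut_path)
    for i, (g, d) in enumerate(zip(golden, dut)):
        if g != d:
            print(f"Mismatch at entry {i}: golden={g} dut={d}")
            return False
    if len(golden) != len(dut):
        print(f"Length mismatch: golden={len(golden)} entries, dut={len(dut)}")
        return False
    print("Signatures match")
    return True


if __name__ == "__main__":
    compare_signatures("golden.signature", "dut.signature")
```

The value of such a flow lies less in the comparison itself than in the quality of the generated tests and the trustworthiness of the golden model they are compared against.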

Beyond that, a verification environment for the implementation is required. “We have had to, and we are continuing to, do specific things in the architecture of the verification environment that allowed people to bring in, not just a CORE-V core, but any RISC-V compliant core,” says OpenHW’s Thompson. “Not just a RISC-V compliant core that adheres to the RISC-V ISA, but ones that support some specific instruction extensions or some specific coprocessor that they might want to add in. I wouldn’t say we’ve solved that. I would say that we’re aware that it’s something we need to solve.”

Minimizing verification costs
While the consortiums are attempting to solve these verification issues, adopters should heed advice from those in the thick of things. “If we’re talking about a RISC-V core, you get very little verification with it today,” says Vtool’s Stojanovic. “The quality of the IP is not at the same level as Arm IP. In the calculation of the cost of the project, you need to take into account the extra time needed for the verification of the RISC-V IP itself, because it has not had the same amount of money invested into it.”

This will improve. “You can get quite a lot off the shelf,” says McKellar. “If you get the necessary test structures, it is quite easy to add randomization and structured frameworks on top of them. So you can quite cheaply and quite quickly get yourself a reasonably good starting point for building up a verification and validation framework. To end up with a complete verification framework and infrastructure to do everything really well is expensive and takes a lot of time, but if you’re building up and are willing to accept an iterative plan, then you just need to plan accordingly.”

That may mean having realistic expectations about the extent of changes you make yourself. “When you’re conceiving the design of your RISC-V core, the closer you can make it to the OpenHW core, which has already been verified or at least is in the process of being verified, the more advantages there are,” says Greg Tumbush, an engineer for EM Microelectronics. “One, the ISS is probably modeling it correctly. Two, the verification that they have done applies to you. Three, anything that you develop, as far as bug fixes, is directly contributable. If you have totally divergent cores, there’s no synergy between your company and OpenHW, or the synergy starts dissipating rapidly. Every time you pull down the latest core, which has the latest bug fixes, you’ve got a huge merging problem ahead of you.”

Starting with standards
The EDA industry has invested heavily in creating a set of standards that drive the industry. The languages and methodologies used by most of the industry were defined by Accellera and then became full IEEE standards. This includes Verilog and SystemVerilog, SystemC, the Universal Verification Methodology (UVM), the Unified Power Format (UPF) for the specification of power intent and more recently Portable Stimulus.

Most verification efforts being undertaken by the open-source community are based on these standards because there is a lot of expertise invested in them. “The bulk of our verification environment is written in SystemVerilog,” says Thompson. “There are some components that are developed in C or C++, but they have a SystemVerilog wrapper, so to the rest of the testbench they look like SystemVerilog. And we’re using the UVM because it is a well-worn path. If you’re talking to people that have UVM expertise, you can have a conversation about the architecture, structure, and usage of your verification environment, and the other person automatically understands what you’re talking about.”

Unfortunately, this is not enough to ensure testbench portability. “We regularly hear from customers that want to change their verification environment, moving from one provider to another,” says Mentor’s Hand. “It is not an insignificant amount of work to do that. So there clearly are challenges in creating a unified environment that lets you easily move between vendors, or between your open-source and closed-source tools.”

The complexity of SystemVerilog and UVM is creating a divide within the community. “Open-source tooling does not yet support SystemVerilog testbench constructs to the same level as commercial tooling,” says Daniel Schostak, architect and fellow, central engineering group at Arm. “There are open-source alternatives for doing the same thing in different ways – for instance, Python frameworks for writing testbench code in Python, rather than SystemVerilog. This reduces the functionality that needs to be supported by a simulator. However, this leads to various tradeoffs, such as the benefits of being able to employ experienced hardware engineers when using standard commercial tooling, versus being able to provide open-source verification IP to go with the open-source design IP that can be built upon with minimal dependencies on specific tooling.”
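
For readers unfamiliar with the Python-based approach Schostak describes, the following is a minimal sketch in the style of the open-source cocotb framework. The device under test, its port names (clk, rst_n, a, b, sum), and its one-cycle result latency are assumptions made purely for illustration, not a reference to any particular open-source core.

```python
# Minimal sketch of a Python testbench in the cocotb style.
# The DUT is assumed to be a registered adder with clk, rst_n, a, b and sum
# ports; the names, widths, and latency are illustrative only.
import random

import cocotb
from cocotb.clock import Clock
from cocotb.triggers import RisingEdge


@cocotb.test()
async def adder_random_test(dut):
    """Drive random operands and check the registered sum against a model."""
    cocotb.start_soon(Clock(dut.clk, 10, units="ns").start())

    # Hold reset for two clock cycles.
    dut.rst_n.value = 0
    for _ in range(2):
        await RisingEdge(dut.clk)
    dut.rst_n.value = 1

    for _ in range(100):
        a = random.randint(0, 255)
        b = random.randint(0, 255)
        dut.a.value = a
        dut.b.value = b
        await RisingEdge(dut.clk)  # inputs sampled on this edge
        await RisingEdge(dut.clk)  # registered result visible one cycle later
        assert int(dut.sum.value) == a + b, \
            f"sum mismatch: {int(dut.sum.value)} != {a + b}"
```

The tradeoff Schostak notes is visible even in this toy example: the test itself needs no SystemVerilog constructs, but it still has to run against a simulator that the framework can drive.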

Others see this as a distraction. “One of the things that I’ve been distressed to see is a call to abandon the language, as opposed to coming up with an open-source SystemVerilog simulator that really works,” says Thompson. “Our efforts should not go into abandoning SystemVerilog and trying to do verification in other languages. The big one these days is Python. Just because it’s open-source doesn’t make it a good verification language. In my mind it’s going completely in the opposite direction from where the industry should be going. Introducing new languages to a complicated problem is not making the problem less complicated.”

Newly developed standard languages also are seeing slow adoption. “Portable Stimulus (PSS) would allow me to take the verification IP that came with hardware and re-use it at the SoC level,” says Stojanovic. “That would be very helpful and would enable much quicker and easier verification of the systems in which that IP is used.”

Other people in the industry would like to see this type of progression in verification environments. “We’re moving away from cores to MCU subsystems,” says Quicklogic’s Saxe. “Now you’ve got the problem of verifying that all those peripherals work with that core. People know how to verify the cores, but when they start interacting with peripherals is when life gets a little more exciting, and that’s what you actually need to work in practice. Antmicro has a co-simulation environment called Renode. While the RISC-V community will produce a set of test vectors that let you test that your RISC-V core is ISA-compliant, tools like Renode will let you test to make sure that your core plays with all the peripherals wrapped around it.”
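
The distinction Saxe draws can be sketched in a few lines of Python: an ISA compliance suite checks architectural results inside the core, while an integration test checks side effects in the peripherals around it. The UART model, register address, and function names below are hypothetical placeholders for what a co-simulation environment would actually provide; they are not Renode’s API.

```python
# Deliberately simplified illustration of core-plus-peripheral checking.
# All names and addresses are hypothetical; a real co-simulation environment
# would supply the core and peripheral models.
UART_BASE = 0x1000_0000
UART_TX_REG = UART_BASE  # transmit data register, illustrative offset 0x0


class UartModel:
    """Tiny behavioral model of a transmit-only UART peripheral."""

    def __init__(self):
        self.transmitted = []

    def write(self, addr: int, value: int) -> None:
        if addr == UART_TX_REG:
            self.transmitted.append(value & 0xFF)


def run_store_sequence(uart: UartModel, message: bytes) -> None:
    """Stand-in for the core executing a store loop to the UART TX register."""
    for byte in message:
        uart.write(UART_TX_REG, byte)


def test_uart_output():
    uart = UartModel()
    run_store_sequence(uart, b"hello")
    # The pass/fail criterion is peripheral behavior, not core registers.
    assert bytes(uart.transmitted) == b"hello"


if __name__ == "__main__":
    test_uart_output()
    print("core/peripheral interaction check passed")
```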

Providing point tools for verification can create other problems. “You need to think about the complete verification methodology,” says Kiran Vittal, product marketing director for verification at Synopsys. “Are you considering every aspect of verification? Are you making sure the tools form a complete verification platform? If you’re debugging in one environment, you should be able to quickly debug the same problem in the other environment, so you can have common debug methodologies. They also have to play nice in the development flow. You don’t want to wait until the last minute to realize that something in the UPF is not accepted by your back-end place-and-route tool. Changes to your UPF mean you have to re-verify the design at RTL, then redo synthesis and place-and-route. It just trickles down and there’s no end to it.”

Part two will look at the components necessary for an open-source verification environment and the ways in which they are being integrated into commercial frameworks.

Related
The Increasingly Ordinary Task Of Verifying RISC-V
Integrating an open-source core into a complex SoC is looking very familiar.
RISC-V Gaining Traction
Experts at the Table: Extensible instruction-set architecture is drawing attention from across the industry and supply chain.
Simplifying And Speeding Up Verification
Experts at the Table: The impact of AI, chiplets, and more precise interconnects.
RISC-V Challenges And Opportunities
Who makes money with an open-source ISA, the current state of the RISC-V ecosystem, and what differentiates one vendor from the next.
Open-Source Hardware Momentum Builds
RISC-V drives new attention to this market, but the cost/benefit equation is different for open-source hardware than software.



1 comment

Steve Hoover says:

As portability becomes an increasingly important goal and reuse of verification models increases, the role of the verification harness (connecting verification models to hardware models) becomes increasingly important. While the verification models can be reused, the harnesses are specific to each hardware implementation, since they connect at the signal level.
