Verification In The Open Source Era

What does open-source verification mean in the context of a RISC-V processor core? Does it provide free tools, free testbenches, or the freedom to innovate?


Experts at the Table: Semiconductor Engineering sat down to discuss what open source verification means today and what it should evolve into, with Jean-Marie Brunet, senior director for the Emulation Division at Siemens EDA; Ashish Darbari, CEO of Axiomise; Simon Davidmann, CEO of Imperas Software; Serge Leef, program manager in the Microsystems Technology Office at DARPA; Tao Liu, staff hardware engineer in the Google chip implementation and integration team; and Bipul Talukdar, director of applications engineering for SmartDV. This is an adaptation of a panel session held at DVCon.

SE: A lot of people like the concept of an open-source verification environment, but what is an open-source verification environment?

Brunet: What does open-source verification mean to us at Siemens EDA, and more precisely in the hardware-assisted verification domain? It is really business as usual. You need views, you need RTL that compiles, you need a testbench that runs, you need QEMU so you can do hybrid emulation. It’s not too different from the perspective of a hardware-assisted verification, emulation, and prototyping provider.

Darbari: Open source doesn’t mean everything is free. In the context of test and verification, open-source tools do not automatically translate into high-quality verification. Using these tools well takes expertise and a lot of human effort. And open-source hardware is very different from open-source software. Software can be patched, whereas hardware can’t. It is very important to emphasize the quality side of verification. What really matters in the context of open source, and much less so in proprietary development, is visibility, transparency, and reproducibility without vendor lock-in. It is important that any tool can be used to reproduce the results and the verification plans. Everything is transparent.

Davidmann: Our focus is software and helping people get software up and running. What happened with RISC-V, which is just an ISA (instruction-set architecture), is that we realized all of the interest around RISC-V wasn’t about software. It was about how people could design the processors. So we switched to a verification focus around RISC-V. The question for me is about quality open-source hardware. When you are building modern SoCs that contain arrays of processors, the challenge and the costs aren’t in the verification tools. It’s actually getting the verification done. In reality, you should buy, borrow, or beg anything you can to get better quality. If it is open source, that’s great. If it’s commercial, you should use the best technologies available.

Leef: I was never a particular believer in open source, open-source verification included, when I was in commercial EDA. Moore’s Law is at the end of the road, so we can’t expect processor clock rates to keep increasing, and thus simulation performance has stalled. People are thinking about distributed computing to support simulation and verification, going to the cloud. When you go to the cloud you have a multiplicative need for instances. When you need a million licenses, the cost per license becomes prohibitive. In the defense ecosystem, which doesn’t have any coordinated purchasing capability and has very complicated contracting, people seek out open-source verification because of cost. Should we be looking at open source for digital simulation, specifically event-driven simulation? We need to rethink the simulation kernel so that it maps better onto the cloud, so that it can scale linearly with the addition of instances, and so that we can use HPC strategies. So in my community, open-source verification is driven by cost.

Liu: We have built this verification environment based on SystemVerilog/UVM, and also based on constrained random. That means you have a lot of tool dependencies, and you cannot run it with an open-source simulator — at this point. But if you open source a tool, as we did with our stimulus generator, then you are adding value to the community. People already have a simulator license. They can use the tool, and they can customize it. If we were shooting for totally free, end-to-end verification, there is a lot of tooling you would have to support — regression, coverage, many types of tools. That would take longer and would put restrictions on your verification methodology, because you may not have the advanced features found in the commercial tools. It would be great to have things for free, but that doesn’t mean open source has to be free.
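To make the idea concrete, here is a toy sketch of what constrained-random stimulus generation means in practice: instruction fields are chosen randomly, but only within explicit constraints, and a fixed seed makes regressions reproducible. The real flow Liu describes (Google’s open-source riscv-dv generator) is written in SystemVerilog/UVM and is far richer; this Python sketch, with entirely illustrative names and constraints, just shows the pattern.

```python
import random

# A handful of RV32I register-register opcodes for illustration.
OPCODES = ["add", "sub", "and", "or", "xor"]

def gen_instr(rng):
    """Pick an opcode and registers under simple constraints."""
    op = rng.choice(OPCODES)
    rd = rng.randrange(1, 32)   # example constraint: never write to x0
    rs1 = rng.randrange(0, 32)
    rs2 = rng.randrange(0, 32)
    return f"{op} x{rd}, x{rs1}, x{rs2}"

def gen_stream(n, seed=0):
    """Generate a reproducible stream of n instructions."""
    rng = random.Random(seed)   # fixed seed => the same stream every run
    return [gen_instr(rng) for _ in range(n)]

if __name__ == "__main__":
    for line in gen_stream(5):
        print(line)
```

A UVM environment expresses the same idea declaratively with `constraint` blocks and lets the solver pick values, which is part of why such flows depend on commercial simulators.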

Talukdar: Open-source verification is in a race with open-source design development. With all of the continuous integration in place, and the number of people contributing to open-source designs, the pace of hardware design has increased dramatically. Now the challenge is for verification engineers to match that. They need to be able to verify what is in the specification. That is a huge problem to solve. People who want to use an open-source design need to explore what is out there, and once they select something, they have to verify it. That is where the problems start. There are service companies that monetize this market, but getting verification done is not exactly business as usual for open-source verification. Open-source verification was created in the first place to address a different problem: to contribute toward the greater good. It is the contribution of engineers around the globe to build better technology.

Brunet: Open source doesn’t mean free. If everybody thinks about open source just as free access to IP cores, that’s the wrong way to look at it. There is an aspect that is commercially more advantageous, but it’s not about being free. There is no free lunch. It is about interoperability, about having access to information and exchanging it more freely in the community. It’s not about getting things cheaper than buying cores from the big companies that sell them. That’s the wrong approach.

Davidmann: I’d say that having open-source verification solutions gives you a lot more freedom. I didn’t have to pay Google for the generator they built, and when we were working with customers on their vector processors for RISC-V, we made use of that technology, which was provided as open source. It wasn’t the fact that it was free that was interesting to us. It was the fact that we could extend it, change it, and make it do the job we needed it to do. We built on the work that Liu’s team has done. It gave us the freedom to accelerate what we were trying to achieve with our customers.

Darbari: There are two themes here. One is driving integration, collaboration, and innovation, and the other is the cost side of it. On the second theme, we have to ask who pays for these so-called cheap or free simulators and tools. What about the people who are doing the verification? Somebody has to pay for them.

Leef: The U.S. taxpayer would be paying for part of this. We are investing heavily in open-source EDA and open-source IP programs. The reasons for that are multi-fold. One is to fuel innovation within EDA, which frankly has been stalled since 1988. The other is to enable under-served design communities to benefit from scalability in the cloud. You incur human costs to hire and employ engineers. You pay for that. You obviously have to set up your verification session, and simulation is just one element here. There are testbenches, and there are many ways to distribute those in the cloud, and today that frequently is done manually. It would be nice to have that automated. So there is no difference as to who pays. It’s just that when I have a customer running 1,000 instances, they don’t have to pay for 1,000 licenses.

Darbari: The private sector does not have that luxury. There are no taxpayers paying for my cost of development. We built a formal verification app for RISC-V. It’s the only app in the world that works with all the formal verification tools on the market, giving people complete freedom with no vendor lock-in. And yet people expect it to be free, because it is seen as part of delivering value to the open-source RISC-V ecosystem. Silicon vendors do not offer their processors for free, and they should be paying for the services and tools that they use.

Leef: If you’re adding value, the market should recognize that and you should be compensated. Incidentally, the U.S. taxpayers are helping you, as well, because the open source that we are investing in would be available worldwide — for now.

Part two will consider whether RISC-V processor verification will provide common ground to develop a new verification methodology, and whether that will naturally lead to new and potentially open tools.


Ali HOUADEF says:

Unfortunately, there is a lack of reliable open-source TCAD tools.
Most open-source TCAD tools are for pedagogical purposes, very limited, and/or stuck in an unfinished state.
It is a shame for the semiconductor industry and community. We can learn a thing or two from, say, the CFD community.
