The Single Greatest Opportunity For Open Source

Jumping straight to a discussion about the implementation of open-source verification tools misses one of the most important things the community could do.

Next week, I will be moderating a panel at the virtual DVCon on the subject of open-source verification. I thought it would be good to advertise the event on LinkedIn and ask whether anyone had well-structured questions for the panelists. What happened surprised me a little: the discussion went almost exclusively to the need for open-source verification tools. In my opinion, that focus totally misses the biggest opportunity.

The RISC-V ISA, and the many implementations of it, have re-ignited the open-source discussion within the semiconductor space. But RISC-V is a piece of IP, and implementing it in silicon requires a long list of tools to bring it to realization. The value of those tools is somewhere in the vicinity of $10B. While it is true that many of those tools are available as open source, they are rarely used outside of academia. It is very clear why: the cost of a chip failure is so high that spending just a few percent of the development cost on tools reduces risk enough to justify that cost.

So is functional verification fundamentally different? There are certainly fewer tools involved:

  • Engines that can execute a model at a defined level of abstraction;
  • Formal verification that can exhaustively show differences between two models, one possibly described using a set of properties;
  • An environment that helps with the creation of stimulus and with checking the results of a model execution (see the sketch after this list);
  • Tools that track progress towards your verification goals; and
  • Productivity aids such as debuggers and linters.
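
To make the stimulus, checking, and progress-tracking items concrete, here is a toy, tool-agnostic sketch in Python. It is an illustration only: ref_add and dut_add are hypothetical stand-ins, and in a real flow the stimulus would drive a simulator rather than a function call.

    import random

    def ref_add(a, b):
        # Golden reference model: 8-bit wrapping add.
        return (a + b) & 0xFF

    def dut_add(a, b):
        # Hypothetical stand-in for the model under test; a real flow
        # would drive a simulator here instead of calling a function.
        return (a + b) & 0xFF

    coverage = set()
    for _ in range(1000):
        # Generate random stimulus.
        a, b = random.randrange(256), random.randrange(256)
        # Check the model's result against the reference.
        assert dut_add(a, b) == ref_add(a, b), f"mismatch for {a}+{b}"
        # Track progress towards a (toy) goal: cover carry and no-carry.
        coverage.add((a + b) > 0xFF)

    print(f"coverage bins hit: {len(coverage)}/2")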

And yet, even though these are all based on standards, primarily SystemVerilog and UVM, there is little agreement among system companies about the best way to perform this task. Ask anyone what the “Universal” in UVM means, and they will tell you it just means something so broad that it encompasses everyone’s methods.

Going back to RISC-V, it appears that few people even think about open-source verification in the same manner as the hardware. They do not think about a set of stimuli, the means to generate those stimuli, or the checks that show, with a certain level of confidence, that an implementation of RISC-V matches the specification provided by the standards organization. The industry groups do not even have the notions of conformance testing figured out yet, which is a far simpler problem.
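
One concrete pattern for such checking is lock-step trace comparison: run the same program on a golden ISA model and on the implementation, then diff the instruction-retirement logs. The sketch below is an illustration under stated assumptions; the file names and the one-instruction-per-line format are invented for the example.

    # Compare a DUT's retirement log against a golden ISA model's log.
    # File names and the comma-separated line format are hypothetical.
    def read_trace(path):
        with open(path) as f:
            return [tuple(line.strip().split(",")) for line in f if line.strip()]

    golden = read_trace("golden_trace.csv")  # e.g. from a reference model
    dut = read_trace("rtl_trace.csv")        # e.g. from an RTL simulation

    for i, (g, d) in enumerate(zip(golden, dut)):
        if g != d:
            print(f"divergence at retired instruction {i}: golden={g} dut={d}")
            break
    else:
        if len(golden) != len(dut):
            print(f"one trace ended early: {len(golden)} vs {len(dut)} instructions")
        else:
            print(f"traces match over {len(golden)} instructions")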

Perhaps defining the conformance test is a necessary first step. That would force the industry to actually work on some best practices that are common across the industry. Then perhaps we wouldn’t need something as “Universal” as UVM. Instead, we could have something that is much more focused, much leaner, much easier to implement as open source.

By jumping straight to a discussion about the implementation of open-source verification tools, we are perhaps missing the most important thing that the open-source community could do: bring about common practices, enable common verification IP, and establish common notions about what completeness means. It really is time that we take some of the “art” out of verification and employ the mind-power of the industry to figure out better ways to do verification.

Functional verification is often said to grow with the square of design size, and that cannot continue. I don’t believe it ever grew anywhere close to that, but I will confess that I was never clever enough to define a way to stop it from being the case. Solving this is perhaps the single biggest contribution the open-source community could make to the design of semiconductors. Once that is done, open-source verification tools will be easy. But start with the tools, and I do not believe you will ever be successful.



7 comments

Matthew Ballance says:

Brian,
I certainly agree with your take that verification methodology is key, and that a focus on open-source tools without focused open-source methodologies is, at best, cost optimization. My feeling is that, while open-source tool flows and open-source methodologies are separate considerations, they are not disconnected. Specifically, the existence of an open-source reference tool flow would greatly help in developing open-source methodologies that, of course, work with both open-source and closed-source/commercial tool flows.

Theodore Wilson says:

Brian, thanks again for another insightful article. I similarly think that open source is being conflated with software license prices, in the hope that open source implies verification at a lower price. I also don’t think this is a tooling problem. Discarding the commercial interests, staffing competition, costs, and schedules of IP vendors and integrators, I imagine independently reproducible verification results might let teams independently assess, share, and improve test quality at an incremental spend, better than either blindly trusting IP or fully retesting. But in the real world this is a hard, hard problem. Could this space be akin to independent financial audits? Maybe there is demand for a verification equivalent of the accounting firms.

Tudor Timi says:

I’m more interested in how open source verification assets can benefit the development of closed source IP. Something like “let’s collaborate on open source verification IP, but let’s compete on closed source implementations”.

I’ll extend it to also cover “using closed source tools”.

Tushar Kalamdar says:

Just as in software development, open-source hardware tools can help showcase good ideas and methodologies from people or students who don’t have access to such costly platforms.

Lars Asplund says:

Brian,

I completely agree that the costs and risks associated with ASIC development are significant, but if we flip the coin and look at the other half of the population, the FPGA developers (45% of the participants in the latest Wilson/Mentor/Siemens study), the cost/risk analysis is completely different.

To build an FPGA from RTL I need Vivado or similar, which costs between zero and a few thousand dollars depending on the device. In my experience, the first additional tool FPGA teams acquire is a standalone simulator, and once that step is taken the verification tools start dominating the EDA tool budget.

While the cost of an ASIC chip failure is enormous, the cost of building a faulty FPGA is limited to the time spent building the design and seeing it fail the target tests. Don’t get me wrong, this cost still drives the rigour of the HDL verification, but in a completely different way. The driving force is verification effectiveness and not the risk of complete project failure. I don’t want to spend an hour building an FPGA only to find a bug in a target test that could have been detected with a short RTL simulation. On the other hand, it doesn’t make much sense to run a 24-hour system-level simulation if I can run the same test in SW in a few seconds.

Verification effectiveness is also a reason why more and more RTL developers use (SW development) methodologies such as test-driven design and continuous integration. This is an area where the open-source community has taken the lead in finding other ways of doing verification, with tools like SVUnit and VUnit. Common to these methodologies is that they imply running a lot of automated tests all the time. Compared to SW, RTL simulations are slow, and to get a SW-like experience you need to run many tests in parallel, which requires more licenses and makes simulator tool cost even more dominant.
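
As a reference point, a VUnit run script is only a few lines of Python. The minimal example below is a sketch; the source paths are made up, and VUnit picks up the testbenches it finds in the added files.

    # Minimal VUnit run script (run.py); source paths are examples.
    from vunit import VUnit

    vu = VUnit.from_argv()            # parses options such as -p for parallel runs
    lib = vu.add_library("lib")
    lib.add_source_files("src/*.vhd")
    lib.add_source_files("test/tb_*.vhd")
    vu.main()                         # compiles everything and runs all tests

Running "python run.py -p 4" executes four tests in parallel; with a commercial simulator that means four licenses, which is exactly where the tool cost starts to dominate.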

Most FPGA designers today use VHDL (64% according to The Study). They would rather not switch to SystemVerilog to do their verification, and their managers don’t want to invest in the education needed to do so, nor increase the tool budget further by buying the premium simulator licenses required for UVM. All this makes FOSS tools extremely interesting, regardless of whether open-source IPs are used or not.

If we just look at VHDL simulations, we have GHDL, which is FOSS and supports a subset of VHDL comparable to what the commercial simulators support. Given a verification-intense approach to FPGA development, it becomes very hard for a paid license to compete when it comes to pure automated testbench execution. The added value comes with support for mixed-language simulations to handle Verilog IPs, support for encrypted IPs, and nice GUIs for debugging.
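
To show how little is needed for pure automated testbench execution, the whole GHDL flow can be scripted in a few lines. In this sketch the file and entity names (adder.vhd, tb_adder) are made up for the example.

    # Drive GHDL's analyze/elaborate/run steps from Python.
    import subprocess

    def run(cmd):
        print("$", " ".join(cmd))
        subprocess.run(cmd, check=True)

    run(["ghdl", "-a", "adder.vhd", "tb_adder.vhd"])  # analyze the sources
    run(["ghdl", "-e", "tb_adder"])                   # elaborate the testbench
    run(["ghdl", "-r", "tb_adder"])                   # run the simulation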

Many of the FOSS tools are developed by professionals and are used professionally as well as in academia. Just looking at job ads for RTL developers shows that they are used in all market segments, including critical applications such as space, defense, and automotive safety.

Mike Thompson says:

“The industry groups do not even have the notions of conformance testing figured out yet, which is a far simpler problem.”

With respect Brian, that is not factually correct. Both the concept and the implementation of conformance testing are maddeningly difficult challenges. Worse, conformance testing is insufficient to fully verify a design.

Brian Bailey says:

Thanks for the comment, Mike. I say it is an easier problem for two reasons. First, it only has to verify conformance to the ISA and is devoid of any implementation issues, so it does not have to deal with things like interrupts that are outside of the specification. Second is the notion of completeness: conformance testing can be as casual or as formal as the organization providing the conformance stamp wants, which is a lower bar than full verification. Now, perhaps I was wrong to suggest that this is easy, but it is not as difficult as verifying an implementation to the level required to be committed to an ASIC.
