Standards, Open Source, and Tools

EDA has been more successful at creating open languages and standards than at promoting open-source collaboration. Will this change?


Experts at the Table: Semiconductor Engineering discussed what open source verification means today and what it should evolve into with Jean-Marie Brunet, senior director for the Emulation Division at Siemens EDA; Ashish Darbari, CEO of Axiomise; Simon Davidmann, CEO of Imperas Software; Serge Leef, program manager in the Microsystems Technology Office at DARPA; Tao Liu, staff hardware engineer in the Google chip implementation and integration team; and Bipul Talukdar, director of applications engineering for SmartDV. This is an adaptation of a panel session held at DVCon. Part one of this discussion is here. Part two is here. The last part of this expert series contains answers to questions posed by the audience but not addressed during the session. Those answers call on additional experts.

SE: UVM is open source, and has been successful and beneficial for both users and tool vendors. SystemC is another successful open-source verification tool. Are these the models for future tools?

Liu: It doesn’t really matter what language you are using. Behind everything is constrained random verification, or formal verification. It’s unfair to say that we want to get the same feature set and performance from an open-source tool as from a commercial tool. The investment is totally different. You can build something that is good enough, with reasonable performance and lower costs. You don’t have to do an apples-to-apples comparison against the big commercial EDA companies in order to succeed. You can do something in the middle and make it successful.

Talukdar: When you look at today’s design methodologies, things are changing. People are designing at a higher level of abstraction and generating designs. RTL should not be the primary way of encoding a design. The tools do need to evolve to handle something at a larger scale, where you will be generating your designs from a higher level of abstraction. You should be able to do verification at that level. You should be able to generate tests per the specification. Today, RTL is the main entry point for design, and that’s the assumption of all the tools. We need to modify the tools to handle something such as the design methodology defined by BlueSpec. There needs to be a revolution in the tools world to be able to verify at a higher level of abstraction and generate tests together with designs.

Brunet: We do high-level synthesis from C to generate RTL. That produces very good quality RTL that can be executed with a simulator and an emulator. We see a trend where very large companies are spending less time building RTL and more time working on high-level modeling.

Davidmann: The free tools business is a service model. You’ve got to find somebody who’s going to develop technology, and the way that technology is developed in EDA is through startups. That is funded by commercial adopters. The service model never really builds anything more than just adequate. That is a fundamental, structural problem. If all companies just fund free tools, you end up with adequate. Whereas if you have startups, they become world class.

Brunet: As the only large EDA vendor on this panel, I can guarantee you there is innovation within large EDA companies.

Darbari: I want to mention Yosys with respect to formal tools. It is an example of a great formal verification tool. But am I going to use it tomorrow to sign off my design? Maybe. I’m not sure that Yosys is at the same level as the commercial tools. So I’m not saying we will rule it out, or that we won’t support it. In fact, we would be prepared to help in the journey of Yosys becoming a better tool. The OpenHW idea is to use production-grade verification. And that’s what I think we’re talking about here: using tools that give you production-grade quality.

SE: The industry has attempted other open projects, such as the Unified Coverage Interoperability Standard. Coverage has been brought up a few times in this panel as a necessary part of any verification methodology. But UCIS hasn’t been particularly successful or adopted. Why are some open projects not successful?

Darbari: We can learn from what didn’t work and try to make it better next time. I’m certainly hopeful. But we need all the big tool vendors, and the small ones, to work together on that one.

Liu: We need to find common interests that bring together tool providers and users. Only when you get to that do you get timely support for open source. How do you get the necessary commitment? These are all important questions, because verification is a serious business. You don’t want problems in your chip, so you have to do production-level verification. We have to build on common interests. You get success only when you can achieve verification quality.

Leef: Open source has appeal for several reasons. I have a community that likes to extend capabilities, and to do so without exposing their extensions to the outside world. Open source allows them to do that. My cost of underlying computing is close to nothing, so the dominant cost becomes $50,000 per license of a commercial simulator. If I can eliminate that, it solves a lot of challenges that I have. And as far as getting an EDA company funded these days by private sources, I’m sure many of you have tried. There is no risk capital available for this endeavor. This suggests the capital markets lack belief in innovation and in the potential of this space. I, however, am open to interesting ideas. I’ve invested over $110 million with the sole purpose of resuscitating innovation in the EDA space. If you guys have good ideas, I don’t want your equity, and I don’t want you to necessarily be open source. I’m interested in moving the technology forward.

Brunet: One of the challenges with open source is that a lot of things become common and exchangeable. That provides a lot of freedom and is a good thing. But at the same time, it makes it difficult to compete. It’s difficult to show something that is significantly better than others if the source is common for everybody. So how does the EDA industry continue to compete if a lot of things become open source? We need to compete, because we need to drive each other to be better. You need to maintain a level of competition to ensure that innovation continues.

Davidmann: Are we talking about free lunch, free beer, or freedom? DARPA says they want open source. But they don’t want open source. They want free. We have to be clear on that. We need open source because we need the freedom to do things. It doesn’t have to be free. We need to be careful when we talk about free versus freedom. There are many companies that sell RISC-V IP. The ISA is free, so why are they selling their products? Simply because companies need quality processors. The value in open source is the freedom to innovate. We make a free simulator and give it away. It is closed source, just like Google Chrome, which most of the world uses. That isn’t open source. But people use it because it’s free and it does the job better than anything else. That is what DARPA should be looking for: technologies, not open source, unless you want the freedom.

Leef: I couldn’t agree more. Open source helps innovation, but I don’t think it’s a solution to the economic problems. What I’m having to deal with is the structural issues that exist within the government ecosystem. And this seems to be a reasonable solution to some of those problems, though not a complete solution as you highlighted. Making this stuff available to geopolitical adversaries is not a great idea.

SE: What has more value when considering RISC-V verification: open-source tools or an open-source verification environment?

Davidmann: Going back to OpenHW, about five companies have spent the last year building the verification testbench, a UVM testbench, for this RISC-V core. It’s a relatively simple four-stage, in-order core, without floating point or anything very complex. Still, it took four or five engineers from different companies collaborating over a year using commercial tools. That testbench, its source, and everything else is available for free on their GitHub. You can’t run it unless you have a Verilog simulator and the other necessary tools. They use the cloud, and they do it all with regression farms. The interesting thing about OpenHW is they want to build the best-quality core that is open source. You can download it and use it. Now the great thing about RISC-V is that you can extend it and change it. If you do that, you really need to re-verify it, and you still have the verification testbench that comes with all the makefiles, all the tests, all the compliance suites, and all the documentation. You can pretty much push a button and it will run. All of the testbench effort is free and open source, but it’s not free to run because you need HDL simulators.

Liu: If you get an open-source testbench, that gives you something that perhaps gets you to 95% of your verification goals. That’s a big saving. But people will not just use an open-source solution as their only solution. They also have their in-house solution. If they can find and eliminate additional bugs, that is a big win because it complements their in-house solution. There’s definitely value for the users. In terms of the tooling, that basically depends on the availability of the tools that people want to use. You have to find a sweet spot in terms of cost and quality. You don’t want to compromise the verification. If you have to use a good tool to do something, you have to do it. If it’s available as open source, then that’s great because that will give you additional savings.

Darbari: It doesn’t matter if you are talking about tools or services; there’s no free lunch. We contributed to the OpenHW work. We used our app to formally verify their core every week. Whenever there were changes in the design, we were able to turn it around. We did all the work for free, the services were free, and thanks to the EDA tool vendors who donated tools to us, we were able to use them and check the results on all the tools. But building formal verification testbenches, much like UVM testbenches, that generate high-quality proof convergence and give you predictable results requires expertise and innovation. We can’t be doing this for free forever.

Talukdar: If the testbench is open, the user can modify it. If the user can modify it, I will take that and make it work with something like Verilator to have a completely free solution. That’s how I would think about it. But somebody has to do the job, so you have to pay for that.

Leef: I don’t think anything should be free by definition. People who have put effort into doing something should be rewarded for those efforts. What I would assert is that there has been very little innovation in the business models around EDA. It’s not really a question of free. It’s finding the point where your value add and the customer’s ability to execute business with you match. That may be free with annualized support, or some kind of freemium business model, or maybe deferred revenue. The concept of free is not really relevant here. It’s an artifact of the business model. If people are doing stuff as a hobby and produce free things, the quality expectations should be correlated to what you get. I don’t think you can impose that anything should be free. If something can be free as a result of an effective business model, and it benefits both the producer and the consumer of the value, that’s a great economic equation.

Brunet: Open to me does not mean free. It means freedom. There needs to be a shift in where the spending happens. You get what you pay for. So if you pay less, it may be something with less quality, or there may be a hidden fee. You may have to spend more time and engineering cost to validate the quality of what you’re getting. So it’s a shift left in the decision process of where the money is spent and how it is distributed, but the amount of money spent is about the same. It’s just distributed differently. So to me, open doesn’t mean free. It just means more freedom.

The last part of this series will tackle some of the questions that were not addressed live in the panel.


Kevin Cameron says:

An open-source flow will be C++ bleeding into TL-X, with AI driven circuit synthesis using analog simulators like Xyce (free courtesy of DARPA etc.).

RISC-V is the last gasp for RISC – things go to free before they disappear completely because they aren’t worth spending money on.
