Will Open-Source EDA Work?

DARPA program pushes for cheaper and simpler tools, but it may not be so easy.


Open-source EDA is back on the semiconductor industry’s agenda, spurred by growing interest in open-source hardware. But whether the industry embraces the idea with enough enthusiasm to make it successful is not clear yet.

One of the key sponsors of this effort is the U.S. Defense Advanced Research Projects Agency (DARPA), which is spearheading a number of programs to lower the cost of chip design, including one for advanced packaging and another for security. The idea behind all of them is to utilize knowledge extracted from millions of existing chip designs to make chip engineering more affordable and predictable.

DARPA’s goal is to make sure the U.S. military can afford to design and build the chips it needs for increasingly complex communications, detection and weapons systems. To spur the effort, the agency is pitching the same benefits to other organizations.

“If you look at the $100 million price tag of an SoC, it’s not the EDA tools that are driving up the cost,” according to DARPA program manager Andreas Olofsson. “It’s not the tape-out cost. It’s the engineering that makes up the bulk of the cost, which goes across design, architecting software and so forth. We have to figure out a way to reduce that engineering complexity and cost. How do we do that? We look at automation and we look at better reuse of IP—not just component reuse as we do now, but subsystem reuse that is easy and inexpensive to use.”

Olofsson left his job as CEO of Adapteva in 2017 to lead two projects that affect the EDA industry, Intelligent Design of Electronic Assets (IDEA) and Posh Open Source Hardware (POSH). The goal is to use a vast database of existing chip designs as a source of data about what works and what doesn’t, which will help machine-learning-accelerated applications automate the process of bug hunting, verification and layout. The IC database would include plans for more than 5 million components already in circulation, standardizing model names and roles to replace the non-standardized descriptions captured only in datasheets. That would allow data to be used for a constraint-based optimization system.
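As a rough illustration of how such a standardized component database might feed a constraint-based optimization system, here is a minimal Python sketch. The record fields, library entries and selection rule are invented for illustration and are not DARPA's actual IDEA schema.

from dataclasses import dataclass

@dataclass
class Component:
    name: str            # standardized model name, replacing ad-hoc datasheet naming
    role: str            # standardized function, e.g. "adc", "ldo", "serdes"
    area_mm2: float      # silicon area
    power_mw: float      # typical power draw
    max_freq_mhz: float  # maximum operating frequency

def pick_component(library, role, max_area_mm2, max_power_mw):
    """Return the fastest library part that fills the role within the constraints."""
    candidates = [c for c in library
                  if c.role == role
                  and c.area_mm2 <= max_area_mm2
                  and c.power_mw <= max_power_mw]
    return max(candidates, key=lambda c: c.max_freq_mhz, default=None)

library = [
    Component("ADC12_A", "adc", 0.8, 12.0, 250.0),
    Component("ADC12_B", "adc", 0.5, 20.0, 400.0),
]
print(pick_component(library, "adc", max_area_mm2=1.0, max_power_mw=15.0))

In a real system, the library would be the multimillion-part database and the selection would be a global optimization across an entire design, but the principle of standardized, machine-readable roles and constraints is the same.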

“There are a lot of people within the DoD, for example, who know and can write very good Verilog code that they run on FPGAs, but they don’t have the teams with the skills in-house to bring those to an ASIC implementation,” Olofsson said. “We need to bring that barrier as close to zero as possible.”
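Bringing that barrier close to zero implies a push-button flow that takes FPGA-proven Verilog through the ASIC back end with little or no human in the loop. The sketch below only outlines the shape of such a flow driver; the stage list is generic and the per-stage wrapper script is hypothetical, standing in for whichever open-source or commercial engines a real flow would invoke.

import subprocess

# Generic stages of an RTL-to-GDSII flow; a real driver would map each one
# to a concrete tool invocation.
STAGES = [
    "synthesis",     # Verilog -> gate-level netlist
    "floorplan",     # die size, pin placement, power grid
    "placement",     # standard-cell placement
    "clock_tree",    # clock-tree synthesis
    "routing",       # detailed routing
    "signoff",       # DRC/LVS/timing checks
    "gdsii_export",  # final layout stream-out
]

def run_flow(design, pdk):
    for stage in STAGES:
        # run_stage.sh is a hypothetical wrapper script, one per stage.
        result = subprocess.run(["./run_stage.sh", stage, design, pdk],
                                capture_output=True, text=True)
        if result.returncode != 0:
            raise RuntimeError(f"{stage} failed for {design}:\n{result.stderr}")

run_flow("fpga_proven_core.v", "example_pdk")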

Both of these programs are part of DARPA’s $1.5 billion Electronics Resurgence Initiative (ERI), which has promoted chiplets, as well as simpler packaging and chip design. The goals of both programs, like those of most ERI programs, depend on open-source software.

“We need to be clear about open source because it’s open to misinterpretation,” Olofsson said. “This is a very challenging problem. It is going to require a lot of people bringing their expertise to the table and a lot of collaboration. Seeing this work time after time—the latest example being with the development of machine learning—we’ve seen that innovation and collaboration are orders of magnitude more effective when you use open-source. Things that are disruptive, things that require many teams to collaborate and add components—open source is the only practical way to accomplish those things in a research setting. And, I strongly advocated for a permissive open-source license, which means anyone in the world can take that code and build on top of it, build proprietary solutions on top of it, and give back to the community only their basic code. So it’s a very gentle open-source license.”

Not so fast
The price and flexibility of open source make it attractive for a lot of companies, even many that don’t use open source now. But there are good reasons to stick with a best-of-breed approach in many cases.

“EDA companies tend to focus on their big customers—companies like Broadcom, Qualcomm, Intel—that are doing lots and lots of SoC designs for big markets and are motivated to create the best possible design, because, if they don’t, they are likely to lose market share because their products are too expensive and don’t work as well,” said Linley Gwennap, president and principal analyst at The Linley Group. “So they work with the EDA companies to automate what makes sense because they want to minimize their engineering investment when they can, but not at the cost of creating an inferior product. For them, the upfront engineering cost is not as significant as the amount of money they’re going to make selling lots of these products on the open market.”

Olofsson contends that this approach focuses the resources of a whole industry on the requirements of a small number of companies building the most complex processors, with the highest volume and lowest tolerance for error of any kind. But as chips find their way into more safety-critical markets, and as demands for reliability increase across the board, that may be a tough sell. And despite criticism that commercial EDA tools are difficult to master, there has been a concerted effort on the part of EDA companies to simplify every step in the design flow and to do more steps concurrently in order to reduce time to market.

In addition, much of the volume in chip development is not happening at the most advanced nodes, where the most advanced design tools are required. FPGA vendors have been providing their own simplified tools for years, and the GPUs increasingly adopted for AI/ML training come with tools that are well understood by most engineers.


Fig. 1: DARPA open-source projects. Source: DARPA

Disruption ahead
Few in the industry would disagree that chip design is too complicated and expensive today. The question is how to tackle that problem, particularly as markets begin to splinter or flatten and architectural changes begin to replace or supplement traditional scaling.

“Open source is still nascent on the hardware side of the business, but it’s starting to grow quickly,” said Robert Oshana, vice president of software engineering, research and development, microcontrollers for NXP Semiconductors. “There are a lot of parallels between how it grew in software and how I see the hardware side starting to evolve.”

He noted that open source isn’t just about RISC-V. “It is disruptive when we talk about it, but the industry will probably find a lot of benefit from it just as the software industry did, so they’re going to have to adjust.”

Others aren’t so sure. “If we knew what the benefits of this approach would be, we’d be firmly into internally funded R&D,” said Joe Sawicki, executive vice president at Mentor, a Siemens Business. “That’s what is potentially so exciting about a program like this. We swing for fences we have no certainty of hitting, and push for benefits that aren’t obvious.”

There are risks to go along with this approach, as well. Taylor Armerding, senior Infosec writer at Synopsys, penned a blog earlier this year about closing up a big loophole. “After almost a decade of development, SPDX (Software Package Data Exchange) will add hooks for open-source security vulnerability data.”

All the big EDA providers, as well as leading chip companies, are active contributors to ERI projects. In fact, Cadence, Synopsys, Mentor, NXP, Intel, IBM, Qualcomm, Arm, Nvidia, Analog Photonics, SRI International and Applied Materials all have contributed speakers, engineers or materials to the ERI effort, most recently at DARPA’s 2018 ERI Summit.


Fig. 2: Who’s doing what. Source: DARPA

Most are not only involved, but pivotal to the open-source-dependent POSH and IDEA projects. Still, the key to getting industry players to accept open-source EDA is whether it makes the design process more efficient without breaking anything—and whether it is possible to extract decades’ worth of design experience from libraries of millions of existing designs and use it to spot errors in real time in new designs.

“The research question for DARPA is how far we can push the complexity and how good we can get,” Olofsson said. “Smart designers have a lot of tricks. They’re embedded in thousands of designs that have been taped out. Right now, that is dark knowledge in the heads of the designers that is not anywhere near these EDA tools. The question in the program is whether that knowledge can be extracted from designs that have been done and embedded in a tool so future designs can benefit from that without someone having to go and interview the designer.”
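As a toy sketch of that idea, and nothing more, one could train a classifier on features extracted from previously taped-out modules, labeled by whether they later caused a silicon bug, and use it to flag risky modules in a new design. The features, data and model choice below are invented for illustration; they do not represent DARPA's approach or any production methodology.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Features per module: [max fanout, clock domains, async crossings, lines of RTL]
past_modules = np.array([
    [12,  1, 0,  800],
    [96,  3, 4, 5200],
    [40,  2, 1, 2100],
    [150, 4, 7, 9800],
])
had_bug = np.array([0, 1, 0, 1])  # 1 = module caused a silicon bug in the past

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(past_modules, had_bug)

# Score modules from a new design and flag the risky ones for extra review.
new_modules = np.array([[110, 3, 5, 6100], [20, 1, 0, 900]])
risk = model.predict_proba(new_modules)[:, 1]
for features, p in zip(new_modules, risk):
    print(f"module features {features.tolist()} -> estimated bug risk {p:.2f}")

The hard part, and the open research question the program poses, is whether meaningful features and labels can actually be extracted at scale from millions of existing designs rather than from designers' heads.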

At this point, that has yet to be proven. But at least for now, there is a significant budget for exploring open-source tools.

Related Stories
Open-Source RISC-V Hardware And Security
Part 1: The advantages and limitations of a new instruction set architecture.
Creating A Roadmap For Hardware Security
Government and private organizations developing blueprints for semiconductor industry as threat level rises.
Making Chip Packaging Simpler
The promise of advanced packaging is being able to integrate heterogeneous chips, but a lot of work is needed to make that happen.



4 comments

Hsien-Hsin says:

Only if you don’t care about PPA, design competence and reliability. Foundries are extremely unlikely to provide DRM / PDK / techfiles to support open-source physical design tools, let alone CDN, SNPS, Mentor, ANSYS, etc.

True, generic never outdoes optimization by experts using specialized tools. DARPA’s intent seems to be to reduce the barriers involved by making routine parts of basic design more accessible and easing re-use of successful designs. It isn’t trying to automate optimization-demanding, original work. If it has made progress, I’d expect that to be part of DARPA’s presentation at the ERI 2019 Summit, July 15-17 in Detroit.

Alan Coppola says:

One easy-to-implement help would be requiring that all government granting agencies (DARPA, NSF, etc.) mandate open-source delivery of grant collateral, and that they give evaluation points to grant proposals that have an open-source component.

Vernon Greer says:

I was strangely obsessed with DARPA’s ERI when it got started. Now, I keep thinking it is dogged by contradictory assumptions.

The baseline assumption is that it’s the engineering cost that prevents chip starts, not the tool costs. Yet they also have a stated aim to have lots of open-source designs that can be snapped together, jigsaw puzzle-like, to build a chip. Well, lower-grade professional PCB tools cost a couple thousand USD, and there isn’t really a lower-tier IC design tool, so those will cost you something like an engineer’s salary or a multiple thereof. That’s a little steep for the hobbyists who would be building a large open-source design library!

One of the big things holding up the would-be founders of new fabless IC houses is the cost of tools. It’s a huge barrier to entry. And a perfect thing for a deep-pocketed organization like DARPA to address. They are poking that bear a little in the digital domain, but I see a lot of pieces missing on the analog side.
