Do Single-Vendor Flows Make Sense Yet?

Design tool providers may want to lock customers into a complete tool flow, but is that best for users or the industry?


For many years in the EDA industry, there has been talk of a complete design tool flow from a single vendor, and each of the main EDA players is capable of offering one. But whether they actually do — or should — is an interesting discussion.

There are obvious pros and cons on the technical side. But it is the business and marketing issues that are really at the crux of the debate today.

“Even though we offer the tools to do everything, I’m still looking for the first customer in the world who goes single vendor for EDA,” said Wally Rhines, chairman and CEO of Mentor Graphics.

Long-time industry analyst Gary Smith used to break out the 70 segments of EDA, which showed that the No. 1 supplier in each of those segments held more than 70% market share. “For example, in Mentor’s case in terms of physical verification, think about how much money we spend on R&D when we have over two-thirds of the total market, compared to somebody who has less than a third,” said Rhines. “Economies of scale are enormous. The solution to the problem is interoperability, and any EDA company that fails to provide APIs that allow for seamless integration with third-party tools is putting its future in jeopardy.”

Anupam Bakshi, founder and CEO of Agnisys, sees this come up often in big semiconductor houses, which are primarily in one camp or the other. “I see it from their perspective. Purportedly this way they don’t have to deal with multiple vendors, they get better deals, and they don’t have to deal with tool flow issues.”

However, he believes the negatives outweigh the possible benefits. “On the business side, single source means locking into the vendor and lack of leverage in the absence of competition. Some counter this with a multi-year deal, which further locks the customer to the vendor. Such multi-year contracts may seem lucrative on the business side, but they are horrendous from the technical side. A single-vendor, multi-year deal removes competition and makes the vendor complacent. It reduces the drive to innovate from the vendor, and closes the door for innovative smaller companies. Avoidance of tool flow issues is a worthy endeavor, but this can be achieved by open collaboration between vendors.”


Today, engineering teams use integrated flows made up of tools from multiple vendors, Rhines said. And even if they tried to go single-vendor, next year a new problem will come along that their current vendor can’t handle.

Furthermore, he said, nobody is best at everything. “One of the arguments that inevitably comes up is that technology is always changing. There are new entrants all the time, so of course there must be multiple vendors’ tools hooked together, but what about when it matures? That isn’t changing much. Take printed circuit board design. It was mature 30 years ago, and yet there are still multi-vendor flows required. People come in and find new types of noise analysis to do. Multi-chip modules are done basically with PCB-type software, but require a totally different approach, so even if you could get by not using best-in-class tools—the argument is always, ‘I go multi-vendor because I want to use best-in-class tools’—even the people who don’t need best-in-class tools need multiple vendors. That continues even as the technology matures.”

Another argument in favor of single-vendor flows is that when the same vendor’s tools are used for two different parts of the design, they can be integrated more tightly, in ways that are not exposed through the public API.

“I, of course, have heard EDA companies claim this, and wouldn’t be surprised if our own company makes these claims,” said Rhines. “The funny thing is that the claims continue, but the industry has a way of solving the problems. For example, we used to have people tell us synthesis has to be well-integrated with place and route, because you want to have the two coordinated. So we developed tools that allowed us to do that even if the customer was using a competitor’s router, or vice versa, with the synthesis tool. The fact is that we’ve been able to pass on those benefits so while that kind of thing is brought up, it turns out the industry has great convenience associated with standard interfaces. A netlist is a netlist, and you pass it off. It’s synthesized, and yes, you do want to be forward-looking in your synthesis to see where the place and route problems will be to handle congestion. We do that, but every EDA vendor recognizes they want to sell their tools into flows that will contain tools from other vendors, so they purposely optimize the tools regardless of which router or which synthesis tool is used.”

Frank Schirrmeister, senior group director for product management in the System & Verification Group at Cadence, observed that vendors don’t play nice with each other. For example, in the area of debug, each vendor has its own database. But the leading vendor commands the market, “and they are not really open about it, so the others are trying to break that non-openness for databases by adding layers, and so forth. That’s where the single-vendor flow, and the control the vendor has, often gets a negative connotation.”

He noted that in the case of Cadence, if an engineering team is using a tool like Cadence’s Perspec System Verifier, “we will guide them to use it with our engines by way of single-flow integration. While Perspec works with other emulators and simulators, too, in a single-vendor flow users can get additional advantages like close integration with debug and coverage. This optimization may be perceived as a downside of the single-vendor approach, though we are generally open to integration with others through our Connections program. Every vendor with a portfolio will also optimize the business side of a flow. It’s like going to Allstate and having your house, car, and life insurance with them; you get a discount. That’s what happens in those single-vendor flows from a commercial side.”

Rhines agreed. “Certainly there are benefits. We, as a purveyor of integrated flows, of course argue that the customer would do better buying more or even all of the flow from us, but we make clear there are best-in-class tools that we don’t make, so we pride ourselves on doing very good integration. A single company that wants to exchange design information frequently wants a standard they can exchange it with. That’s why we have SystemVerilog, Verilog, VHDL, because they are standards and it’s in our best interest to comply with those standards. Sometimes an EDA company will drift away from the standard to try to achieve lock-in, and sometimes it works for a while.”

At the same time, Schirrmeister cautioned that stitching point tools together is not for everybody. “A big company will have a CAD team doing it for you. They will write all the scripts, they will write all the translators, they will write all the format conversions that may be necessary. But for a small company, that’s not always possible, and for them it is much easier to rely on the single flow from a vendor working correctly.”

Are we ready?

Even if it made sense, is the industry ready for a single vendor tool flow?

“Absolutely not,” said David Kelf, vice president of marketing at OneSpin Solutions. “There is a big difference between a set of tools, say, for office work such as Microsoft Office, and an EDA flow. EDA tools are an integral part of the development process, and different companies have different requirements for their tools, depending on a range of factors related to device type, the design itself, and the engineering expertise and methodologies. EDA companies have yet to provide the depth and breadth of capability required for all circumstances.”

Further, Kelf said that EDA must innovate continuously to keep up with semiconductor trends, both device and design, and this innovation may come from various sources. “If a company focuses on a single-vendor flow, almost by definition its methodology will be less competitive than that of a company leveraging a multi-vendor flow, where the best technology may be mixed and matched. As such, the multi-vendor company will produce better products with faster time-to-market, and this virtuous circle reinforces that style of methodology.”


Sundari Mitra, CEO of NetSpeed Systems, agreed. “Having a single vendor tool flow, even though it might sound like a great idea, is absolutely going to kill innovation. If you make everyone captive to one kind of tool flow you’re relying on that vendor to provide all the innovation, all the improvements and everything. Yes, it standardizes things. There should be standards. But multiple vendors should participate in that standard. So standardizing around a methodology or something where multiple people get to innovate around it is the right thing to do.”

The need for better standards isn’t confined just to EDA, though.

“I will draw a parallel here. One thing that really irritates me is that whenever I go into any conference room, I don’t know whether my laptop is going to hook up to the projector in that room, because the connectors are constantly changing between the devices we carry and whatever the device in the conference room takes,” said Mitra. “For a company like NetSpeed, why does this matter? It matters because as we go in front of customers, someone asks, for example, for performance simulations using C models, and someone else says, ‘Give us SystemC models,’ or ‘How about SystemVerilog, do you support that?’ So it would be nice to have some uniformity in terms of standards, but how you go about meeting the standard should be left to every vendor. That’s what sparks innovation. That’s how you come up with new things.”

Further, she questions if the market views a Cadence-based, Synopsys-based or Mentor-based system as an attractive option. “To a certain extent, one of the reasons why we don’t see more innovation in the EDA space — why you don’t see too many startups in the EDA space — is because the Synopsyses and Cadences of the world essentially have bundled together a bunch of these things. So even if a startup comes and says, ‘Hey, I have this cool little tool that enhances your place and route tremendously,’ they get competed out because the other guys have bundled it.”

“EDA is different from an office applications flow,” Kelf pointed out. “The tools often consist of both the user-facing items such as a debug environment, where a UI is a direct interface to the users, and core engines underneath the hood. These core engines are where much of the innovation is focused and the ability to swap these out for new technology is key for a competitive environment.”

Don’t forget training
Training is another issue, Rhines said. “Do you want to train all your people on a tool that is best-in-class and has all the capabilities, or do you want to train all your people on a tool that was developed by the same company that developed some of your other tools? There may be an economy of scale there, but you’re going to have some unhappy engineers who were working longer hours than they needed to, and producing designs that were not as good as they could be.”

Many companies customize these flows with best-in-class point tools, or sometimes those that are “good enough” for their specific needs. They have a flow that is predominantly from one vendor, with other vendors’ tools integrated into that flow. And EDA vendors are careful about where they spend their R&D dollars to ensure they can get a reasonable ROI. The general rule of thumb is that to displace a tool requires at least a 10X improvement in performance or power, and even then it has to be seen as critical enough to make a switch.

“There were a lot of chip companies affected when Yogitech was acquired by Intel,” said Kurt Shuler, vice president of marketing at Arteris. “We asked why EDA is not investing in this area for functional safety coverage, and the answer we got back was, ‘Is it enough to get customers to switch?’ If you look at EDA flows, they’re standardized, but there are specialized tools for things like formal verification and power integrated into those flows. If there is a best way to do something, it’s usually part of a standard flow.”

Rhines agreed. He said the reason a leading EDA company has 70% share in each EDA tool segment is that people end up adopting one tool. “In the Mentor case, Calibre is 40 different tools, and it’s the predominant one used. That makes training easier, but it’s not the only one used, and GDSII is the same for us as it is for our competitors. It’s just easier to use what everybody else is using because there’s such a big ecosystem that depends upon it. You hire people out of college, they all know Calibre. Yes, they could probably learn something new, but why don’t you just use the tool that is easiest to use, the easiest to train with? So in any one of the sub-flows, there are economies of a single company using single tools. That said, when mergers and acquisitions occur, it’s interesting to see that sometimes there will be pressure from above to integrate. But in general, if it ever happens, it happens slowly because once people know the foibles of a tool, they want to stick with it. They recognize that there are unique things that are different about every tool and they know all the problems as well as the benefits of the one they are using so they tend to stick with it.”

So while the semiconductor industry changes every couple of years, and with every new market opportunity, the tooling tends to stick around much longer.

“EDA doesn’t change that quickly,” Rhines concluded.

