EDA Forecast: More Clouds

Previous attempts at cloud-based tools failed, but much has changed since then.

By Ed Sperling

Design engineers and EDA vendors used to scoff at the idea of cloud-based tools, but no one is scoffing anymore.

A decade after the idea of renting tools online fell flat, largely due to security concerns by chipmakers, all three of the major EDA players and some smaller rivals are taking cloud-based solutions very seriously again.

There are several reasons for this change in thinking. First of all, the sheer complexity of shrinking geometries has increased the amount of data that needs to be processed by orders of magnitude. That’s particularly true on the hardware verification and software emulation side.

Second, capital and operating expense budgets at major chipmakers aren’t growing as quickly as the complexity, particularly when you factor in such things as power-performance-area tradeoffs and exploration. Doing this kind of exploration effectively requires enormous processing power, and in-house server farms have grown to the point where powering and cooling them is a significant expense in its own right, the same kinds of problems encountered by major data centers.

Third, in all of these operations there is crunch time and there is down time. The crunch typically comes during software prototyping, verification and simulation, but there are significant gaps when that same level of processing power isn’t required. Cloud approaches provide some elasticity in how those resources are applied, and in some cases even in how tool licenses are structured, leveraging economies of scale because not all companies are on the same schedule.

“Customers have been asking us if we’ll run tools for them in the cloud and we’re looking at it,” said Bruce Jewett, senior director of solutions marketing at Synopsys. “It seems to work around verification because of the elasticity of this model. There is more verification that needs to be done with a geometry shrink, and customers don’t necessarily want to buy all the tools but they do want to use the tools when they need them.”

Synopsys’ evaluation of this market is a rather sober one, given its failed attempt a decade ago to rent the tools to customers in a bid for market expansion. “We tried that and it was a dud,” Jewett said. “The economics weren’t there at the time. With shrinking geometries and the expense of equipment, tools and chip development the market seems to be ready for this.”

What works, what doesn’t
What’s also changed over the past decade is an understanding of exactly what markets will benefit from this type of approach, who’s willing to pay for it and how much they’re willing to pay. While this still may be an effective way of building a base of new customers—something of a Holy Grail in EDA—it’s also a way of growing the business of existing customers.

Cadence and Mentor also see an opportunity in this market and are evaluating it for the same reasons as Synopsys. One key part of that evaluation is what will work and what will not work using this model. System-level design tools for modeling and architectural exploration are considered unlikely candidates, for example. Hardware emulation is also unlikely to be sold this way. Software emulation, however, is a strong possibility because of the smaller tools budgets for software engineering.

The size of the engineering budget is a key factor in making this work. Altium has adopted a cloud-based model for IP in the FPGA space, where engineering budgets typically are much lower than in the ASIC space. By serving up IP and exploratory tools from a hosted database, the company is attempting a feat that has never been completely successful in this market: getting FPGA developers to buy tools that are not provided for free by the FPGA vendors.

And finally, startup Physware is offering signal and power integrity checks through a cloud-based model. If that sounds like a radical departure in thinking for a startup, consider that this one is headed by former Synopsys CTO Raul Camposano.

Hurdles
This isn’t a simple shift, however. While the infrastructure is in place to allow cloud-based services, there are still some kinks to be worked out.

For one thing, customers need reassurances that security is sufficient. “People like the idea until they realize that the data is out of their control,” said Neil Hand, group director of marketing in Cadence’s SoC realization group. “This is the same process that enterprise customers went through with CRM (customer relationship management software) a decade ago. They couldn’t imagine they would ever turn over customer data to a company like Salesforce.com. Today they don’t even think twice about it.”

Some chip companies even have rules in place to prohibit sending IP over the Internet.

But customer records are generally small files. The amount of data involved in chip verification at 28nm is orders of magnitude larger. And while there are reports of companies being able to move this kind of data internally using compression technology, it still has to be worked out at a commercial level so that latency isn’t an issue.
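To put that difference in scale into perspective, here is a rough back-of-envelope sketch in Python. The dataset sizes, link speed and compression ratio below are purely illustrative assumptions, not figures reported in this article.

# Rough estimate of how long it takes to push data to a cloud provider.
# All numbers are illustrative assumptions, not figures from the article.

def transfer_hours(dataset_gb, link_mbps, compression_ratio=1.0):
    """Hours to move dataset_gb over a link_mbps connection after compression."""
    effective_gb = dataset_gb / compression_ratio
    seconds = (effective_gb * 8 * 1000) / link_mbps  # GB -> gigabits -> megabits -> seconds
    return seconds / 3600

print(f"CRM export, 5 GB at 100 Mbps:         {transfer_hours(5, 100):.2f} hours")
print(f"Verification data, 2 TB at 100 Mbps:  {transfer_hours(2000, 100):.1f} hours")
print(f"Same 2 TB with 4:1 compression:       {transfer_hours(2000, 100, 4.0):.1f} hours")

Even under these generous assumptions, a multi-terabyte verification database takes hours rather than seconds to move, which is why compression and latency remain open questions for commercial deployment.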

Conclusions
It’s highly likely that cloud-based trials will begin over the next couple of years involving all of the major EDA companies and many of their large customers. As Synopsys’ Jewett put it, “Customers see this elasticity as something they need to consider.”

But this isn’t going to be a sudden ramp-up. It will need to be evaluated, tested in small doses, and scaled up slowly as vendors and customers alike become educated on the pros and cons in this market. So far, this is all theory. Reality doesn’t always conform to the rules.


