Verification In The Cloud

Is the semiconductor industry finally ready for EDA as a service?


By Ed Sperling
Leasing of cloud-based verification resources on an as-needed basis is finally beginning to gain traction after more than a decade of false starts and over-optimistic expectations.

All of the major EDA vendors now offer cloud-based services. They view this as a way of either supplementing a chipmaker’s existing resources at peak usage times, or allowing small and midsize companies to license this technology on an as-needed basis without having to buy millions of dollars’ worth of emulators. How quickly the market grows from here is anyone’s guess, but EDA companies say attitudes have changed enough to justify their investments.

Big chipmakers have been working with internal clouds for the past couple of years. That approach provides more granularity for prioritizing which engineering groups have access to which resources at certain times. Public clouds have seen slower adoption for a couple of reasons, but even that is beginning to change. First, chipmakers have a tough time trusting anyone else with their data. Second, and related to that, they prefer to send only a subset of their data out for verification, which makes the whole process more time-consuming and makes it much more difficult to integrate necessary changes.

“People already put all of their most sensitive business data into the cloud, so this business model is becoming more acceptable,” said Frank Schirrmeister, Cadence’s senior group director for product management and marketing for emulation, FPGA-based prototyping and hardware/software enablement. “Now we are seeing more openness to put designs in the cloud. Flexibility of access to faster execution is beating security concerns.”

There are good business reasons behind this shift, as well. “Big companies have enough things going on to keep their equipment busy all the time,” said Krzysztof Szczur, hardware verification products manager at Aldec. “But smaller companies cannot afford to buy enough emulators. The only way they can do this is to use emulation in the cloud.”

Szczur pointed to other advantages besides cost. Cloud-based resources are essentially unlimited, and server capacity can be ramped up quickly using a fixed configuration of an emulator and an FPGA board. The cloud model also spares a company from maintaining those servers. “The customer uploads files to S3 (Amazon’s Simple Storage Service) and they work off emulation that has been pre-installed for design compilation.”
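
For the small-company flow Szczur describes, that upload step can be as simple as a few SDK calls. Here is a minimal sketch, assuming the standard boto3 Python SDK and a hypothetical bucket layout; on the cloud side, the pre-installed emulation environment would pick up the files for compilation:

```python
# A minimal sketch of the upload step described above, using the standard
# boto3 SDK. The bucket name, key prefix, and file list are hypothetical;
# an actual vendor flow would dictate its own layout.
import boto3

s3 = boto3.client("s3")  # credentials come from the environment or ~/.aws

bucket = "acme-emulation-uploads"  # hypothetical bucket
design_files = ["rtl/top.v", "rtl/core.v", "constraints/timing.sdc"]

for path in design_files:
    # Request server-side encryption at rest, one of the security
    # measures discussed later in this article.
    s3.upload_file(
        path, bucket, f"project_x/{path}",
        ExtraArgs={"ServerSideEncryption": "AES256"},
    )
    print(f"uploaded {path} to s3://{bucket}/project_x/{path}")
```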


Fig. 1: Amazon Web Services. Source: Amazon

One of the more adventurous attempts to go into the cloud comes from OneSpin Solutions. “Formal verification is well suited to parallel execution,” said David Kelf, OneSpin’s vice president of marketing. “Unlike UVM-based stimulus for simulation, each assertion can operate reasonably independently, which means a group of assertions can be deployed on a series of machines.”

Kelf pointed out that with cloud-based services “there is no cost difference between parallel and serial operation, but parallel is much faster in wall-clock time. With the cloud you essentially have infinite machine capacity, so with the right business model parallel operation is a breeze. This is a huge win for the cloud that we have observed in practice, particularly in the verification space.”
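
To make the parallelism argument concrete, here is a toy Python sketch of the pattern Kelf describes: each assertion is an independent proof job, so a pool of workers (local cores here, cloud machines in practice) can work through them concurrently. check_assertion() is a hypothetical stand-in for invoking a formal engine on one property:

```python
# Toy illustration: independent assertions dispatched to a worker pool.
# check_assertion() is a hypothetical placeholder for a real formal engine.
from concurrent.futures import ProcessPoolExecutor

def check_assertion(assertion_id: str) -> tuple[str, str]:
    # A real flow would run a proof here; the verdict is faked.
    return assertion_id, "proven"  # or "falsified" / "inconclusive"

assertions = [f"assert_{i}" for i in range(200)]

if __name__ == "__main__":
    # With effectively unlimited machines, wall-clock time approaches the
    # time of the single hardest proof rather than the sum of all proofs.
    with ProcessPoolExecutor(max_workers=32) as pool:
        for name, verdict in pool.map(check_assertion, assertions):
            print(name, verdict)
```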

What’s the holdup?

Initial plans for verification as a service date back at least to the beginning of the millennium. Those plans were overly ambitious for a number of reasons. One involves pricing.

“Initially we had lots of people coming to us saying they wanted to host it externally,” said Wally Rhines, president and CEO of Mentor, a Siemens Business. “We told them what we could do and then we started talking price. They said they wanted one day of Calibre verification. How much would that cost? We’d say, ‘If you rent a car for a day, do you pay the same as if you rent one for a week? And what if you rent a car for a year, what’s that rate?’ We did the same thing and they suddenly concluded that buying a minute of simulation is going to be really expensive compared to buying a year of simulation and dividing it up. All of a sudden the enthusiasm dropped. The whole thing for design automation died out. It’s coming back. We will settle out the pricing. The algorithms more and more will take advantage of the enormous capacity available, and not everyone can afford to have 60,000 servers in their company. But it’s been very slow in design automation compared to a lot of other industries.”
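
Rhines’ rental-car analogy reduces to simple arithmetic. The sketch below uses purely hypothetical numbers, a $250,000 annual license and an 8x on-demand premium, to show both sides of it: per-hour pricing looks expensive next to a fully utilized annual license, yet still wins for a team that only needs occasional peak capacity:

```python
# Hypothetical numbers purely for illustration; actual EDA pricing is
# negotiated, and was precisely the sticking point Rhines describes.
ANNUAL_LICENSE = 250_000.0      # assumed cost of a one-year license
HOURS_PER_YEAR = 365 * 24       # 8,760 hours

effective_hourly = ANNUAL_LICENSE / HOURS_PER_YEAR  # ~$28.54/hour if never idle
on_demand_hourly = 8 * effective_hourly             # assumed short-term premium

peak_hours = 100  # a team that only needs occasional burst capacity
print(f"Fully utilized annual rate: ${effective_hourly:,.2f}/hour")
print(f"On-demand rate (assumed):   ${on_demand_hourly:,.2f}/hour")
print(f"{peak_hours} peak hours on demand: ${peak_hours * on_demand_hourly:,.2f} "
      f"vs ${ANNUAL_LICENSE:,.2f} for a full license")
```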

A second holdup was the lack of infrastructure. Moving huge files back and forth and duplicating efforts in third-party data centers was not a compelling model.

“What’s changed is that everyone has figured out how to transfer files back and forth, so it essentially is an extension of their own network,” said Schirrmeister. “It’s a natural extension of what people are used to doing internally. And in some cases, you don’t want a physical device where someone can push the reset button.”

A third hurdle involves chipmakers’ internal policies. Until recently, a number of chipmakers had rules in place that no IP or code could leave the corporate premises, and employees had to agree in writing to adhere to those rules when they were hired. In cases where those rules could be bent or broken, companies were concerned that putting all of their design data into the cloud was risky, so they would send off only a subset of the design. That subset then had to be synchronized with the rest of the design data, which was cumbersome, and it defeated the purpose of using off-site resources in the first place.

Security has improved significantly since then. OneSpin, for example, has spent considerable effort ensuring that IP is protected, on top of all of the machine security layers. “So why, you might ask, is there still a question mark about IP security?” asked Kelf. “Well, the engineers have to convince non-technical people that IP can’t leak through this mechanism, and the thought of going to senior management, the legal department, etc., and persuading them that this is OK, is just too daunting. So the barrier to entry right now is perception among non-technical people. When this is overcome, as it has been in other industries, the cloud could take off because the benefits are so obvious.”

Even if the machine is secure, there are additional ways to protect the data. “OneSpin has developed a mechanism that does not transfer design IP off site,” Kelf said. “This is using properties of the way that formal verification operates. Instead, mathematical proof problems are sent to the cloud.”

Kelf noted that the data is encrypted, as are the results, and that at no time does the complete IP ever leave the local machine. OneSpin believes it is impossible to reconstitute the design using this methodology.
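
As a conceptual illustration only, not OneSpin’s actual mechanism, the sketch below captures the general idea: an abstracted, anonymized proof obligation, rather than the RTL itself, is what gets encrypted and shipped off site. design_to_proof_obligation() and the key handling are assumptions made for the example:

```python
# Conceptual sketch only -- NOT OneSpin's actual mechanism. The point is
# that an anonymized mathematical problem, not the design IP, leaves the
# building, and even that payload is encrypted.
from cryptography.fernet import Fernet

def design_to_proof_obligation(rtl_path: str) -> bytes:
    # Hypothetical: a formal front end would extract a proof problem here.
    # Net and gate names are replaced by opaque integers (DIMACS CNF shown),
    # so the original design cannot be reconstituted from the payload.
    return b"p cnf 3 2\n1 -3 0\n2 3 -1 0\n"

# The key is generated locally; in practice it would be shared with the
# proof service over an authenticated channel, never stored with the data.
key = Fernet.generate_key()
cipher = Fernet(key)

payload = cipher.encrypt(design_to_proof_obligation("rtl/top.v"))
# `payload`, not the RTL, is what gets uploaded; results coming back are
# decrypted locally with the same key.
print(len(payload), "encrypted bytes ready for upload")
```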

Comfort levels with cloud security have increased, as well. “All of the EDA companies made deals at least five years ago with AWS or one of the other cloud providers so that if customers wanted to take advantage of the use of a farm they could,” said Rhines. “The reasons it didn’t work were many. One was security. At that time companies didn’t want their secure information to go outside their firewall. It amazed me, because they have no problem with their bank accounts or stock broker accounts, but they didn’t want their design information to get out of the firewall. Now they look at AWS as being pretty sophisticated in terms of security, so maybe it’s not such a big worry. But semiconductor companies still don’t want their information getting outside.”

Aldec’s Szczur noted that both the cloud and the connection are now secure, which is helping to reduce those fears. “They now can connect to the cloud directly using a secure VPN,” he said. “And now it’s a lot less expensive than buying licenses. It’s already up and running, and what companies find is that all chips generate a lot of opportunities they didn’t necessarily plan for. This is one of the reasons that smaller and midsize companies need emulation.”

A fourth hurdle involves the legality of using IP in the cloud, a factor that is complicated further by the presence of multiple vendors’ hardware and tools. While companies did not want to discuss this publicly, privately they said these legal concerns are non-trivial and have been difficult to work out in the past. But they also note that progress has been made recently to the point where the cloud-based market is gaining steam.


Fig. 2: Cloud utilization trends. Source: IDG

Conclusion
There are still issues to sort out with EDA in the cloud, but the availability of massive amounts of resources at certain times during the design cycle provides a compelling argument for utilizing these services on an as-needed basis. Whether that is done internally by large companies, or externally in public clouds by smaller companies, remains to be determined.

“The cloud is the natural location for the massive elastic compute required by chip verification,” said Jaushin Lee, founder and CEO of Zentera Systems. “But how do you implement a secure environment to host your IP among the other tenants in a multi-tenancy cloud? Once you step outside your enterprise firewall to take advantage of the cloud, the underlying infrastructure is different, leading to potential new vulnerabilities. Companies such as ours provide automated security infrastructure overlay solutions in the multi-cloud that are enterprise-controlled and that lock down your IP with defense-in-depth protections. These next-generation solutions let designers focus on chip design rather than IT infrastructure.”

The fact that more companies are jumping into this market provides yet another data point that this approach is finally catching on. Attitudes are changing, and much of the infrastructure to make this work has been firmed up. Now the question is how granular the pricing model will become, and whether companies will be able to dynamically add capacity and pay accordingly. That will determine how fast the market grows, and whether it moves from an interesting concept to a steady business with the kind of growth curve that makes outsiders take note.

—Brian Bailey and Ann Steffora Mutschler contributed to this report.
