EDA Cloud Adoption Hits Speed Bumps

What problems are preventing semiconductor design from moving into the Cloud? Steps are being taken to overcome them.


If moving semiconductor design to the Cloud was easy and beneficial, everyone would be doing it. But so far, few have done more than dip a toe.

The level of difficulty associated with migrating to the Cloud varies, depending upon who you talk to. The reality is that not everyone makes it as easy as it could be, or is willing to put in the necessary effort to make it easier. There is certainly a chicken-and-egg problem.

Last month’s article, “Is Cloud Computing Suitable For Chip Design?”, looked at issues associated with performance, costs and hardware adaptability. Companies are seeing varying benefits, often based on their workloads.

What other road bumps stand in the way of companies giving it a limited test run, and how much time should they expect to spend getting ready for a migration?

Cloud management
In most cases, you do not want to lock yourself into a specific Cloud provider. “We are multi-Cloud,” says Naidu Annamaneni, vice president of global IT for eSilicon. “That means we want to be able to run on any service. We want that flexibility.”

Assuming this is true for most companies, there is a major decision that needs to be made—who is going to manage your Cloud capability? There are two possibilities: either do it yourself or have a tool vendor manage it for you. “We can support customers who want to manage their own environments and that entails a certain list of activities and products,” explains Craig Johnson, vice president of cloud business development at Cadence. “The other side of that is customers who want Cadence to help them manage their Cloud environments. If a customer wants to do Cloud enablement themselves, build the IT connections and figure out if they need to build some additional software or purchase third-party software to make it all work, we support that. It requires a combination of contractual mechanisms as well as access to tools that we have deemed to be Cloud-ready through testing and enhancements, and we include a license server that can run in the Cloud for higher reliability.”


Fig 1. The Cadence Cloud Portfolio. Source: Cadence.

How does one choose? “If you do it yourself, that is like fixing your own car or planning your own finances,” says Kirvy Teo, COO for Plunify. “You would need to read up on everything, have all of the skill sets, etc. It can be very complex, depending upon deployment. The other option is you outsource that to another company, where we maintain all of the software, the nitty-gritty of IT. This is what is meant by a managed Cloud. You simply transfer the problem from on-premise to the Cloud. We don’t call them CAD managers or IT people – we call them DevOps – Development and Operations – a hybrid of software development and software operation.”

Depending upon the range of tools that you want to run in the Cloud and the vendors who supply those tools, this decision may be a lot more complex than it should be. Plunify has indicated that it is willing to host tools from other vendors, but it may take a truly independent company before most tool vendors would be willing to have their tools managed by someone other than themselves or their customers. Until then, having your complete flow in a managed environment may be difficult.

This also means that most people will have to consider a hybrid environment. “Hybrid is difficult,” says Johnson. “It is largely a data coherency challenge. It is not that it can’t be done. But customers have to be fairly surgical about where and when in the flow they will step out of their on-prem environment and make sure that all of the necessary data, libraries, files, debug environments – all of those things are there. If this is not done carefully, they will transport data back and forth very frequently and end up risking having data that is not completely up-to-date.”

Important infrastructure
Data can be the root of all problems. “One challenge in deploying a hybrid cloud environment is the difference in cloud and on-premise infrastructures,” explains Shiv Sikand, executive vice president at IC Manage. “For example, the cloud typically uses a block storage mechanism, meaning that the data is accessed by only one host at a time. In contrast, EDA tool workflows require an NFS filer to ensure the data can be readily accessed across thousands of nodes. NFS filers are not generally available in the cloud as a native instance, and using an external filer is both expensive and slow. We already see this today in our on-premise environments. Our filers are really struggling to keep up and buying more filers isn’t really helping us.”

eSilicon’s Annamaneni agrees. “For high-performance computing, the critical thing is NFS. This is something that EDA tools assume and require. None of the Cloud vendors had a viable NFS solution. When you are doing a big design, the project file size is 500TB-700TB. On any given day, we were producing 100TB to 200TB of change. With such a massive scale, the Cloud providers were not ready and we were scared. We partnered with Elastifile and they claimed to be able to support us, but when you run these massive workloads we initially found a number of issues with their system. They helped us to work around that. So that is how we were able to run the first time on the Cloud.”

The Cloud works on data objects. But a software layer, such as the one from Elastifile, makes that look like an NFS file system. “When data does need to come back from the cloud, there are optimizations that can reduce the amount of data transferred,” says Jerome McFarland, vice president of marketing for Elastifile. “For example, we have a data mobility feature that essentially encodes on-premises file system data into objects (so into S3, GCS, or Azure blobs) and extracts that data from objects back into file format, retaining all the file system hierarchy, links, metadata, etc. When shifting data to and from the cloud, that data will be compressed and deduplicated before transfer.”
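The round trip McFarland describes can be illustrated with a short sketch: walk a file hierarchy, compress each file before transfer, store it under a flat object key, and keep a manifest object so the hierarchy, links, and metadata can be rebuilt on the other side. This is not Elastifile’s actual format — the function names and manifest layout are assumptions for illustration, and real systems add chunking, deduplication, and far richer metadata.

```python
import json
import os
import zlib

def encode_tree_to_objects(root):
    """Encode a local file hierarchy into a flat object-store layout.

    Returns a dict mapping object keys to compressed bytes, plus a
    manifest object that records paths, sizes, and symlinks so the
    tree can be reconstructed later. (Illustrative only; empty
    directories are dropped and permissions are not preserved.)
    """
    objects = {}
    manifest = {"files": [], "links": []}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root)
            if os.path.islink(path):
                manifest["links"].append({"path": rel,
                                          "target": os.readlink(path)})
                continue
            with open(path, "rb") as f:
                data = f.read()
            key = "data/" + rel.replace(os.sep, "/")
            objects[key] = zlib.compress(data)  # compress before transfer
            manifest["files"].append({"path": rel, "size": len(data)})
    objects["manifest.json"] = json.dumps(manifest).encode()
    return objects

def decode_objects_to_tree(objects, dest):
    """Reverse the encoding: extract objects back into a file tree."""
    manifest = json.loads(objects["manifest.json"])
    for entry in manifest["files"]:
        path = os.path.join(dest, entry["path"])
        os.makedirs(os.path.dirname(path), exist_ok=True)
        key = "data/" + entry["path"].replace(os.sep, "/")
        with open(path, "wb") as f:
            f.write(zlib.decompress(objects[key]))
    for link in manifest["links"]:
        path = os.path.join(dest, link["path"])
        os.makedirs(os.path.dirname(path), exist_ok=True)
        os.symlink(link["target"], path)
```

Keeping the manifest as a separate object is what preserves the hierarchy and links that a flat object namespace would otherwise lose; the per-file compression is the transfer optimization the quote refers to.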

Another aspect of keeping all of the data synchronized is a distributed revision control system. “In our case that is Git-based,” says Doug Letcher, president for Metrics. “This is a hosted service in the Cloud and is how you get your code in and out. That makes it quick and simple. Git enables this to be done securely while tracking what is being transferred. Git is also a de facto standard in this space, and we would recommend it as a best practice.”

Many users are looking for a more Agile-like environment, leveraging ideas from the software world such as continuous integration. “Although for hardware design there are some gotchas that the software folks do not have to be so concerned with, a lot can be borrowed from software methodologies,” says Dave Kelf, CMO for Breker Verification Systems. “Git-based hosting services such as GitHub enable continuous integration workflows by allowing for a very fast, automated regression turnaround on checked-in code, with the notion that agile teams want to check in small code segments often and need immediate feedback.”
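The check-in-triggered loop Kelf describes can be sketched in a few lines. The test names and checks below are hypothetical, standing in for real lint and short-simulation gates; the point is the shape of the flow: every small commit triggers the fast smoke regressions and the committer gets immediate pass/fail feedback.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Commit:
    sha: str
    files: List[str]

# Registry of fast smoke regressions (hypothetical names; a real flow
# would launch lint, elaboration, and a handful of short simulations).
SMOKE_TESTS: Dict[str, Callable[[Commit], bool]] = {}

def smoke_test(name: str):
    """Decorator registering a check to run on every check-in."""
    def register(fn):
        SMOKE_TESTS[name] = fn
        return fn
    return register

@smoke_test("rtl_only")
def touches_only_rtl(commit: Commit) -> bool:
    # Example gate: the commit touches only RTL source files.
    return all(f.endswith((".v", ".sv")) for f in commit.files)

@smoke_test("small_change")
def small_change(commit: Commit) -> bool:
    # Agile teams check in small code segments often.
    return len(commit.files) <= 5

def on_checkin(commit: Commit) -> dict:
    """CI hook: run all smoke regressions and report pass/fail at once."""
    results = {name: fn(commit) for name, fn in SMOKE_TESTS.items()}
    return {"sha": commit.sha,
            "passed": all(results.values()),
            "results": results}
```

In a real deployment the hook would be wired to the repository’s push event and the checks would dispatch simulation jobs to on-prem or cloud compute; the immediate per-commit report is what makes the small-and-often check-in style workable.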

No matter if all of your flow or just a single tool is in the Cloud, these issues have to be addressed. “They all share the same set of challenges,” asserts Johnson. “Do I have the right data where I am going to run my jobs, do my engineers have access to the environment and the information when they need it and what are the cost implications and efficiency implications and performance implications of where I run and how much I run.”

Security
If there is one question that everyone dreads in a Cloud discussion, it’s security and the availability of IP. While the situation may be improving, answers remain all over the map. “Every company starts with that level of concern and how to protect IP,” says Michael White, director of product marketing, Mentor, a Siemens Business. “The next layer of concern is that some of the foundries have been very specific in saying, ‘I will not allow my DRC decks and other decks to be used in the Cloud.’ They are not convinced the Cloud has sufficient protection to keep their decks from being seen by competitors.”

But change is in the air. “The timing of our announcement of a portfolio of Cloud ready products was really done because we could see there had been a material change in the mindset of more customers about their willingness to even entertain putting their own IP into the Cloud,” says Johnson. “The first question that customers ask is, ‘Is it secure?'”

Security is often the biggest concern for user companies considering cloud-based EDA solutions, says Tom Anderson, technical marketing consultant for OneSpin Solutions. “Although sales, finance and HR systems have been widely used in the Cloud for years, some engineers still resist uploading their RTL, testbenches, formal verification environments, and synthesis scripts. Therefore, the first thing a company should do is to assess carefully the level of security provided, whether in their own clouds, EDA clouds, or in public clouds such as Amazon Web Services. Engineers understand this issue and need to be comfortable with any solution.”

Jean-Marie Brunet, director of marketing for emulation at Mentor, a Siemens Business, says we should look at the Mil/Aero industry. “They are very organized about measuring security. When you are professional about looking at security requirements, you work with the Cloud providers and you quickly realize that their security environment, while not perfect, is appropriate enough for those types of customers to make the decision to work with them. When you have that level of maturity and understanding of security from an IT perspective you select Cloud. Other companies are less sophisticated and have less knowledge about their own security, and so there is more fear. Then you add the topic about how to move IP – I don’t think it is a problem.”

Where is the industry with IP availability? “IP always has site restrictions that may say you can only run it within five miles of your office,” says Plunify’s Teo. “This makes it impractical for obvious reasons, because none of us owns the Cloud. IP needs to be available there before we see the light at the end of the tunnel. It requires an adjustment in the IP protection clauses.”

And those are coming. Just recently, TSMC announced support for the Cloud and the initial availability of its Open Innovation Platform (OIP) Virtual Design Environment (VDE), which it says will enable semiconductor customers to securely design in the cloud, leveraging TSMC design infrastructures within the flexibility of cloud infrastructures.

It is an issue that has to roll through the system. “Historically, Cloud was not part of the discussion or contract,” says Johnson. “If a customer comes to us and wants to talk about the Cloud, that happens on a customer-by-customer basis. We would have to convince ourselves about the same things, such as will the IP be handled appropriately, that we know what happens if someone loses it or causes it to be lost to a third party. These are a combination of contractual things that have to be in place, as well as the security practices that need to be followed.”

There are other ways around the problem, too. “Virtual machines and licensing were issues, because you need to have the IP associated with a MAC address,” says Annamaneni. “We use Google Cloud through our VPN, so it appears as an extension of our data center. Our engineers don’t even have any access to that. They just submit their jobs to our grid, and then it is automated to run on the Cloud. The security concerns are all addressed. We do not compromise anyone’s IP, because we treat the Cloud as an extension of our datacenter using a direct link between our datacenter and their Cloud.”

This is an approach some vendors have taken as well. “Calibre works in a distributed type of environment, so we didn’t need to do anything new to the tool,” says Mentor’s White. “The project was about how to properly configure a VPN environment into AWS for the runs that we wanted to do. That took a little time. Today, it is a relatively modest project for any company to VPN into the Cloud with the software and licenses it already owns, and most companies will set up the various shells they need to be able to launch into the Cloud. It is a project, but not a big one – measured in days or a couple of weeks. And it is a one-time thing.”

Planning the migration
The migration has to be planned. “The dilemma of being 100% on-prem or 100% in the Cloud is pretty straightforward,” says Johnson. “Hybrid, cutting back and forth for different projects and products during a design, is much more complex. A lot of our conversations with customers involve helping them think through where they want to be on that spectrum.”

The problem comes down to cost and bandwidth. “Sometimes, the difficulty is the time required to transfer very large files,” adds Johnson. “The Cloud is great at movie-sized streaming, but those files are tiny compared to a leading node design file for timing or power.”

Others agree. “You need to take the time to effectively move your design methodologies to be able to play in a system like this,” says Brunet. “Companies that are very market-timing centric don’t have the time to do the necessary changes. The ones that have moved effectively are the ones who have taken the time to do it right. Most design methodologies have evolved over a number of years to suit their existing environment. Many companies moved to off-premise data centers, dedicated data centers five to seven years ago. That move was painful and this is a similar move.”

There is still no one-size-fits-all answer. “If you are a large company with the wherewithal to buy and manage hardware, meaning you have a reasonable IT team and a core amount of computation that you need to do each and every day, justifying the number of heads to support it and to own that hardware – it is more cost-effective for you to own and manage internal hardware,” says White. “There is a cross-over point. If your compute needs are lumpier over time, or you are small, with three or four people, and you just can’t justify having an IT person and installing a bunch of servers in your company, then you are on the other side of the cross-over point.”

Most companies remain in between those two extremes, where the tradeoff may not be quite so easy and the more complex hybrid solution is the only option available. The industry is catching up to that reality, but it may still be a while before the migration is easy.

Related Stories
Is Cloud Computing Suitable For Chip Design?
Semiconductor design lags behind other industries in adopting the cloud, but there could be some good reasons for that. Change is difficult.
EDA In The Cloud (Part 3)
Experts at the Table, part 3: Which companies will use the cloud and for what, what else needs to be done to accelerate this model, and why it’s taking so long.


