EDA In The Cloud (Part 3)

Experts at the Table, part 3: Which companies will use the cloud and for what, what else needs to be done to accelerate this model, and why it’s taking so long.


Semiconductor Engineering sat down to discuss the migration of EDA tools into the Cloud with Arvind Vel, director of product management at ANSYS; Michal Siwinski, vice president of product management at Cadence; Richard Paw, product marketing manager at DellEMC; Gordon Allan, product manager at Mentor, a Siemens Business; Doug Letcher, president and CEO of Metrics; Tom Anderson, technical marketing consultant for OneSpin Solutions; and Kirvy Teo, vice president of business development at Plunify. What follows are excerpts of that conversation. To view part one, click here. Part two is here.

SE: There are engineering teams that are not in the Cloud. How do they get there? What are the steps, or is it a leap of faith?

Paw: That depends on how their design environment is set up. The teams that have IT folks who are familiar with what the Cloud can do, and who have experience with how the tools interact with the infrastructure, are the ones that will be able to make the transition more easily. Today, there is still a lot of DIY that has to happen. As tools and flows evolve, they will just get carried along. But it depends on whether you have IT people who know what to do. There are new tools coming out that have a lot of the necessary pieces built into them, but traditional tools still require some knowledge to be able to port them over.

Letcher: There is an opportunity for those who make it easier for people, particularly for companies that don’t have an internal CAD team.

Siwinski: That comes back to the three types of customers. There are those who have no infrastructure. For them, it is very simple because building an infrastructure is expensive and is not an option. This includes startups and small companies. For customers that already have some level of Cloud usage, which means just about everyone, the IT organizations already have some exposure, so there is somebody who can make it applicable to electronic design. It's just a matter of connecting the right people and making it work. The third kind of company is the one that is just about to get there, and they need to make the leap mentally. Then it is simple.

Teo: Recently, we met with a large Japanese customer. They had an agreement with AWS to do some of their work in the Cloud. The key feedback we got was that it is not easy. The tools are not built to just move into the Cloud. So even for customers who have accepted the Cloud, you still need tools that make it easy.

Siwinski: We all need to deliver Cloud-ready and Cloud-certified products. Different architecture, different validation, different security checks, different licensing – you have to do all of that before a product is considered to be Cloud-ready. The onus is on all of us to provide these Cloud-ready products and flows versus just letting the customer figure it out on their own.

Vel: There are third-party Cloud providers, which is where you have the AWSes and Azures of the world, and then there is the entire third-party infrastructure, which includes companies that will come in and enable your EDA tools to work on AWS. They can potentially allow the Cloud to talk with a customer through a platform. These are for the people who have no IT infrastructure.

Paw: Even large companies may use them. EDA tools were not designed to work in the Cloud. The Cloud is more focused on web-based applications. EDA tools are built to work inside a datacenter. They are different architectures. There are consultants who have taken the time to understand them and know how to make it work. At least for a short period of time, there will be value in those guys helping you make the transition. That will help until all of the tools get to the point where they naturally work in the Cloud.

Vel: Even some of the very large companies, which say they want to go to the Cloud and have dedicated IT teams, are still struggling with what it really means to have that Cloud burst and Cloud compute and run the tools – all the way from making sure the licensing agreements are in place, to IP protection, etc. They are asking us how to go about doing it.

Siwinski: The moment you realize that the Cloud is not an end in itself, but a means to an end, it becomes easier. You just have to make the leap. Most of our customers have or are about to do it.

Paw: But why do you want to go there? Why are you trying to get there? Is it to be in the Cloud, or is it because you have a specific need? Not everyone will be on the Cloud. I suspect that the vast majority will have something on the Cloud. But depending on size, needs, and paranoia level, how much you put in the Cloud will vary.

Teo: Things such as quality of results, which require deep analysis, are what you do in the Cloud. Simulation and verification flows have been around for a long time, and people know that the more cores they use, the faster it goes. But you need an extra push. Just uploading compilation is nice, but it's not cool. If you can use the Cloud to predict something, to predict problems, then it becomes cool. You have a selling point. While FPGAs are at a much smaller scale, the same issues apply. You don't use the Cloud just for brute force; you use it to do things better.

Letcher: Ultimately it is about making sure that your people are more productive. Verification is a bottleneck that consumes not just the most compute resources, but also a lot of human resources and a lot of time to market. They want to reduce that and reduce the resources. The Cloud can do things beyond just providing more compute. It can provide a more collaborative work environment. And yes, you can apply machine learning and Big Data as well, but even at a simpler level the Cloud can provide a better work environment and provide visibility into the project.

Siwinski: The Cloud becomes this flexible means to an end, which helps to solve a problem, versus this scary thing to be afraid of.

SE: What should EDA companies be doing to make the transition to the Cloud easier?

Siwinski: We have already mentioned some of them. Make sure the tools are architected so that they are Cloud-ready. This means different things to different companies, but it means you understand the architecture you want to run on, you understand the architecture of your product so that it is not a legacy architecture, and then you understand the implications. How will it communicate? How will it deal with latency and security? Those are real workload issues. We need to make sure that all of the tools comprehend that as we develop them, and then as an industry we will move to a standardized nomenclature on the technology side and on the business terms. Then it becomes just one more thing that we do.

Paw: You mentioned that there are EDA companies moving forward on the business model, and licensing has been one of the biggest problems in the past. How do you license it, how do you charge for it? But you also can start to look at things like how data is handled. When you put data into the Cloud, it is relatively cheap. When you expand it in the Cloud, it is relatively cheap. When you pull data out of the Cloud, it is really expensive. So how do you design a workflow where you don't have to pull much data back?

Letcher: For some tools, you can view that data, in place, using a remote mechanism.

Paw: These are things that are outside of the normal technology issues that you think about, but they have to be taken into account. When data is inside the datacenter, you can access anything you want, but when it is in the Cloud, pulling that data back gets to be expensive.
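To make that asymmetry concrete, here is a rough back-of-envelope sketch in Python. The per-gigabyte rates are hypothetical placeholders rather than any provider's actual pricing; the only point is that egress dominates once result sets grow, which is why a workflow that leaves data in place and pulls back only what is needed pays off.

```python
# Back-of-envelope comparison of cloud data-handling costs for a verification run.
# All rates are hypothetical placeholders; real pricing varies by provider,
# region, and storage tier.

INGRESS_PER_GB = 0.00          # uploading data is typically free
STORAGE_PER_GB_MONTH = 0.02    # keeping results in cloud object storage
EGRESS_PER_GB = 0.09           # pulling data back out is where costs pile up

def run_cost(upload_gb, stored_gb, download_gb, months=1):
    """Estimate data-handling cost for one regression campaign."""
    return (upload_gb * INGRESS_PER_GB
            + stored_gb * STORAGE_PER_GB_MONTH * months
            + download_gb * EGRESS_PER_GB)

# Pull every waveform and log back on-premises.
naive = run_cost(upload_gb=2, stored_gb=5000, download_gb=5000)

# Keep results in place, view them remotely, and download only failure data.
filtered = run_cost(upload_gb=2, stored_gb=5000, download_gb=50)

print(f"pull everything back:   ${naive:,.2f}")
print(f"download failures only: ${filtered:,.2f}")
```

The exact numbers do not matter; the workflow that keeps results in the Cloud and downloads only what it must is the one that scales.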

Siwinski: Like you were saying, since customers already had this hybrid mindset, the good news is that most EDA suppliers already knew the situation. Even before we were talking about the Cloud, our customers already had private clouds and were using third-party suppliers. We may not have known about them, but it was happening, so we were already seeing those requirements.

Paw: Right – it has been happening for 15 years.

Letcher: This is another point where verification fits very well. For the typical regression, you push a small amount of code into the Cloud, you run a lot of regression simulations, and then you have a few fails that you may need to bring back, or you can even view those in place and just bring back a small amount of data. Hopefully 99% of the stuff passed and you can ignore it.

Paw: If it passes you don’t care.

Letcher: Right – you just need to know that they passed, not the full results.
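As one illustration of that pattern, the sketch below is a minimal Python script, with hypothetical bucket, key, and result-format names, that fetches a compact pass/fail summary from cloud object storage and downloads logs only for the failing tests. `boto3` is AWS's Python SDK, but the same approach works with any object store.

```python
# Minimal sketch: bring back only what failed from a cloud regression.
# Bucket, prefix, and result format are hypothetical, not from the discussion.
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "regression-results"          # hypothetical bucket name
RESULTS_KEY = "nightly/summary.json"   # small per-test pass/fail summary

def fetch_summary():
    """Download only the compact summary, not the bulk logs or waveforms."""
    obj = s3.get_object(Bucket=BUCKET, Key=RESULTS_KEY)
    return json.loads(obj["Body"].read())

def pull_failures(summary):
    """Download logs for failing tests only; passing tests stay in the cloud."""
    for test in summary["tests"]:
        if test["status"] != "pass":
            key = f"nightly/logs/{test['name']}.log"
            s3.download_file(BUCKET, key, f"./{test['name']}.log")
            print(f"pulled {key}")

if __name__ == "__main__":
    pull_failures(fetch_summary())
```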

Anderson: Even for failures, you want to increase the amount of analytics that you do in the Cloud.

Paw: That is the next step beyond. The first step is the way that the tools behave.

Siwinski: The tools and specific use models. Some use models are amenable to Cloud usage, but some are not as much.

Paw: Those that involve pulling down GDSII files.

Siwinski: Exactly. As EDA and Cloud suppliers, we need to do a better job of making sure our customers understand that. Right now, not knowing creates a certain level of concern that is not warranted. We have the knowledge, and we need to share it as an industry.

Allan: It is a natural progression for the technology industry to think about build versus buy, but think further and think build versus buy versus rent. If you want to simulate a city, imagine renting something as big as that for a short period of time to get the answer that you need. If we enable our tools to participate in that model — OpEx rather than CapEx — then we can change the economics.
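A rough way to see that OpEx-versus-CapEx trade-off is the back-of-envelope sketch below. Every figure in it (server price, amortization period, on-demand core-hour rate, burst size) is a hypothetical placeholder chosen only to illustrate the comparison, not a number from the discussion.

```python
# Back-of-envelope rent-vs-buy comparison for a burst of simulation capacity.
# Every number here is a hypothetical placeholder for illustration only.

OWNED_SERVER_COST = 8000        # purchase price per server (CapEx)
CORES_PER_SERVER = 64
AMORTIZATION_YEARS = 4
RENTED_CORE_HOUR = 0.05         # assumed on-demand rate (OpEx)

def buy_cost(cores_needed):
    """Annualized cost of owning enough servers for the peak."""
    servers = -(-cores_needed // CORES_PER_SERVER)   # ceiling division
    return servers * OWNED_SERVER_COST / AMORTIZATION_YEARS

def rent_cost(cores_needed, hours):
    """Cost of renting the same peak capacity only while it is used."""
    return cores_needed * hours * RENTED_CORE_HOUR

# A tape-out crunch: 10,000 cores for two weeks of the year.
peak_cores, burst_hours = 10_000, 14 * 24
print(f"buy for the peak (per year): ${buy_cost(peak_cores):,.0f}")
print(f"rent only the burst:         ${rent_cost(peak_cores, burst_hours):,.0f}")
```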

Letcher: If you are looking at a datacenter that has 100,000 nodes, why is there such a thing as queueing up for simulation and having resources or humans waiting for data? And there are 50 of those datacenters. Emulation also seems to be moving to the Cloud.

Siwinski: It is a different way to deal with the verification challenge. It is not about the technology, but the problem you are trying to fix. All of it is part of the same challenge.

Allan: Imagine a Cloud where you upload your design and you don't know whether it is running on a software simulator or an emulator. With the Cloud, we get to explore new platforms. We are developing support for the Arm architecture, and it has favorable memory bandwidth well suited to simulation. That presents opportunities in the Cloud, with new platforms available to companies that would not invest the capital.


