EDA and the cloud mix slightly better than oil and water, but could the combination provide a better business model for both sides of the equation?
There was a time, not that long ago, when chip design and EDA tools consumed some of the largest data centers, with tens of thousands of machines and single datasets larger than any hard disk could hold. The IT capabilities of the time were stretched to their limits. But while design sizes grew, other aspects of the flow did not develop as fast.
“This has been driven by the fact that CPU scaling stopped and many of the algorithms and tools available from EDA are not sufficiently multi-threaded,” said Shiv Sikand, vice president of engineering at IC Manage. “They are very old, sequential architectures. Some are trying to improve, and you see it in certain tools, but most are not able to take advantage of multi-core or multi-processor.”
At the Design Automation Conference in 2013, Leon Stok from the IBM Systems and Technology Group gave a talk about trends and challenges in design and the EDA industry, with special emphasis on the unique opportunities in cloud computing. Stok argued that the size of EDA and design problems no longer challenges the broader computing industry. He calculated that a 10 billion-plus transistor processor in a 22nm technology consumes about 12TB of unique data, and compared that to Google Maps (perhaps a more static layout), which contains 20 petabytes of data with an average update frequency of two weeks. Stok concluded that “EDA tools need to be constructed such that the data is central in the EDA flow, not the optimization and analysis algorithms. Cloud-based EDA tool installations will be significantly easier to manage and result in much more robust and predictable design flows.”
Jim Ready, chief technology advisor at Cadence, doesn’t see that EDA has been left behind. “IC design has always been on the leading edge of high-performance computing and large data sets. It should be no surprise then that we are seeing activity around cloud and big data computing in the EDA space. We anticipate that over time more and more of the general capabilities and services found in the cloud and big data domains will find their way into semiconductor and electronic system design.”
There are multiple reasons for the lack of progress in adopting new IT infrastructures, though. “The rest of the IT industry has embraced new architectures,” says Mojy Chian, chief executive officer for Silicon Cloud. “One is the use of virtual machines and the other is cloud computing. While not necessarily the same, they often go together. Virtual machines are not used by semiconductor companies. I believe that 99% of semiconductor designs are still done on bare-metal machines. This technology has not been embraced by the users of EDA tools, and so the providers have no incentive to provide support.”
When one of Synopsys’ co-CEOs was asked about this, he said virtualization does not make sense for EDA. He noted that for many tools the relationship between the processor and memory is very tight and access patterns are random, meaning the costs imposed by virtualization would be large without a significant benefit.
While there is inertia in the system, the attitude toward the cloud is beginning to change. “The cost of deploying 10,000 cores to perform a verification run is becoming prohibitive, especially if they are only needed for a month or two,” says Les Spruiell, applications engineering and security manager for Zentera Systems. “Thus, on-demand capability in the cloud is getting more attractive as companies overcome their security fears.”
Costs may be the key to bringing about change. “There is a disconnect between IT and engineering,” says Sikand. “IT is run by the chief bean counter and engineering is run by the vice president of engineering. The VP of engineering does not want to own IT. So the CIO has to take responsibility.” Sikand believes this is a significant part of the reason why EDA costs and margins have been squeezed and that moving to the cloud will be good for EDA. “When the big semiconductor houses realize that they trampled EDA into the ground, then the money will become available again.”
There are many factors that have to be considered when looking at EDA in the cloud, including security, business models, licensing, liability, performance, geography and many others. The interactions between these are complicated and too large a subject for a single article. This article will focus on the applications suitable for the cloud and the business case.
What fits in the cloud
Not all EDA tools are a natural fit for the cloud, but different usage models can help. As an example, the logic simulator failed to make effective use of the migration from single core to multicore, because of the memory access patterns a simulator exhibits and the need to ensure that each run is fully deterministic.
Improvements have been made and simulators continue to be optimized, but they are not much faster today than they were 10 years ago. As a result, they have failed to keep up with increasing design sizes. While each instance of a simulator may not go faster, parallelization can still provide gains. Runtime DA’s CEO confirms that “performance improvements in tools recently have been based more on parallelism than on new algorithms.”
Access to more machines means that increasing numbers of simulations can be run in parallel. So long as each instance has a dedicated memory space, the total amount of simulation that can be performed scales linearly. Lianfeng Yang, vice president of marketing for ProPlus Design Solutions, sees one such use for simulation. “Statistical circuit simulations can be run in parallel on thousands of CPUs,” he says.
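To make that usage model concrete, here is a minimal sketch of farming out independent simulation runs, assuming a hypothetical simulator binary (./simv) with made-up seed and logfile plusargs. Each run is a separate OS process with its own memory space, which is why throughput scales roughly linearly with the cores or machines available.

```python
# A minimal sketch of "embarrassingly parallel" simulation farming.
# The simulator binary (./simv) and its plusargs are hypothetical.
import subprocess
from concurrent.futures import ProcessPoolExecutor

def run_one(seed: int) -> int:
    # Launch one fully independent, deterministic simulation run in its own process.
    result = subprocess.run(
        ["./simv", f"+seed={seed}", f"+logfile=run_{seed}.log"],
        capture_output=True,
    )
    return result.returncode

if __name__ == "__main__":
    seeds = list(range(1000))  # 1,000 independent regression runs
    with ProcessPoolExecutor(max_workers=32) as pool:
        results = zip(seeds, pool.map(run_one, seeds))
        failures = [s for s, rc in results if rc != 0]
    print(f"{len(failures)} failing seeds: {failures[:10]}")
```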
The downside is that this is not how cloud compute services are put together. In the cloud, multiple cores share a single memory space, and virtualization adds separation between the core and the memory, both of which reduce performance. But the introduction of IBM’s tool suite on its own infrastructure suggests there may be other ways to approach the problem.
In the first phase of the launch of an end-to-end design flow, IBM will be delivering three tools: IBM Library Characterization, which creates the abstract electrical and timing models required by chip design tools and methodologies; IBM Logic Verification; and IBM SPICE. The services are executed on an IBM Platform LSF cluster built on the IBM SoftLayer cloud.
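As a rough illustration of how work might be dispatched to such a cluster, the sketch below submits independent characterization jobs through standard LSF options. The queue name, resource requests, and characterization command are assumptions for the example, not details of IBM’s actual service.

```python
# A sketch of per-cell job submission to an LSF cluster.
# bsub's -q/-n/-J/-o flags are standard LSF; everything else is hypothetical.
import subprocess

CELLS = ["NAND2X1", "NOR2X1", "INVX4"]  # hypothetical library cells

def submit_characterization(cell: str) -> None:
    # Each cell characterizes independently, so each becomes one LSF job.
    subprocess.run(
        ["bsub",
         "-q", "eda_normal",             # hypothetical queue name
         "-n", "4",                      # request four job slots
         "-J", f"char_{cell}",           # job name, for tracking with bjobs
         "-o", f"logs/char_{cell}.out",  # stdout/stderr log file
         "char_tool", "-cell", cell],    # hypothetical characterization command
        check=True,
    )

for cell in CELLS:
    submit_characterization(cell)
```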
Another recent example of an EDA tool becoming parallelized is Cadence’s Genus RTL synthesis tool, which employs distributed processing. According to Cadence, this enables a user to explore more ways of creating area- and power-efficient logic blocks. The company says Genus incorporates a multi-level, massively parallel architecture, and that the tool performs timing-driven distributed synthesis of a design across multiple cores and machines.
One of the more adventurous attempts to go into the cloud comes from OneSpin Solutions. “Formal verification is well suited to parallel execution,” says David Kelf, vice president of marketing for OneSpin. “Unlike UVM-based stimulus for simulation, each assertion can operate reasonably independently, which means a group of assertions can be deployed on a series of machines.”
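A minimal sketch of that property-level parallelism appears below, assuming a hypothetical prove_assertion command that returns a pass/fail exit code. The point is that each proof job is fully independent, so a pool of machines can be kept busy with no shared state.

```python
# A sketch of distributing independent formal proof jobs across workers.
# The prove_assertion command and property names are hypothetical.
import subprocess
from concurrent.futures import ProcessPoolExecutor

ASSERTIONS = ["no_fifo_overflow", "req_implies_grant", "single_driver_bus"]

def prove(name):
    # One proof job per assertion; the exit code tells us pass/fail.
    rc = subprocess.run(["prove_assertion", "--property", name]).returncode
    return name, rc == 0

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        for name, proven in pool.map(prove, ASSERTIONS):
            print(f"{name}: {'proven' if proven else 'failed or inconclusive'}")
```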
OneSpin also tackled the security issue by making sure that the design cannot be reconstructed. Kelf explains that “design security is assured because only a mathematical abstraction of the verification problem is transferred to the cloud, with all design detail stripped out.” Security will be explored in more detail in an upcoming article.
Changing the problem
The application space is changing, and not all usage of the cloud involves running traditional tools. More applications look like Big Data problems, ranging from system-level optimization to debug and verification completion. These involve scanning the data produced by tools and extracting more information from it than the primary purpose it serves today.
As an example, a simulation trace has many possible applications, such as bug hunting, assertion extraction and coverage extraction. “Cadence’s new Indago Debug Platform was developed with Big Data capture techniques,” says Ready. “This raises the level of debug abstraction to reduce the time to identify bugs.”
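As a toy illustration of this kind of trace mining, the sketch below extracts toggle coverage from a value-change log. The line format ("time signal value") is invented for the example; a real flow would parse VCD or a vendor database.

```python
# A toy example of mining a simulation trace for a secondary purpose:
# counting toggles per signal. The trace format is hypothetical.
from collections import defaultdict

def toggle_coverage(trace_lines):
    last_value = {}
    toggles = defaultdict(int)
    for line in trace_lines:
        _, signal, value = line.split()
        if signal in last_value and last_value[signal] != value:
            toggles[signal] += 1   # signal changed state: one toggle observed
        last_value[signal] = value
    return dict(toggles)

trace = ["0 clk 0", "5 clk 1", "5 req 1", "10 clk 0", "15 clk 1", "15 req 0"]
print(toggle_coverage(trace))  # {'clk': 3, 'req': 1}
```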
MathWorks recently released a new version of MATLAB that incorporates Big Data interfaces, including datastores, MapReduce and Hadoop. The company did this in response to the increasing size of datasets. While this is not likely to be electronic design data, the industry is seeing similar efforts in areas such as bug triage and development progress tracking.
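MATLAB’s own datastore and mapreduce interfaces are not shown here; instead, below is a language-neutral sketch of the MapReduce pattern applied to one of the tasks mentioned above, triaging regression failures. The error-signature convention (ERROR-XXX) is an invention for the example.

```python
# A sketch of the MapReduce pattern for log triage: map extracts error
# signatures per file, reduce merges the counts. Signatures are hypothetical.
import re
from collections import Counter
from functools import reduce

def map_phase(log_text: str) -> Counter:
    # Map: pull error signatures out of one log file.
    return Counter(re.findall(r"ERROR-\w+", log_text))

def reduce_phase(a: Counter, b: Counter) -> Counter:
    # Reduce: merge per-file counts; associative, so it can run as a tree.
    return a + b

logs = ["ERROR-TIMEOUT at 12ns",
        "ERROR-XPROP bus[3]\nERROR-TIMEOUT at 9ns",
        "clean run"]
print(reduce(reduce_phase, map(map_phase, logs), Counter()))
# Counter({'ERROR-TIMEOUT': 2, 'ERROR-XPROP': 1})
```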
IP evaluation
A third application area is related to IP evaluation. “Some IP providers are interested in the opportunities the cloud brings as a marketplace,” says Chian. “Selling IP requires a lot of trust. If you are a small IP provider and a company from China says they are interested in your IP, and if you claim yours consumes 20% less power, they may want proof and try it out before buying. With our system, they can try the IP but cannot download it. They can run simulations, check out the timing, or the power before committing to the purchase.”
Areas adjacent to EDA are also seeing interest in the cloud. Darrell Teegarden, mechatronics product manager for SLE SystemVision.com at Mentor, explains that “systemvision.com is a cloud-based electronic design web site that provides capabilities for the design and verification of hardware, software, and mechatronics, including powerful modeling, design, and analysis tools. Reference designs that have already been vetted by a component supplier, along with tools that help select parts and configure these designs, are an engineer’s best friend.”
A question of balance
EDA tools are expensive, and smaller companies may feel they do not get enough utilization from a tool to make the purchase worthwhile. They require significant tool usage during certain parts of their flow, and the rest of the time the licenses sit idle. Larger companies have problems, too, although somewhat different ones. At peak times they may want many more licenses available, but they cannot justify them based on average utilization rates.
Kelf points out that with cloud-based services “there is no cost difference between parallel and serial operation, but parallel is much faster in wall-clock time. With the cloud you essentially have infinite machine capacity, so with the right business model parallel operation is a breeze. This is a huge win for the cloud that we have observed in practice, particularly in the verification space.”
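A back-of-the-envelope illustration of Kelf’s point, using made-up numbers: at a fixed per-core-hour price, total compute cost is identical whether jobs run serially or in parallel, but wall-clock time divides by the machine count.

```python
# Illustrative only: the rate and job sizes are invented numbers.
jobs, hours_per_job, rate = 1000, 2.0, 0.10  # 1,000 jobs, 2 h each, $0.10/core-hour

serial_cost   = jobs * hours_per_job * rate  # one machine, one job at a time
parallel_cost = jobs * hours_per_job * rate  # 1,000 machines, all jobs at once

print(f"cost: ${serial_cost:.2f} either way")  # cost: $200.00 either way
print(f"wall-clock: {jobs * hours_per_job:.0f} h serial vs {hours_per_job:.0f} h parallel")
```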
The IBM cloud service supports a pay-as-you-go model. IBM makes the point that with the cloud service, clients no longer need to purchase EDA tool licenses, new hardware, data center infrastructure or staff to manage on-premise environments. It claims this will provide improved price/performance, offering customers of all sizes more affordable access to EDA tools and decreased cost of designs.
“Smaller companies will see a greater positive impact because they do not have the large IT departments and support teams,” says Chian. “Large companies spend about 2% of their revenue on EDA tools and another 1.5% of revenue on infrastructure. This includes the machines, the CAD group, putting together design flows and scripts, tool integration, etc.” Chian explains that for large companies this is a fixed cost amortized over many designs, while small companies bear the same fixed cost across fewer designs, so for them it is no longer 1.5% of revenue but could be 10% or more. “This is a significant impediment for startups, and we believe the cloud can help provide a solution in this space.”
New markets, such as edge devices, also may be a game changer. “Unlike building an application processor and spending $200 million to $300 million on the device, taking two years, and using the latest 14nm processes, IoT devices are simple, low-cost, use technology two to three nodes behind the leading edge, and are also analog and mixed-signal,” Chian says. “When you put these together you don’t need $100 million to build a chip. The budget is more like $500,000 from beginning to end, including mask costs. That does not allow you to spend a lot on design infrastructure, and it needs to be flexible and pay-per-use rather than a fixed cost.”
Hosting
Two of the big three have offered a version of the cloud to their customers in the past. Synopsys was early with an offering in 2011 that made VCS available through Amazon Web Services, but it has made little noise about this since the early announcement, suggesting a lack of interest in the solution.
Cadence may have been a little more successful with its Hosted Design Services. Cadence’s Ready describes it as being “for smaller companies, or individual groups within a large company, allowing a customer to leave the IT and tool configurations to Cadence and focus on their design work.”
It would seem that solutions hosted by the EDA companies themselves are unlikely to gain much traction, given that very few companies buy all of their tools from one vendor. However, the experience may give the EDA vendors the confidence to allow independent companies to host complete flows.
There is still reluctance in the industry on both sides of the fence, but given the additional investment in the idea and increasing talk about the issues, we may be close to the point where the cloud becomes a reality for a new generation of semiconductor design companies.
Are EDA and semiconductor companies missing something? The rest of the industry was ready to pony up $13B a year for cloud services as of 12 months ago, with a growth rate approaching 45%, and IBM reported $7B in revenue from cloud services in 2014, up 60% from the prior year. IBM’s cloud business alone is approaching the size of the total EDA market. Maybe IBM knows what it is doing by offering its EDA solutions on its own platform, but the offering is still not a complete flow, and IBM has tried making its tools available in the past without success.
Jai Iyer, chief executive officer of SiCAD, paints a very rosy future. “A time-based usage model on a need basis makes sense for this industry and will spur innovation in the industry while lowering capital and operations expenses.”