EDA’s Hedge Plays

Torn between more complexity and business at older nodes, the industry is subtly changing course.


While 14/16nm process technologies with finFETs and double patterning have pushed complexity to new heights, the move to 10nm will fundamentally change a number of very basic elements of the design-through-manufacturing flow—and EDA vendors will be caught in the middle, forced to make hard choices between foundries, processes, packaging approaches, and potentially which markets to serve.

For EDA vendors, this presents both opportunities and risks, and strategies that have begun unfolding over the past year reflect these changes. On one hand, tools are being scaled for better performance using multiple cores, multithreading and better integration with other tools. On the other hand, they are being retrofitted to do more at older nodes. And in conjunction with these changes, EDA vendors with deep enough pockets are expanding into multiple adjacent and not-so-adjacent markets as a future growth strategy and a hedge against disruption in any particular segment.

From an R&D perspective, this is almost imperative. Between 2011 and 2013—one tick on the Moore’s Law road map—R&D as a percentage of revenue increased for all of the Big Three EDA companies. For Synopsys, it rose to 34% from 32%; for Mentor Graphics, it increased to 36% from 31%; and for Cadence it increased to 37% from 35%. In aggregate, the three companies last year spent $1.55 billion on R&D, according to public documents. Private tools companies also report increases, although statistics are not public.

Hedging strategies
For the big EDA companies, what’s becoming critical is the ability to continue scaling as the problems become more difficult. At the very least, this requires parallelism and the ability to take advantage of multiple cores or more powerful hardware, which is why all of the Big Three EDA vendors now offer emulation in addition to simulation. But they also are expanding their capacity to include more gates per cycle using older tools.

Consider the most recent announcement by Cadence, which ties its less-expensive FPGA-based rapid prototyping platform to work in conjunction with its more expensive emulation platform.

“We’ve been working on improving a suite of engines,” said Frank Schirrmeister, group director of product marketing of the System Development Suite at Cadence. “What we’re doing is system-level validation with more and more software earlier. This isn’t just parallelism. The choke point is how to get RTL clean and to know which cases and corners need to be clean. This is another way of giving users the ability to run more cycles to address that.”

This is more than just a pitch to the most advanced process nodes, though. With increasing interest in using established process nodes, EDA vendors are positioning themselves as providers of tools that can increase reliability with simpler software testing at older nodes.

“A lot of people are not going in that direction anymore,” said Kurt Shuler, vice president of marketing at Arteris. “They’re doing logic on a legacy process and making it higher-performing. If you look at an x86 processor, there are gates in there that were designed in the early 1980s. And with phone modems, some stuff dates back to the early 1990s. There are better ways to do that now with better functionality. This is like refactoring software to be smaller, run faster and use less power. The same will happen with hardware.”

The best processes
For EDA vendors, this is only good news because it’s getting very expensive to keep up with the latest process technology. Unlike in the past, when processes were relatively generic, developing a new process is now a tight-knit effort between EDA vendors, foundries and large customers. One EDA vendor said it had more than 20 different programs under way with a leading foundry, including 3D-ICs.

But it’s also taking longer to develop those processes, which means there are more iterations per process—it may start on version 0.1 rather than version 1.0. That significantly increases the R&D costs. Moreover, what gets developed for one foundry is not the same as what gets developed for another foundry.

In addition, part of what has made EDA tools so effective in the past is that they are used on so many different types of designs that it’s easier for the EDA vendors to pinpoint problems, quickly fix them, and add more innovations in future releases.

“The big impact on EDA is that there are fewer producers, bigger wafers, and not as much diversity,” said Wally Rhines, chairman and CEO of Mentor Graphics. “As an industry you want small and midsize foundries and a more diverse industry. That provides more opportunities for EDA.”

Rolling back to established nodes provides exactly that kind of diversity, and it dovetails with the burgeoning Internet of Things as well as the push toward 2.5D and 3D stacked die. And as chipmakers begin improving designs at older nodes, that will only increase the demand for existing tools and new innovation at well-tested process geometries where yield is high, development costs are relatively low, and the cost of retrofitting tools for those nodes is a fraction of the cost of R&D at the most advanced nodes.

That doesn’t mean work won’t continue at the most advanced nodes. But that work also has to be extensible backward these days.

“We still have to do stuff for 10nm and 7nm, and we’re working closely with the foundries on that,” said KT Moore, senior group director for Cadence’s Digital and Signoff Group. “But this is not a one-time thing. We have to continue to bring performance at the right place. We have to improve signoff on all corners and modes, and we need to improve signoff even on older nodes. With better performance and capacity you can do more testing in the same amount of time.”

Synopsys’ rollout earlier this year of a new version of its GDSII place and route tool is another case in point. Chairman and co-CEO Aart de Geus called it the most significant development effort in the company’s history because it improved throughput across all process nodes by a factor of 10 compared with the previous generation of the tool—which Synopsys continues to sell.

New paths
But it’s also clear that no one is banking just on existing markets to continue their top-line growth. Synopsys bought Coverity earlier this year, jumping into software development tools. Cadence has been ramping up its IP business and, with the acquisition of Jasper Design Automation, positioning itself to provide security tools for the Internet of Things. And Mentor Graphics, which has always been the most diverse, is pushing heavily into software, computational fluid dynamics and automotive wiring. Likewise, ANSYS’ purchase of Apache Design added a distinctly semiconductor-focused tools capability to its mechanical engineering expertise.

From a portfolio perspective, all of these moves are well outside of the classic EDA relationships. But as the industry continues to shift, and more parts and markets intersect with EDA, there is a potentially huge opportunity that offsets the risk of banking everything on the continuation of Moore’s Law. It also potentially builds enough momentum—and money in the bank—for EDA to invest in closing up some of the more glaring gaps in what tools can do today. EDA vendors have received high praise for keeping their existing tools current, but they have also been criticized for not investing enough in areas such as pathfinding.

“There’s a lot of room for improvement in ensuring design intent of the architecture all the way to the polygons,” said Mike Gianfagna, vice president of marketing at eSilicon. “Then if you move to the architectural level, there’s room for both new standards and an appropriate level of accuracy. Those are both opportunities, and moving up the stack drives opportunity for a lot of companies.”

He said verification and place and route have received a lot of attention—and made huge improvements—but tools still need to go vertical in the software stack to manage designs hierarchically.