What Are EDA’s Big Three Thinking?

Messages delivered by all three CEOs to their respective users point to trouble spots, opportunities and what’s ahead for EDA.


Over the past six weeks, the CEOs of Cadence, Synopsys and Mentor Graphics—in that order—have delivered top-down visionary messages to their user groups. Semiconductor Engineering had the opportunity to attend all three sessions, and has compiled comments from each on a variety of subjects. In some cases, all the CEOs were in sync. In others, they were not. In still others, it was difficult to tell because they approach the issues from different angles. But there were some common themes.

Design Cost
Perhaps the biggest divergence came in how to tackle the rising cost issue for semiconductor design.

Lip-Bu Tan, president and CEO of Cadence, viewed the solution as better methodology and better tools, including ways to verify software and reduce time to market—and good business practices. “For first-time success, we need a lot of innovation,” said Tan. “We need smart engineering.”

Tan also said “a lot of innovation” is needed in areas such as verification, hardware-software co-design and in proving that IP will work. He said that with discipline, good methodology and smart engineering, the cost of a new SoC at 16/14nm may be as little as $15 million, not the $250 million to $300 million figure floated in some market reports.

Aart de Geus, chairman and co-CEO of Synopsys, was the most bullish on progress to advanced nodes. “FinFETs are absolutely unstoppable at this point,” he said. “There are some questions about whether it will be at a lower cost. It will be at a somewhat lower cost, but not as much as in the past.”

De Geus said that 60% to 70% of the design cycle will be devoted to verification, debug and hardware-software prototyping. But he also noted that the feature-shrink roadmap is clear down to the 7nm node, which he said is technically possible and which will still offer some savings to companies that migrate down to that node—although that is unlikely to happen with 450mm wafer sizes, which could further offset costs.

Wally Rhines, chairman and CEO of Mentor Graphics, offered a different take on the market and cost. While he said some companies will continue shrinking features, that isn’t the most efficient approach after 28nm. He said the only way to achieve significant cost reduction after that is through innovation. That involves new technologies, new packaging and entirely new methodologies, terms that all three CEOs used in their speeches—but there are subtle variations in how each of the CEOs applies those terms.

“If you look at what Samsung is doing with NVM, they’re doubling the number of die in a stack. We’re not far down the learning curve on that, but it’s one way to reduce the cost. You can save on the packaging cost, even though you add some back in, but you don’t have to shrink features. A second way is to integrate photonics and MEMS and get cost per function down. There are also lots of new approaches for switches, with spintronics, new memory technology on the way, nanotubes, and biological approaches…There is lots and lots of innovation for potentially decreasing the cost per effective transistor well into the future.”

Cadence’s Tan said power is one of the better problems to contend with because there are a lot of ways to address it. Synopsys’ de Geus said of the big challenges—power, performance, area and yield—power is “the most restrictive.” But he also agreed with Tan on this point: “The number of low-power techniques is astounding. You can change the voltage, turn off different blocks, and add many circuit technologies to understand when the temporary voltage is off because it doesn’t necessarily come back the way you shut it off.”

And Mentor’s Rhines said the solution requires a bigger shift than that over the long term. One possibility he suggested: biotechnology. DNA storage potentially can offer three orders of magnitude improvement in performance and nine orders of magnitude power reduction.

All CEOs think big when it comes to business. That’s their job. They have to understand the market for their tools and where the opportunities and risks are now and where they will be in the future. The CEOs of the Big Three EDA companies see that from a global perspective. That makes sense because semiconductors are a horizontal mainstay in multiple vertical markets and across many different regions.

Tan broke the markets into the cloud, which he says will be $21 billion in 2018 (with a 25% CAGR); the Internet of Things, which he says will be $8.9 billion by 2018 (with a 35.4% CAGR); communications, $100 billion (9.3% CAGR); computers (7.8% CAGR); consumer, $54 billion (7.6% CAGR); and automotive, $24 billion (7.2% CAGR).
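For readers less familiar with how these CAGR figures compound, a minimal sketch follows. The base value and growth rate come from Tan’s IoT numbers above; the three-year projection horizon is purely illustrative, not a forecast from any of the CEOs.

```python
# Illustrative compound-annual-growth-rate (CAGR) projection,
# using Tan's IoT figure from the article: $8.9B in 2018 at a 35.4% CAGR.
def project(base_billions: float, cagr: float, years: int) -> float:
    """Project a market size forward by compounding annual growth."""
    return base_billions * (1 + cagr) ** years

# e.g. the IoT market three years beyond 2018 at the quoted rate:
print(round(project(8.9, 0.354, 3), 1))  # roughly $22.1B
```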

That’s good news for the tools business, because verification grows exponentially, Tan said.

De Geus looked at business from the standpoint of semiconductors. He said the fabs represent about $40 billion in annual revenue; equipment, another $40 billion; semiconductors, $300 billion; applications, “zillions” of dollars; and EDA/IP, $7 billion. He also said the PC and server markets are $72 billion; mobile is $79 billion; and automotive is $20 billion.

Rhines, meanwhile, looked at the market from the standpoint of EDA/IP, which he said was $6.5 billion, or roughly 2% of the semiconductor industry revenue, a rate that has remained constant for decades. He also plotted graphs to show that the assembly, ATE, foundries, semiconductor companies and the companies that buy semiconductors were on consistent cycles. His conclusion is that no one is getting fatter off technology, and reducing gross margins is not possible anywhere in the supply chain—regardless of the growth prospects of individual markets.
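Rhines’ 2% figure can be sanity-checked against de Geus’ numbers in the preceding paragraph. This is only a back-of-envelope check using the revenue figures quoted in this article, not independent market data.

```python
# Back-of-envelope check of Rhines' claim that EDA/IP revenue is
# roughly 2% of semiconductor industry revenue, using figures
# quoted in the article (billions of USD).
eda_ip = 6.5          # Rhines' EDA/IP revenue figure
semiconductors = 300  # de Geus' semiconductor market figure

share = eda_ip / semiconductors
print(f"EDA/IP share of semiconductor revenue: {share:.1%}")  # about 2.2%
```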

Moore’s Law
Possibly the most contentious issue raised by any of the big three CEOs—and it was touched on by all of them—involved Moore’s Law. Cadence’s Tan acquiesced to its continuation, although he stressed that system-in-package and stacked die will play an increasingly important role going forward. Synopsys’ de Geus, meanwhile, sees at least another six years of shrinking features.

Mentor’s Rhines, however, said that Moore’s Law is merely a subset of the learning curve plot that has been used by many industries for more than a century. He said that it worked for the semiconductor industry only so long as innovation made it possible, such as the move to static RAM from bipolar and from vacuum tubes to discrete transistors.

“We will stay on the learning curve, with bio switches and carbon nanotubes,” said Rhines. “But shrinking feature sizes won’t be the best way to get there.”

So what was the takeaway message from the leaders of the largest EDA tools vendors?

For Tan, the key points were ecosystems working together, the need for first-time silicon success, and new approaches such as 3D-IC and hybrid verification involving rapid prototyping, emulation and virtual system platforms.

For de Geus, the keys were IP reuse, the Internet of Things opportunity, and the need for tools to make all of this possible.

For Rhines, the message was to think differently about the future and approach it from many different angles, because something will have to be done to offset the cost of multipatterning and the rising cost per transistor—and continue driving semiconductor sales at the rate to which we’ve become accustomed.


Gary Hillman says:

I am most impressed by Rhines’ desire and ability to think new thoughts to solve continuing problems. The solutions of tomorrow will not follow the logic of yesterday.

MrChip says:

The problem with being a public EDA company is that you get an inherent conflict of interest between your shareholders and your customers. Is an EDA company’s job to maximize its own profits or to maximize its customers’ profits? The two are not the same thing.

Case in point: in maximizing my shareholders’ profits, my goal is to design the tool that minimizes my development costs and maximizes my profit. That means I do not add features or improve performance that, while they may help my customers, do not add value to my shareholders. Once you hit that break-even point, you need to look at redeploying your resources to places where they add more value to your shareholders.

For example, say I have a good functional simulator but no formal verification tool. Even though other formal verification tools exist on the market and are good, to maximize my shareholders’ value I should take resources off my simulator (as long as I maintain margins and market share) and put them on a tool the industry might not need, but that brings in incremental revenue and profits to my company.

Look at many of the bigger companies and you’ll see some very strong products alongside a lot of “me too” tools: tools that bring in revenue to that company and fill out a product portfolio, but that don’t really add significant value to the industry. If they killed off their “me too” efforts and put those resources back into truly innovating in their industry-leading tools, that would hurt their revenue but would genuinely advance the industry.

I agree the industry needs innovation right now. Unfortunately, academia has scaled back research on EDA in favor of the quick “app” or “big data” research buck. Likewise, EDA companies seem more focused on their own short- to medium-term profits than on the long-term health of their industry. (And yes, I’ll throw it back at the semi companies as well, who want to nickel-and-dime EDA vendors to death and make it hard at times to sustain profitable businesses on some accounts.)

garydpdx says:

Two interesting points. Rhines: “… EDA/IP, which he said was $6.5 billion, or roughly 2% of the semiconductor industry revenue, a rate that has remained constant for decades.” And Tan: “… verification grows exponentially.” So there is a limit to pushing RTL verification, and a limit to what customers can pay while EDA must find ways to maintain cash flow.

If we are constrained to 2% then that means a ‘shift to the left’ (latest buzz term) towards ESL design and verification in order to improve the RTL and then down to the chip. And at a system level, we will need true hw/sw co-design that can quickly reallocate functionality between hardware and software.

