Gap Vs. Gap

Vendors and chipmakers agree there are gaps between automation tools, but they don’t necessarily agree on what needs to be fixed.

By Ed Sperling
Among tools vendors it’s been standard practice to listen closely to customers but not deliver everything they ask for—or at least not always on the customers’ timetable.

This strategy has worked well enough for both sides in the past, but at 20nm and in stacked die configurations the level of tension between these two worlds is increasing, and the gaps in the tool chain are becoming more noticeable. Part of the problem is that skyrocketing complexity is forcing more automation, but integration issues, physical effects, process variation and the realities of physics make it more difficult and time-consuming to develop tools that make that complexity manageable. R&D budgets at EDA companies are already hovering around 30% of revenue or more, compared with average R&D investments of about 10% to 20% in other areas of chip development and manufacturing, and betting on the wrong area can have a significant impact on an EDA company’s earnings.

The other part of the problem is that chipmakers’ own internal tools are running out of steam at advanced nodes, both because of the need to bridge hardware and software design environments and because old ways of doing things are too slow and often ineffective. This is clearly reflected in the fortunes of EDA tools vendors, which have been rising steadily for the past couple of years, with the strongest growth in areas such as ESL, including hardware-software co-design and software prototyping, and emulation.

Add in stacking of die, in both 2.5D and 3D configurations, and the number of issues that have to be dealt with by both chipmakers and tools vendors increases by orders of magnitude. On top of that there are double patterning issues at 20nm, finFETs at 14nm, and potentially 450mm wafers, which will require significantly higher yields to be cost-effective but which may be harder to test in wafer-on-wafer or die-on-wafer configurations.

Where chipmakers see challenges
Riko Radojcic, director of design for silicon initiatives at Qualcomm, breaks the design process down into three areas—design authoring, which is the actual chip design; pathfinding, which is exploration of how best to build a chip; and tech tuning, which is physical space exploration. Most of the EDA tools have been effective in the design-authoring phase, with some point tools now finding their way into the pathfinding area. But the real challenges are in the tech-tuning area.

Mechanical stress becomes a serious issue in 3D stacks, ranging from the effects of TSVs to die alignment. “You cannot solve the problems with a tool. They have to be solved in a flow,” he said. “It’s a debug nightmare. You need a separate domain that takes external stresses and produces a set of rules. You also need hotspot checking to make sure you have caught all of the interactions.”
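Radojcic's point about needing rules and flows rather than a single tool can be illustrated with a toy example. The Python sketch below flags cells placed inside an assumed stress keep-out radius around TSVs; the radius, data structures and instance names are invented for illustration and are not an actual foundry rule deck or Qualcomm's flow.

```python
# Minimal sketch of a rule-based stress "hotspot" check around TSVs.
# The keep-out radius and the example placements are illustrative
# assumptions, not real design rules.
from dataclasses import dataclass

@dataclass
class Placement:
    name: str
    x: float  # microns
    y: float  # microns

def stress_violations(tsvs, cells, keepout_um=10.0):
    """Flag cells placed inside the stress keep-out zone of any TSV."""
    violations = []
    for cell in cells:
        for tsv in tsvs:
            dist = ((cell.x - tsv.x) ** 2 + (cell.y - tsv.y) ** 2) ** 0.5
            if dist < keepout_um:
                violations.append((cell.name, tsv.name, round(dist, 2)))
    return violations

tsvs = [Placement("TSV_0", 50.0, 50.0)]
cells = [Placement("sense_amp_3", 55.0, 52.0), Placement("inv_x4_17", 120.0, 80.0)]
print(stress_violations(tsvs, cells))  # -> [('sense_amp_3', 'TSV_0', 5.39)]
```

In a real flow this kind of check would sit alongside extraction of external stresses, as Radojcic describes, rather than being a standalone script.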

Also missing, he said, are EDA tools that understand the materials inside a stacked die, and a standard PDK for all mechanical and thermal properties.

“Thermal is the next frontier,” he said during a presentation at the Electronic Design Process Symposium this month. “You need to manage for hotspots and overall system power. On a global level you have skin temperature and overall system power. On a local level you have to manage hot spots, junction temperatures and power density. And there is also the compounding factor that all advanced systems use some form of thermal management. We need a system-chip co-design methodology and tools to deal with this. We cannot solve thermal issues only at the component level. It must be system and component, and we will need tools for pathfinding thermal issues. We don’t even know where to put our thermal sensors. We need thermally aware floor planning.”
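As a rough illustration of what thermally aware floorplanning has to consider, the sketch below bins block power into a coarse grid and flags tiles whose power density exceeds a budget. The grid size, block list and the 0.5 W/mm² budget are invented numbers, not Qualcomm figures; a real flow would feed a thermal solver rather than a simple density check.

```python
# Minimal sketch of a thermally aware floorplanning check: bin block power
# into a coarse grid and flag tiles whose power density exceeds a budget.
# Die size, grid, block data and the budget are illustrative assumptions.
from collections import defaultdict

def power_density_hotspots(blocks, die_mm=10.0, tiles=5, budget_w_per_mm2=0.5):
    tile_mm = die_mm / tiles
    tile_power = defaultdict(float)
    for name, x_mm, y_mm, watts in blocks:
        tile = (int(x_mm // tile_mm), int(y_mm // tile_mm))
        tile_power[tile] += watts
    tile_area = tile_mm * tile_mm
    return {t: p / tile_area for t, p in tile_power.items()
            if p / tile_area > budget_w_per_mm2}

blocks = [
    ("cpu_cluster", 1.0, 1.0, 3.0),   # (name, x in mm, y in mm, power in watts)
    ("gpu",         1.5, 1.5, 2.5),   # lands in the same 2x2 mm tile as the CPU
    ("modem",       8.0, 8.0, 1.0),
]
print(power_density_hotspots(blocks))  # -> {(0, 0): 1.375}
```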

Summing it up, he said it amounts to 3D-aware co-design tools for package, system and thermal, and a flow to integrate everything.

A stacked die of the future; memory on top left, with logic/memory in middle on top and I/O and analog RF blocks on top right. All feed into interposer stack in the middle. Source: Qualcomm.

Altera, which has just developed its first 2.5D stacked FPGA prototype, is encountering similar thermal and mechanical issues. While the company continues to offer scaled-down tools for FPGA development, it needs the most advanced tooling for creating those FPGAs in the first place. Topping the company’s list are robust standards for cells, IP and stacked ICs, as well as tools to help quickly identify some of the problems that Altera encountered while developing its prototype stacked die.

Signal integrity issues encountered and addressed by Altera in stacked die using interposers.

“We’re looking for more of a divide and conquer strategy,” said Arif Rahman, product architect at Altera. “Die stacking will be an enabler for a complete solution in the future, but it will not just be an FPGA. It will be an FPGA plus other accompanying functions.”

Where EDA tools vendors see challenges
For the tools vendors, the list of problems that need to be solved is exploding. So much needs to be fixed that it is imperative to focus on what will have the most impact and what will provide the greatest long-term returns.

At least part of that effort involves existing tools, which have to run faster and do more than in the past. This is particularly true in areas such as emulation, which in the past was used almost exclusively for hardware. Emulators are now becoming the tool of choice for software verification because the complexity of the software makes it far too slow to run in simulation. What takes hours or days in simulation can be measured in seconds in emulation, as the rough numbers below illustrate. And given that verification is still the lion’s share of the NRE, anything that can be done to solve this problem is considered a big win.
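To put rough numbers on that gap, the back-of-envelope sketch below compares an assumed 1 kHz effective RTL simulation rate against an assumed 1 MHz emulation rate for the same software test. The figures are illustrative round numbers, not vendor benchmarks.

```python
# Back-of-envelope illustration of the simulation-vs-emulation gap.
# Cycle rates and workload size are assumed round numbers.
sim_hz = 1_000             # assumed effective RTL simulation rate, cycles/second
emu_hz = 1_000_000         # assumed emulation rate, cycles/second
test_cycles = 100_000_000  # assumed software test length in target clock cycles

print(f"simulation: {test_cycles / sim_hz / 3600:.1f} hours")  # 27.8 hours
print(f"emulation:  {test_cycles / emu_hz:.0f} seconds")       # 100 seconds
```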

Mentor Graphics’ announcement this week of enhancements to its emulation tools is a case in point. Recognizing that software engineers are using the emulation tools as much as the hardware engineers, the company has added a virtualization layer that allows a workstation to serve as the front end—matching the way software engineers work—rather than requiring engineers to work in a lab, the way hardware engineers typically do.

“This allows one workstation per user,” said Jim Kenney, director of marketing for Mentor’s Emulation Division. “We’ve also been working to improve performance and capacity so you have more robust software execution and debug.”

Mentor isn’t alone in this quest. Cadence has been updating its own emulation platform, and all the EDA vendors have been racing to improve the reach and integration of their tools. Bassilios Petrakis, product marketing director at Cadence, noted that building smaller die that yield better is still a challenge that needs to be solved—particularly before stacking becomes mainstream.

“When you look at multiple die with TSVs, the cons are that the ecosystem is still emerging, there is no volume production yet and there are thermal issues,” Petrakis said.

Samta Bansal, senior product marketing manager for SoC Realization at Cadence, predicts that stacking memory on logic using an interposer will become mainstream beginning in 2013 to 2014, with TSVs becoming mainstream by 2015. She said EDA work typically needs to begin three to four years ahead of such transitions, noting that it began in earnest at Cadence in 2009. Synopsys rolled out its 2.5D tool flow last month and is working on a full 3D flow, and Mentor has been working on a variety of areas ranging from test to modeling of stacked die over the past several years.

But EDA vendors also need to pick new areas for the future, and this is where even the best educated guesses become difficult.

“EDA traditionally has been an industry where big companies acquire small companies doing interesting things,” said Wally Rhines, chairman and CEO of Mentor, noting that markets that have shown strong growth include DFM, formal verification, ESL and power analysis. He said the next wave of electrical design challenges includes low-power design at higher levels of abstraction, optimizing embedded software for power, in-circuit emulation, design for test, physical verification, stacked die verification, and system design that extends beyond the PCB.

That concern is echoed by Drew Wingard, chief technology officer at Sonics: “From an EDA perspective, the next layer up in the power hierarchy is how we convince ourselves that the hardware and software are working together correctly. This is a different protocol check than we normally do. You have dependencies, because you can’t turn off one until another turns off. The mix of hardware and software makes it difficult to prove what’s correct. Right now there is not even enough time to test the power management until the second spin.”

He noted that just trying to get software to turn on the power management features in a chip is a challenge. “The thermal/power reduction to be gained by turning on features already in a chip can be significant.”
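A simplified version of the kind of protocol check Wingard describes might look like the sketch below: given power-domain dependencies, verify that a proposed shutdown sequence never turns off a domain while something that depends on it is still powered. The domain names, dependencies and candidate orderings are invented for illustration, not Sonics’ methodology.

```python
# Minimal sketch of a power-sequencing dependency check. Domains,
# dependencies and shutdown orders are illustrative assumptions.
def check_shutdown_order(order, depends_on):
    """A domain may only be powered down after every domain that
    depends on it has already been powered down."""
    powered_down = set()
    errors = []
    for domain in order:
        still_on = [d for d, needs in depends_on.items()
                    if domain in needs and d not in powered_down]
        if still_on:
            errors.append(f"{domain} turned off while {still_on} still depend on it")
        powered_down.add(domain)
    return errors

# 'cpu' and 'dsp' both depend on the interconnect; the interconnect
# depends on an always-on domain.
depends_on = {"cpu": {"interconnect"}, "dsp": {"interconnect"}, "interconnect": {"always_on"}}

print(check_shutdown_order(["cpu", "dsp", "interconnect"], depends_on))  # -> []
print(check_shutdown_order(["interconnect", "cpu", "dsp"], depends_on))  # -> one violation
```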

One issue that almost certainly needs attention is derivative designs. Getting them out the door is painful, expensive, and time-consuming.

“A lot of engineering that’s being done is derivative engineering,” said Naveed Sherwani, president and CEO of Open-Silicon. “This is not something that EDA vendors focus on, but it’s something that’s definitely needed. What’s out there is a kluge of methodologies and flows. EDA so far has not woken up to this opportunity. They certainly listen to their customers, but they’re still not close enough. You have to do the work to understand it, and the revisions and changes that are needed are painful. A derivative is almost like a new project. There can be 1 million degrees of improvement here.”

Conclusions
All of this requires tools: more and new capabilities built into existing tools, as well as new tools that can integrate all of these pieces. But deciding what gets addressed first is a difficult balancing act.

While chipmakers at the leading edge are used to developing some of their own tools and methodologies, and to dealing with poor yields, that internal development is running out of steam. Moving forward at advanced nodes and in stacked configurations will require entirely new tools and methodologies—an enormous expense by anyone’s calculations.

Qualcomm's proposed tech-tuning flow.

EDA vendors, meanwhile, have their work cut out for them just updating their existing tools, and they are cautious about massive investments in new areas that may not return dividends within an appropriate time frame—or at all, given the still-immature supply chain for stacked die.

“To get ROI back on tools of this complexity you need more than 20 customers,” said Mike Gianfagna, vice president of marketing at Atrenta. “That means you’re going to be negative on that investment for three or four years. So you really have to pick your battles, and small companies probably can’t do this at all.”
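Gianfagna’s arithmetic can be made concrete with a toy break-even model. Every figure below (development cost, license price and customer ramp) is invented purely to show why the cash flow stays negative for the first few years.

```python
# Toy break-even model for a new tool investment; all dollar figures
# and the customer ramp are invented for illustration only.
def years_to_breakeven(dev_cost, license_per_customer, customers_by_year):
    cumulative = -dev_cost
    for year, customers in enumerate(customers_by_year, start=1):
        cumulative += customers * license_per_customer
        if cumulative >= 0:
            return year, cumulative
    return None, cumulative

# $30M development, $500K/year per customer, slow ramp to ~25 customers.
year, cash = years_to_breakeven(30_000_000, 500_000, [5, 10, 20, 25, 25])
print(year, cash)  # -> 4 0  (cash flow is negative for the first three years)
```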

Gianfagna noted that for chipmakers the challenge is too many options. “You need a way to prune the solution space fast. You have to figure out quickly which architectures to choose and which roads to pursue further. The real gap is not in the tech tuning. It’s coming up with the right architecture that supports meaningful decision-making.”

The question now is when the gaps that each side sees will merge, and when it will become profitable enough to take an investment risk.


