Boosting Yield With Layout Awareness

Combining physical design data with test diagnostics engines may be the light at the end of the tunnel, making it far easier to pinpoint defects.


By Ann Steffora Mutschler
Yield. Just the word can make many engineers cringe and hide in their cubicles—especially with manufacturing problems and excessive power during test increasing causing failures. But the combination of physical data with diagnostics engines may be the light at the end of the tunnel, allowing for easier pinpointing of defects.

There are many reasons why a chip fails on the tester, and most fall into two areas. One includes manufacturing problems, explained Bassilios Petrakis, director of product marketing for front-end products at Cadence. “You can think of a case where the wires are not manufactured ideally, so sometimes they might touch, creating a short. Or sometimes the wires are thinner than they should be, which causes more resistive-type behavior in the wire, or they get disconnected. There are tons of defects that are either known, or new ones that come up with new processes.”

Another class of problems is due to excessive power when the chip is tested. “What tends to happen is people design a chip for tolerances when it’s being used in the normal function—in a cell phone or whatever it is—but you also have to consider what happens when you test the chip. We’ve seen failures on the tester. Customers tell us that. And after some work, most people tend to think they are manufacturing defects. A lot of them actually occur because there’s way too much switching activity on the chip that wouldn’t happen in functional mode,” he said, which can cause a chip to fail the test, or even damage the chip itself.
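
The switching-activity concern is easy to see in miniature. The sketch below is illustrative only, not how any ATPG tool measures power: the patterns, the adjacent-bit toggle approximation, and the 0.25 budget are all made up. It simply flags scan patterns whose estimated shift activity exceeds a power budget, and the constantly toggling pattern is exactly the kind of vector that switches far more of the chip than functional mode ever would.

```python
# Minimal sketch (illustrative only): flag scan patterns whose estimated
# switching activity exceeds a power budget. Real power-aware ATPG uses far
# more detailed models; here toggle rate is approximated by the fraction of
# adjacent bit transitions in the scan-in data. The patterns and the 0.25
# budget below are hypothetical.

def toggle_rate(scan_bits: str) -> float:
    """Fraction of adjacent bit pairs that differ while shifting."""
    if len(scan_bits) < 2:
        return 0.0
    transitions = sum(a != b for a, b in zip(scan_bits, scan_bits[1:]))
    return transitions / (len(scan_bits) - 1)

def over_budget(patterns, budget=0.25):
    """Return indices of patterns likely to switch more than the budget."""
    return [i for i, p in enumerate(patterns) if toggle_rate(p) > budget]

patterns = [
    "0000111100001111",   # moderate shifting activity, under the budget
    "0101010101010101",   # worst case: every bit toggles
    "0000000011111111",   # low activity
]
print(over_budget(patterns))
# -> [1]: only the constantly toggling pattern exceeds a 0.25 budget.
```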

These issues traditionally have been addressed by the diagnostics tools associated with ATPG, but thinking in this area has expanded. Robert Ruiz, senior product line manager for test automation at Synopsys, noted that diagnostics has been around as long as ATPG because they are correlated technologies, but efforts have been underway to improve the accuracy of layout-aware/physical diagnostics tools by pulling in physical information to cut down the search space. “Older diagnostic technology looks at a wire as one node. Everything is connected to it and it can’t pinpoint further. Physical diagnostics understands that this is not one single continuous wire but that it fans out—there are branches on it. When I think of branches, I think about a defect on that tree. You narrow that tree down among several trees, but if you have more physical information about the tree you know if a defect is along one branch.”
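
To make the branch idea concrete, here is a minimal sketch, assuming hypothetical Net and Branch structures and made-up instance names, of how layout information narrows a suspect from an entire net down to the branch that feeds only the failing observation points. It illustrates the concept, not any vendor’s diagnosis algorithm.

```python
# Minimal sketch (not any vendor's algorithm): narrowing a suspect net
# to the branch segments consistent with which fanout sinks saw failures.
# Net, Branch, and the example data below are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Branch:
    name: str                                 # routed piece of the net
    sinks: set = field(default_factory=set)   # fanout endpoints fed by this piece

@dataclass
class Net:
    name: str
    branches: list

def suspect_branches(net, failing_sinks, passing_sinks):
    """Keep branches that feed every failing sink and no passing sink.

    Logic-only diagnosis would stop at `net`; with layout data we can
    rank individual branches instead of the whole routed tree.
    """
    candidates = []
    for br in net.branches:
        feeds_all_failures = failing_sinks <= br.sinks
        feeds_a_pass = bool(br.sinks & passing_sinks)
        if feeds_all_failures and not feeds_a_pass:
            candidates.append(br.name)
    return candidates or [net.name]   # fall back to the whole net

# Hypothetical example: net N1 fans out to three flops via two branches.
n1 = Net("N1", [
    Branch("N1.trunk",    {"FF1/D", "FF2/D", "FF3/D"}),
    Branch("N1.branch_a", {"FF1/D"}),
    Branch("N1.branch_b", {"FF2/D", "FF3/D"}),
])

print(suspect_branches(n1, failing_sinks={"FF2/D", "FF3/D"},
                       passing_sinks={"FF1/D"}))
# -> ['N1.branch_b']: the defect most likely sits on the branch feeding FF2/FF3.
```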

Physical diagnostics use is on the rise because of the accuracy it promises. Connected to this is the need for infrastructure to support it, which prompted Synopsys to add the ability to read in LEF/DEF so the tool can become layout-aware when it runs diagnostics, he said.
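
Where does that branch information come from? In simplified form, from the routed-net records in DEF. The toy reader below handles only the stripped-down record shown, nothing like full DEF syntax, and the net, cell, and layer names are invented; it is meant only to show the kind of per-net pin and wire-piece data a layout-aware flow would pull in.

```python
# Minimal sketch (illustrative, not a real DEF reader): pull the routed-wire
# structure of each net out of a simplified DEF NETS section, so the branch
# data assumed in the earlier sketch could come from actual layout.

def parse_def_nets(def_text: str) -> dict:
    """Map net name -> {'pins': [...], 'wires': number of routed pieces}."""
    nets = {}
    # Net records start with '- <name>' and end with ';' inside NETS ... END NETS.
    section = def_text.split("NETS", 1)[1].split("END NETS", 1)[0]
    for record in section.split(";"):
        record = record.strip()
        if not record.startswith("-"):
            continue
        tokens = record.split()
        name = tokens[1]
        pins = []
        i = 2
        while i < len(tokens) and tokens[i] == "(":
            pins.append(f"{tokens[i+1]}/{tokens[i+2]}")
            i += 4                      # skip '( comp pin )'
        # Each ROUTED/NEW keyword introduces one more wire piece (branch).
        wires = record.count("ROUTED") + record.count("NEW")
        nets[name] = {"pins": pins, "wires": wires}
    return nets

sample = """
NETS 1 ;
- N1 ( U1 Z ) ( FF1 D ) ( FF2 D )
  + ROUTED metal2 ( 100 200 ) ( 100 400 )
    NEW metal3 ( 100 400 ) ( 300 400 ) ;
END NETS
"""
print(parse_def_nets(sample))
# -> {'N1': {'pins': ['U1/Z', 'FF1/D', 'FF2/D'], 'wires': 2}}
```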

Alongside traditional diagnosis is volume diagnostics, which brings another dimension of data into the diagnostic tools. “What’s driving this is the fundamental trend of moving away from individual test engineers diagnosing a fairly small number of failing parts to a much more systematic process,” explained Cy Hay, product marketing manager for ATPG and diagnostics at Synopsys. “It’s not really changing the traditional application. It’s an expansion beyond using diagnostics in an almost offline manner, where there is a test engineer sitting in a lab on a tester with a few failing parts that maybe came back as returned merchandise, to a much more systematic application of diagnostics to improve yield. Especially with the types of yield limiters that customers see today, combined with the pressure to not only bring up a new process node but new designs manufactured in very high volumes, very quickly—there’s clearly a lot of pressure there. The traditional techniques that the fabs have used to improve yield don’t always apply or don’t work as efficiently on some of the more advanced process nodes, especially the aggressive and large designs that are being thrown at a new process node.”
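
A rough sense of what “systematic” means here: once every failing die produces a diagnosis report automatically, those reports can be aggregated into a Pareto of suspected yield limiters. The sketch below uses invented report data and feature names; it stands in for what a volume-diagnostics database would do at far larger scale.

```python
# Minimal sketch (hypothetical data, not a vendor flow): volume diagnostics
# in miniature. Each failing die contributes a diagnosis report listing
# suspect features; aggregating many reports turns isolated callouts into
# a Pareto of likely yield limiters.

from collections import Counter

def yield_pareto(reports):
    """Rank suspected defect features by how many failing die call them out."""
    counts = Counter()
    for report in reports:                        # one report per failing die
        for suspect in set(report["suspects"]):   # count each feature once per die
            counts[suspect] += 1
    return counts.most_common()

# Hypothetical diagnosis reports coming off the tester + diagnosis flow.
reports = [
    {"die": "W01-D17", "suspects": ["via12_single", "M2_min_space"]},
    {"die": "W01-D23", "suspects": ["via12_single"]},
    {"die": "W02-D05", "suspects": ["M2_min_space", "via12_single"]},
    {"die": "W02-D41", "suspects": ["fill_density_low"]},
]
for feature, hits in yield_pareto(reports):
    print(f"{feature:18s} {hits} failing die")
# via12_single comes out on top: a systematic, layout-related yield limiter
# rather than random point defects.
```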

New technology, new challenges
With all of this great new technology, there are challenges too. Geir Eide, DFT product marketing manager at Mentor Graphics, pointed out that part of the challenge now is getting people to talk to each other. “People who don’t even know the other guys exist—basically connecting product engineers, failure analysis engineers, test engineers, physical design and library designers—you’re basically giving these guys some technical capabilities that allow them to communicate.”

The problem is that they’re not used to talking to each other even though there is much to be gained by opening up this kind of conversation.

“You’re dealing with the fact that something is beneficial for the product engineer and will make his job easier, but for him to be able to use these capabilities he needs the DFT engineers and the test engineers to enable that,” Eide said. For instance, for diagnosis to work, the data collected on the tester is needed, which means the test engineer has to facilitate that. While that doesn’t necessarily make his job harder, it is something he needs to do to make someone else’s job easier. This means there is a lot more interplay across different organizations, and sometimes even across companies, where test is outsourced and failure analysis might also be outsourced.

“Part of the beauty of these types of tools is we make it possible for multiple organizations to do the analysis so you don’t need to sit in the fab to do the analysis. You don’t need all the fab data to do this. From that point of view, the challenging piece here is that you have to share data across organizations that normally wasn’t shared, and you have to store the data and make it accessible. On the other hand, you can use software to do a lot of work that previously required you to take a lot of physical devices and slice and dice them and kind of do a lot of expensive, time-consuming analysis,” Eide added.

“The test guys don’t usually give a rip about the physical design—it’s just there. What we are trying to do is make it so that they don’t have to know all the rules and things that they are looking for,” stressed Jeff Wilson, Calibre product marketing manager at Mentor Graphics.

During Mentor’s recent User2User conference, GlobalFoundries discussed combining diagnosis results that originate from test data with real silicon data to help validate and identify yield-limiting design features, which ties into DFM tools.

There are a lot of ways things can go wrong. Whether it is a lithography problem, a fill problem or some other issue, being able to determine the root cause of that problem and fix it is the area that probably will bring the biggest benefit to designers, because the test guy shouldn’t have to know everything about the physical design to make things work, he concluded.
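
As a rough illustration of that hand-off between test and physical design, the sketch below uses invented net names, coordinates, and DFM rule names to cross-reference diagnosis callouts against DFM hotspot regions: a suspect net that lands inside a known lithography or fill hotspot points directly at a design-feature root cause.

```python
# Minimal sketch (hypothetical coordinates and rule names): cross-reference
# diagnosis suspect locations against DFM hotspot regions so the test-side
# callouts and the physical-design-side rules point at the same root cause.

def in_box(point, box):
    (x, y), (x1, y1, x2, y2) = point, box
    return x1 <= x <= x2 and y1 <= y <= y2

def correlate(suspects, hotspots):
    """Return (suspect net, DFM rule) pairs whose locations overlap."""
    matches = []
    for net, location in suspects:
        for rule, region in hotspots:
            if in_box(location, region):
                matches.append((net, rule))
    return matches

# Diagnosis callouts: suspect net and (x, y) of the suspect segment (made up).
suspects = [("N1", (120, 380)), ("N7", (900, 50))]
# DFM hotspots: rule name and bounding box (x1, y1, x2, y2), also made up.
hotspots = [("litho_pinch_M2",   (100, 350, 200, 450)),
            ("fill_density_low", (600, 0, 800, 100))]

print(correlate(suspects, hotspots))
# -> [('N1', 'litho_pinch_M2')]: the diagnosed net sits inside a known
#    lithography hotspot, pointing at a design-feature root cause.
```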


