Rethinking Test

Low-power designs can create false failures with existing test approaches, unnecessarily lowering yields.

By Ann Steffora Mutschler

Responsibility for semiconductor test has long sat solely with the test engineer, while the chip designer focused on the functionality of the device. However, particularly in low-power designs, the device consumes much more power under test than during normal functional operation, sometimes causing it to fail.

This ‘false failure’ can lead to unnecessary yield loss on the production line and can require significant time and effort to diagnose, because the extra power applied to the device may incorrectly indicate that a good device is bad.

The goal of the test engineer is to reduce the cost of testing a device. Therefore, they want their automatic test pattern generation (ATPG) tools to generate a lot of activity and exercise as much of the chip as possible at once. As a result, a lot of power is consumed, typically exceeding the functional power budget by 7x to 10x.

This occurs because the chip is designed with a power budget for functional mode. “If you think about the design of a chip, most chips aren’t operating all parts of the chip at the same time, and ATPG doesn’t look at functionality, it just looks at the structure. To minimize the cost, or minimize the patterns, it’s trying to make as much activity happen in the chip as possible in order to test it all simultaneously,” explained Robert Ruiz, senior product marketing manager for test automation products at Synopsys.

In the past, ATPG tools really didn’t need to look at power consumption: the chips were small enough, the power rails were big enough, and low-power designs weren’t yet prevalent. On top of that, compression techniques weren’t yet in use. Their adoption has further exacerbated the problem, because the goal of a low-power design is to minimize switching activity while the goal of compression is to maximize it. This is a very big deal for test engineers, but it is not an issue traditionally highlighted in the design community, given designers’ focus on functionality, even though designers may take partial ownership of how to implement some of the design-for-test solutions.

Ruiz indicated that approximately three years ago the impact of power on test became a big area of Synopsys’ R&D effort, based on feedback from a number of customers. At that time, he said, some customers reported power issues related to test. They did some redesign, which resolved the issues at hand, but they believed it could be a problem again in the future. “It has certainly evolved to the point where most customers say they definitely have found a power issue during test,” Ruiz said.

Test is tricky for low-power designs
Greg Aldrich, director of marketing for the Silicon Test Systems group at Mentor Graphics Corp., said one of the problems in test is how to create test patterns with lower power profiles in terms of what data gets shifted in, a problem dramatically complicated by the use of on-chip compression and on-chip test structures. Previously, test was performed by shifting data into scan chains, issuing the clock cycle, shifting the data out, and comparing it to the golden response data, with the scan chains connected directly to the tester.
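To make that classic flow concrete, here is a toy model of the shift-capture-compare sequence. It is a sketch only: the `capture` function standing in for the circuit’s combinational logic is invented for illustration, not taken from any real design or tool.

```python
# Toy model of the classic scan-test flow: shift in, capture, shift out,
# compare. The capture() logic is an arbitrary stand-in, not a real circuit.

def capture(state):
    """One capture cycle: each flop's next value is computed from its
    neighbors by the (invented) combinational logic."""
    n = len(state)
    return [state[i] ^ state[(i + 1) % n] for i in range(n)]

def scan_test(stimulus, golden):
    """Shift the stimulus into the chain, pulse the capture clock once,
    then shift the response out and compare it on the tester."""
    chain = list(stimulus)           # shift in from the tester
    response = capture(chain)        # one functional clock cycle
    return response == list(golden)  # shift out and compare

stimulus = [1, 0, 1, 1, 0, 0, 1, 0]
golden = capture(stimulus)           # expected response of a good device
print("pass" if scan_test(stimulus, golden) else "fail")
```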

However, most designs today utilize either built-in self-test (BIST) or on-chip/embedded compression, which is still a deterministic process. But instead of the tester shifting data directly into the scan chains, the data passes through a decompressor that sits on chip. The tester shifts data into the decompressor, where it is expanded internally, essentially creating the data on chip, Aldrich explained.
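The general idea can be sketched as a combinational linear decompressor, in which each internal scan chain is fed by an XOR of a few tester channels. The channel and chain counts and the tap choices below are invented for illustration; real embedded-compression hardware uses its own, more elaborate structures.

```python
# Sketch of a combinational linear decompressor: a few tester channels are
# expanded into many internal scan chains through a fixed XOR network.
# Channel/chain counts and tap choices are invented for illustration.

import random

CHANNELS = 4    # bits the tester drives each shift cycle
CHAINS = 16     # internal scan chains fed on chip

random.seed(1)  # fixed taps: each chain XORs two of the tester channels
xor_taps = [random.sample(range(CHANNELS), 2) for _ in range(CHAINS)]

def decompress(channel_bits):
    """Expand one cycle's CHANNELS tester bits into CHAINS chain bits."""
    return [channel_bits[a] ^ channel_bits[b] for a, b in xor_taps]

tester_slice = [1, 0, 1, 1]      # what the tester actually sends this cycle
print(decompress(tester_slice))  # the 16 bits created on chip
```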

What complicates the process is that because the data is created on chip, a new piece of on-chip logic must also be created to manage it, so Mentor invented a new low-power decompressor that allows the designer to control the switching on chip, he said. “It’s not as simple as just changing what’s on the tester. You actually have to change some of the embedded test logic on chip to be able to control that. I think that is going to be primarily how switching activity is going to be controlled during the test—by controlling how the test patterns are created and then how the test patterns are loaded.”

Similarly, Synopsys rolled out an ATPG approach that doesn’t require any hardware or DFT change (which no customer really wants to make), Ruiz said. The company’s TetraMAX tool was enhanced about three years ago to allow the user to dial in a switching-activity budget, which serves as a proxy for power consumption. And if a customer wants to be more aggressive and active in managing power consumption, there are other hardware techniques, including Synopsys’ DFTMAX tool, which inserts the scan chains.

Likewise, Mentor’s Aldrich noted that innovations on both the design side and the test side to help deal with the impact of power on test are “all focused on how to reduce the switching activity during the test. Historically, a lot of that has been done by partitioning the test, and that is still the case, especially as you move to designs that have multiple voltage domains or multiple power islands. Being able to just sequence the tests for each one of those allows you to test a smaller piece of the design. That has some implications on the test time and cost that it takes to test the device, but that’s one approach.”

Mentor has also added more control into its tools over how much of the design switches during the test process. For example, in its ATPG tools, users can specify constraints that indicate how much switching is allowed during each test pattern.
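As an assumed illustration of what such a constraint might measure (not the actual metric or interface of any vendor’s tool), a simple proxy is the fraction of adjacent bits that differ in a scan-load pattern, since each such transition ripples down the chain as the pattern shifts. The budget value below is arbitrary.

```python
# Illustrative switching-activity check: the fraction of adjacent bits that
# differ in a scan-load pattern, compared against a user-dialed budget.
# Metric and threshold are assumptions for this sketch, not a tool's API.

def shift_toggle_rate(pattern):
    """Fraction of adjacent bit pairs that differ; each difference becomes
    a toggle that ripples down the chain during shifting."""
    flips = sum(a != b for a, b in zip(pattern, pattern[1:]))
    return flips / (len(pattern) - 1)

def within_budget(pattern, budget=0.25):
    return shift_toggle_rate(pattern) <= budget

print(shift_toggle_rate([1, 0, 1, 0, 1, 0]))  # 1.0 -- worst-case shift power
print(shift_toggle_rate([1, 1, 1, 0, 0, 0]))  # 0.2 -- well under budget
print(within_budget([1, 1, 1, 0, 0, 0]))      # True
```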

“The more aggressive they are in terms of lowering the amount of switching during the test process, the higher it is in terms of test costs. It’s going to take more test patterns, it’s going to take more compute time to create the test patterns but it is a knob they will have control over now. They really have no other choice other than designing the power structures in the design such that they can handle 50% switching activity—that’s the only other alternative,” Aldrich said.
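The 50% figure reflects random fill: ATPG typically specifies only a few care bits per pattern, and if the remaining don’t-care bits are filled with random values, roughly half of adjacent bit pairs differ. The contrast with adjacent fill, which repeats the last specified value, can be sketched as below; these fill strategies are standard ideas, but the code and names here are illustrative, not any tool’s implementation.

```python
# Random fill vs. adjacent fill of ATPG don't-care ('X') bits. The fill
# strategies are standard ideas; this specific code is an illustration only.

import random

def random_fill(bits):
    """Fill each X with a coin flip (roughly 50% adjacent toggles)."""
    return [b if b in (0, 1) else random.randint(0, 1) for b in bits]

def adjacent_fill(bits):
    """Fill each X with the previous specified value (few toggles)."""
    last, out = 0, []
    for b in bits:
        last = b if b in (0, 1) else last
        out.append(last)
    return out

def toggles(pattern):
    return sum(a != b for a, b in zip(pattern, pattern[1:]))

random.seed(0)
cube = [1] + ["X"] * 30 + [0]   # two care bits, thirty don't-cares
print(toggles(random_fill(cube)), "toggles with random fill")
print(toggles(adjacent_fill(cube)), "toggles with adjacent fill")
```

The tradeoff Aldrich describes shows up here directly: constraining or low-power-filling the don’t-care bits reduces shift power, but it sacrifices the fault-detection opportunities that random fill provides for free, so more patterns and more compute are needed to reach the same coverage.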

In the end, the objective of test is to create the highest coverage with the smallest number of test patterns. From the perspective of the design, that means trying to switch everything possible in the design on every tester cycle, which is exactly the opposite goal of low-power design. Given that tension, a complete rethinking of compression algorithms and other test technology is in order.


