It’s All About Staying Ahead Of The Test Challenges Curve

As test pattern compression falls behind, new techniques are needed to keep test times in check.


Since the early days when semiconductor devices contained a mere handful of gates, the manufacturing test world has been focused on how to detect the greatest number of potential defects in the shortest amount of time. This fundamental goal has not changed over the years and continues at 5nm and beyond.

What has dramatically changed over the years, however, is the variety of techniques used to achieve maximum efficiency. In the early years, functional patterns were used to test devices, and these patterns were fault graded to determine the achieved coverage. As device sizes grew, it quickly became too difficult to create an efficient set of functional patterns that achieved the necessary quality levels. This led to the emergence of structural test with the development of scan-based testing, which provided the necessary test efficiency for many years until exponential design size growth finally took its toll.

As is often the case, need drives innovation, and the concept of test pattern compression was developed to counter fast-growing pattern counts. Over the past fifteen years or so, test pattern compression algorithms have evolved to steadily increase achievable compression levels, reaching as much as 1000X today. Algorithmic compression is now losing steam, however, and new techniques are needed to keep test times in check. One very promising new approach is the use of machine learning techniques to help guide the pattern generation process; initial experimental results indicate a significant reduction in test pattern counts.
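
To see why compression matters so much, consider a rough first-order model: scan test time scales with the pattern count multiplied by the length of the longest scan chain, and a compression factor of X lets the same external channels feed roughly X times as many, and therefore X times shorter, internal chains. The Python sketch below illustrates this arithmetic; the device sizes and pattern counts in it are made-up illustrative numbers, not data from any real flow.

```python
# First-order model of scan shift time: cycles ~ patterns * chain length.
# With compression X, C external channels drive roughly C * X short
# internal chains. All numbers below are illustrative assumptions.

def scan_test_cycles(num_flops: int, num_channels: int,
                     num_patterns: int, compression: int = 1) -> int:
    internal_chains = num_channels * compression
    chain_length = -(-num_flops // internal_chains)  # ceiling division
    return num_patterns * chain_length

# Hypothetical device: 10M scan flops, 8 scan channels, 20k patterns.
baseline = scan_test_cycles(10_000_000, 8, 20_000)          # no compression
compressed = scan_test_cycles(10_000_000, 8, 20_000, 1000)  # ~1000X
print(f"shift-cycle reduction: {baseline / compressed:.0f}X")  # -> 1000X
```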

Another approach, quickly growing in usage, attacks the test time reduction problem from a completely different angle. Rather than trying to further improve compression algorithms, the idea is to modify the design itself to make it inherently more testable, so that it requires fewer test patterns. This approach involves making small, localized modifications to the netlist, called test points, to increase the controllability or observability of specific internal signals. In general, the more test points you add, the greater the reduction in test patterns.
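
As a concrete illustration of what "controllability" means here, consider a simple probability-based metric in the spirit of classic COP/SCOAP testability measures: a net that is almost never 0 or almost never 1 under random stimulus is hard to control, which makes it a natural control-point candidate. The sketch below is a toy version of that idea; the net names, probabilities, and threshold are assumptions for illustration, not how any particular tool selects test points.

```python
# Toy control-point candidate selection using a COP-style probability
# metric. The nets, probabilities, and threshold are all hypothetical.

def control_point_candidates(net_prob1: dict, skew: float = 0.05) -> list:
    """Nets whose random-pattern probability of being 1 is so skewed
    that patterns struggle to set them; good control-point candidates."""
    return [net for net, p1 in net_prob1.items()
            if p1 < skew or p1 > 1.0 - skew]

# A wide AND gate output is almost never 1 under random stimulus,
# so a control point there lets the pattern generator set it directly.
probs = {"and32_out": 0.0001, "mux_sel": 0.50, "rst_sync": 0.98}
print(control_point_candidates(probs))  # -> ['and32_out', 'rst_sync']
```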

A major issue in using test points, however, is the area they add. It's not uncommon to end up with 3% to 5% area overhead to achieve the desired pattern count reduction. Because each test point consists of a flop driving a couple of gates added to a functional net, it is possible to drastically reduce the total area overhead by sharing one flop across multiple test points. As much as a tenfold reduction in area overhead is possible.
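
A back-of-the-envelope model makes the savings tangible: if the flop is the dominant per-test-point cost, amortizing one flop across many test points removes most of the overhead. The gate-equivalent area weights below are assumptions chosen only to illustrate the arithmetic.

```python
# Back-of-the-envelope area model for sharing one flop across several
# test points. Gate-equivalent weights are assumptions for illustration.

FLOP_AREA = 4.0  # assumed gate-equivalents per added flop
GATE_AREA = 1.0  # assumed gate-equivalents per added logic gate

def test_point_area(num_points: int, points_per_flop: int = 1) -> float:
    """Each test point keeps ~2 local gates; flops are amortized."""
    flops = -(-num_points // points_per_flop)  # ceiling division
    return flops * FLOP_AREA + num_points * 2 * GATE_AREA

dedicated = test_point_area(10_000)                   # one flop per point
shared = test_point_area(10_000, points_per_flop=20)  # one flop per 20
print(f"test-point area saved: {1 - shared / dedicated:.0%}")  # -> 63%
```

How close this toy model gets to the tenfold figure depends on how expensive the flop, with its scan and control hookup, is relative to the local gates; the heavier the flop cost, the larger the savings from sharing.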

It is critically important, however, to share flops only across test points that are physically close together. Without physical awareness, you end up with a congested mess of wires that is impossible to route. The integrated Synopsys solution, which pairs SpyGlass DFT ADV for test point selection with DFTMAX for physically-aware test point synthesis, is uniquely able to avoid this problem.
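
One simple way to picture physically-aware sharing is as a clustering problem: a flop should be shared only among test points that fall within a reasonable wiring radius of one another. The greedy sketch below captures the idea in miniature; the coordinates, radius, and sharing limit are illustrative assumptions, and a production physically-aware synthesis flow is of course far more sophisticated.

```python
# Minimal sketch of physically-aware flop sharing: greedily cluster
# test points so one flop serves only points within a wiring radius.
# Coordinates, radius, and group size are illustrative assumptions.

import math

def cluster_test_points(points, radius, max_share):
    """Return index clusters; each cluster would share one flop."""
    unassigned = list(range(len(points)))
    clusters = []
    while unassigned:
        seed = unassigned.pop(0)
        cluster = [seed]
        for i in unassigned[:]:
            if len(cluster) == max_share:
                break
            if math.dist(points[seed], points[i]) <= radius:
                cluster.append(i)
                unassigned.remove(i)
        clusters.append(cluster)
    return clusters

# Four test points in two far-apart corners of the die (microns):
pts = [(0, 0), (5, 8), (900, 900), (905, 895)]
print(cluster_test_points(pts, radius=50.0, max_share=8))
# -> [[0, 1], [2, 3]]: flops shared only among physically close points
```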

There’s no end in sight to the test challenges curve. Continued test solution innovation will no doubt be needed to keep up with growing design sizes and evolving quality and reliability requirements.


