Building Bridges: A New DFT Paradigm

Greater complexity and smaller process nodes are driving a major shift in design-for-test implementation.


By Robert Ruiz

Over the last twenty years, structural testing with scan chains has become pervasive in chip design methodology. Indeed, it’s remarkable to think that most electronic devices we interact with today (think smartphones, laptops, televisions, etc.) contain hundreds to thousands of interconnected scan chains used to verify that the semiconductors were manufactured without defects. Because the impact on design performance, power, area (PPA) and schedule was minimal, structural scan test was cost-effective and adoption was swift. Over time, however, scan’s negative impact has become notable and often significant. Interestingly, one widely deployed DFT technology alleviates PPA degradation, while another effectively reduces schedule delays. What’s needed to address today’s key DFT implementation challenges is a combination, or “bridge,” of the flows based on these technologies.

First, let’s understand how we got to this point by examining the evolution of DFT technologies into their two main branches: RTL-based and synthesis-based. In the early adoption phase of structural test, DFT was synonymous with scan. At that time, standard flip-flops were replaced with their scannable versions and then stitched into long shift registers. This simple technique massively increased the controllability and observability of the design, thereby enabling algorithms to easily and automatically generate test patterns. Initial technologies for scan implementation were straightforward and involved either netlist editing or modifications during synthesis.
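To make the controllability/observability idea concrete, here is a toy model (purely illustrative, not any vendor's implementation, and all function names are hypothetical): in test mode the scan flops behave as one shift register, so any state can be shifted in, a functional clock captures the combinational logic's response, and that response is shifted back out for comparison.

```python
# Illustrative model of a single scan chain. In test mode the scan flops
# form a shift register: shift in any state (controllability), pulse one
# functional clock to capture the logic's response, then shift the
# response out for comparison (observability).

def shift_in(chain, pattern):
    """Shift a test pattern into the chain, one bit per clock."""
    for bit in pattern:
        chain.insert(0, bit)  # new bit enters the first scan flop
        chain.pop()           # the last flop's old value falls off the end
    return chain

def capture(chain, combinational_logic):
    """One functional-mode clock: flops capture the logic's response."""
    return combinational_logic(chain)

def shift_out(chain):
    """Shift the captured response out to the tester."""
    return list(chain)

# Example: a 4-flop chain feeding a toy combinational block (invert each bit).
chain = [0, 0, 0, 0]
chain = shift_in(chain, [1, 0, 1, 1])  # bits arrive serially, so the
                                       # pattern lands reversed: [1, 1, 0, 1]
response = capture(chain, lambda s: [b ^ 1 for b in s])
observed = shift_out(response)         # compared against expected values
```

Real automatic test pattern generation (ATPG) exploits exactly this property: because every flop is directly settable and readable, the tool only has to solve for the combinational logic between flops.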

As designs became more complex, so did the DFT logic needed to address the constantly growing pressure to maintain or reduce silicon test costs. The IEEE 1149.1 standard was perhaps one of the first drivers of increased DFT complexity, since it defined a test access port (TAP) consisting of data, control signals, and a controller with sixteen states. DFT complexity continued to increase with the introduction of additional techniques and standards such as scan compression, IEEE 1500-compliant core wrappers, memory built-in self-test (BIST), logic BIST, IEEE 1687-based access networks, on-chip clock controllers, and so on.
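For a sense of what that sixteen-state controller entails, the sketch below models the IEEE 1149.1 TAP state machine, which advances on each TCK edge based on the TMS pin. The state names and transitions follow the standard; the table-as-dict encoding is just one illustrative way to write it down.

```python
# The 16-state IEEE 1149.1 TAP controller, driven by TMS on each TCK edge.
# Each entry maps a state to (next state if TMS=0, next state if TMS=1).

TAP_NEXT = {
    "Test-Logic-Reset": ("Run-Test/Idle", "Test-Logic-Reset"),
    "Run-Test/Idle":    ("Run-Test/Idle", "Select-DR-Scan"),
    "Select-DR-Scan":   ("Capture-DR",    "Select-IR-Scan"),
    "Capture-DR":       ("Shift-DR",      "Exit1-DR"),
    "Shift-DR":         ("Shift-DR",      "Exit1-DR"),
    "Exit1-DR":         ("Pause-DR",      "Update-DR"),
    "Pause-DR":         ("Pause-DR",      "Exit2-DR"),
    "Exit2-DR":         ("Shift-DR",      "Update-DR"),
    "Update-DR":        ("Run-Test/Idle", "Select-DR-Scan"),
    "Select-IR-Scan":   ("Capture-IR",    "Test-Logic-Reset"),
    "Capture-IR":       ("Shift-IR",      "Exit1-IR"),
    "Shift-IR":         ("Shift-IR",      "Exit1-IR"),
    "Exit1-IR":         ("Pause-IR",      "Update-IR"),
    "Pause-IR":         ("Pause-IR",      "Exit2-IR"),
    "Exit2-IR":         ("Shift-IR",      "Update-IR"),
    "Update-IR":        ("Run-Test/Idle", "Select-DR-Scan"),
}

def tap_step(state, tms):
    """Advance the TAP controller by one TCK cycle."""
    return TAP_NEXT[state][tms]

def run(state, tms_sequence):
    """Apply a sequence of TMS values, one per TCK cycle."""
    for tms in tms_sequence:
        state = tap_step(state, tms)
    return state
```

A well-known property of this state machine is that holding TMS high for five consecutive clocks returns the controller to Test-Logic-Reset from any state, which is how a tester synchronizes with a TAP of unknown state.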

To accommodate this growing logical complexity, technologies originally based on netlist editing evolved into RTL-based technologies. These technologies allow designers to create an RTL view of the DFT logic, which is especially important for highly complex DFT architectures. RTL DFT enables earlier validation, typically with Verilog simulation, than gate-level validation does, significantly increasing productivity, a gain known in the verification world as a “shift left.” Design teams avoid schedule delays as a result of this early DFT validation. The major downside is that complex RTL DFT contains no physical guidance for modern synthesis tools, thereby reducing PPA optimization while increasing synthesis run times.

Figure 1: Synthesis-based DFT (right) avoids compression logic congestion of RTL-based DFT (left)

Separately, synthesis-based DFT technologies evolved to handle the same types of DFT complexity while minimizing the impact on PPA and run times, since synthesis engines recognize DFT logic while it is being generated (Figure 1). However, design teams must wait until the synthesis runs are complete to obtain gate-level views of the DFT architecture. For complex architectures, this is too late in the schedule!

To obtain the best of both worlds – early validation and optimal design PPA – a bridge must be built between RTL and synthesis-based technologies. The bridge must carry over the RTL DFT logic and physical guidance from the RTL flow into the synthesis flow. New technologies, like those in Synopsys TestMAX products, enable these flows and the bridge in between.

RTL-based and synthesis-based DFT flows developed separately, and each has its strengths. Now a new DFT implementation paradigm is emerging that achieves the best of both flows by combining them with a technologically advanced bridge.

Robert Ruiz is the director of product marketing for test automation products at Synopsys, Inc. Ruiz has held various marketing and technical positions for the test automation and functional verification products at Synopsys, Novas Software and Viewlogic Systems. His background includes over 17 years in advanced design‐for‐test methodologies as well as several years as an ASIC designer. Ruiz has a BSEE from Stanford University.
