The Abstraction of Test

By Ann Steffora Mutschler

By now, semiconductor design abstraction is old hat to many engineers, but mention the term “semiconductor test abstraction” and expect a blank stare in return. Design complexity, enormous design size, and short market windows have put tremendous pressure on test to occur earlier rather than later.

Even at the RTL level, which hardware test typically has not touched, the name of the game is reliable predictability. If circuitry isn’t tested until the gate level, when the final connections are made, big bottlenecks occur. For this reason, vendors in hardware test equipment as well as in EDA are now looking at ways to connect design and test more closely than ever.

While the term ‘abstraction’ is used with regard to both design and test, how it plays out in those spaces is very different. However, Robert Ruiz, senior product marketing manager for test automation products at Synopsys, pointed out there may be a few parallels between the two.

“First of all, [test abstraction] is something that’s not done by the majority of users, and because of that the term isn’t interpreted by everybody equally,” Ruiz said. “For the design engineer, the abstraction could refer, in terms of a specific flow, to models using the IEEE 1500 standard that can describe the test attributes, and specifically the scan chain, in a design. For a designer who is implementing the design and focusing on its function, abstraction of test means he just cares about having some way to carry over an abstraction of the DFT part, and that helps in multiple ways. One is that it hides all the unnecessary details from the designer. The second is that it helps with file management and tool performance, so the tool doesn’t need to carry around large netlists describing the scan chain and the other DFT information.”
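
To make this concrete, here is a minimal sketch, in Python rather than in any actual EDA format, of the kind of information such an abstraction might carry: only the scan and wrapper attributes of a core, in the spirit of an IEEE 1500 test model, with the full gate-level netlist nowhere in sight. The class and field names are illustrative assumptions, not part of the standard.

```python
# Hypothetical sketch: an abstract "test model" of a core carries only the
# DFT attributes a chip-level flow needs (in the spirit of IEEE 1500),
# so the integrator never has to load the core's gate-level netlist.
from dataclasses import dataclass

@dataclass
class ScanChain:
    name: str
    length: int        # number of scan cells in the chain
    scan_in: str       # scan-in port name
    scan_out: str      # scan-out port name

@dataclass
class CoreTestModel:
    core_name: str
    chains: list[ScanChain]
    wrapper_cells: int  # IEEE 1500 wrapper boundary cells

    def total_scan_cells(self) -> int:
        return sum(c.length for c in self.chains)

# Chip-level integration works from the abstraction alone:
cpu = CoreTestModel(
    core_name="cpu_core",
    chains=[ScanChain("sc0", 1024, "si0", "so0"),
            ScanChain("sc1", 1024, "si1", "so1")],
    wrapper_cells=256,
)
print(cpu.total_scan_cells())  # 2048 scan cells, no netlist in memory
```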

Mark Chadwick, product marketing manager for Mentor Graphics Corp.’s Silicon Test System products, said that testing at different levels—be it the wafer level, the package level, when it’s in a circuit board, when it’s in a system, when it’s in the field—can all be thought of as different levels of abstraction. “However, unlike the term, the type of test we do is still down at a structural level, meaning, in terms of scan test or memory BIST, the function is not tested, but by a process of testing every individual device/component/atom we do test the whole chip.”

Ruiz noted there are certain things for test that can be done at the RTL level, but ultimately to generate high quality manufacturing test patterns that capture every single defect, the test generation program has to understand the structural connections—basically the gate-level view.

“There are a set of things that can be done at RTL, but not everything,” Ruiz said. “Nonetheless, it feels like we’re at a point where not most, but some, designers with very forward-looking glasses are thinking about what can be moved upward beyond the gate level to the RTL level. In looking at that, there are a couple of different things. One is predicting what kinds of problems could occur if DFT is put in. Another, still in exploratory stages, is instead of going through the entire design flow and finding out what my fault coverage is and how much tester time and how much test data I have, what if I could do some prediction at RTL? Prediction at RTL, for a lot of these test engineers, is abstracting test.”
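
What that prediction might look like can be sketched with the usual back-of-the-envelope scan formulas. The function below is a hypothetical illustration, not a description of any Synopsys tool: given scan architecture parameters that are typically decided before synthesis, it projects tester time and test data volume.

```python
# Hypothetical RTL-stage estimator: project tester time and test data volume
# from scan architecture parameters, before a gate-level netlist exists.
def predict_scan_test(num_scan_cells: int, num_chains: int,
                      num_patterns: int, shift_mhz: float):
    chain_length = -(-num_scan_cells // num_chains)  # ceiling division
    # Shift-dominated approximation: each pattern shifts one full chain
    # load in while the previous response shifts out.
    cycles = num_patterns * chain_length
    test_time_ms = cycles / (shift_mhz * 1e6) * 1e3
    data_bits = num_patterns * num_scan_cells * 2    # stimulus + expected
    return chain_length, test_time_ms, data_bits

# Assumed numbers: 2M scan cells, 32 chains, 10k patterns, 50 MHz shift clock.
length, t_ms, bits = predict_scan_test(2_000_000, 32, 10_000, 50.0)
print(f"chain length {length}, ~{t_ms:.0f} ms of tester time, "
      f"~{bits / 8 / 1e9:.1f} GB of test data")
```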

Outside of EDA, abstraction also could be applied to test development. Instead of looking at the ones and zeros and the defined analog waveforms that are applied to the chip for test, what if those were abstracted?

“With this perspective, knowing my set of tests that just test the core, or a set of tests for the analog parts, I can therefore be more structured about the test development program,” Ruiz said. “Test development programs are an area where some of our largest customers invest a lot of resources, more than the investment in DFT/ATPG.”

Conversely, from the test hardware point of view, John Wiedemeier, product marketing manager at LeCroy, said that when his company started out 15 years ago things were very complicated, and getting equipment to talk together was extremely painful. “We’ve changed quite a bit and introduced what we think is a standard for looking at protocols,” he said. “Our company is all about decoding protocols and understanding communication between devices and hosts. In the beginning, an engineer was looking at an oscilloscope screen full of ones and zeros. As things became more complex with digital design, the logic analyzer came out to solve certain problems and was a new interface. That was a monumental effort. Then someone thought, what if we could decode these ones and zeros, and so there were some rudimentary logic analyzers to decode the ones and zeros into numbers. They took it up a notch. But in the process they were stepping away from what their core competency was.”

It was about that time that LeCroy came out with a protocol analyzer, a step above a logic analyzer. “We’re no longer just decoding ones and zeros into numbers; we’re now actually giving great detailed information about what a protocol looks like,” Wiedemeier said. He noted that if you had read a book on the protocol, you could look at the pages and the tester screens and understand what was going on. There’s no longer the idea that you have to know anything about analog to do the software work and programming. It has gone a step above that, to the point of putting the book into the tool and having the tool say what the book says, and now even to where a button can be pushed and the tool says what is wrong and how to fix it.
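
The jump from raw bits to protocol-level decode is easy to illustrate. The sketch below uses an invented toy frame format (sync byte, length, payload, checksum) rather than any real protocol LeCroy supports; it only shows the idea of turning a captured bitstream into fields an engineer can read.

```python
# Toy protocol decoder: turn a raw captured byte stream into readable frames.
# Frame format (invented for illustration): 0xA5 sync, 1-byte length,
# payload, then a 1-byte additive checksum of the payload.
def decode_frames(raw: bytes):
    frames, i = [], 0
    while i + 3 <= len(raw):
        if raw[i] != 0xA5:             # hunt for the sync byte
            i += 1
            continue
        length = raw[i + 1]
        end = i + 2 + length + 1       # header + payload + checksum
        if end > len(raw):
            break                      # truncated frame at end of capture
        payload = raw[i + 2:i + 2 + length]
        ok = (sum(payload) & 0xFF) == raw[end - 1]
        frames.append({"payload": payload.hex(), "checksum_ok": ok})
        i = end
    return frames

capture = bytes([0xA5, 3, 0x10, 0x20, 0x30, 0x60, 0xA5, 1, 0xFF, 0xFF])
print(decode_frames(capture))
# [{'payload': '102030', 'checksum_ok': True},
#  {'payload': 'ff', 'checksum_ok': True}]
```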

“It is much friendlier to the software engineer and gets away from only geniuses reading the waveform,” Wiedemeier said.

Connecting Test with the EDA Realm
While not an obvious connection at first glance, Frank Ditore, product manager of Agilent EEsof EDA in the Wireless Business Unit at Agilent Technologies, pointed out that the EEsof EDA tools do link with Agilent’s measurement hardware, and some recent work with software-defined radio (SDR) does apply here.

“First off, it’s not necessarily a well-known or well-documented fact, but Agilent (when it was Hewlett-Packard) was probably the pioneer in SDR architectures simply because we were developing instruments that needed to be scalable and reconfigurable. In fact, in the late ’80s and early ’90s we developed a new hardware measurement system that was called a vector signal analyzer—the first of its kind on the market. It literally was a software-defined radio. This allowed the engineer to take a firmware upgrade, which basically updated source code for embedded processors, so the instrument could be reconfigured to demodulate IS-95 CDMA or DECT or any of the digital communication protocols that were available at the time. Even before some of the more industry-notable names in software-defined radio started coining the term ‘software-defined radio,’ we were actually already kind of doing it. Today, if you look at Agilent hardware measurement platforms, they are almost entirely software-defined instruments in that they are scalable digital signal processing engines made up of FPGAs and embedded DSPs; they typically are built around a Windows PC framework (although that doesn’t cover every one of Agilent’s products) and the hardware is scalable, so you can continue to increase the amount of horsepower you need for different applications by plugging new application cards into the instruments,” he said.

Those are basically software-defined radio platforms because they allow Agilent to take an instrument and apply it to a number of different protocols without redeveloping the hardware.

“This fits into test abstraction in the context of standardized operating environments for SDRs,” Ditore said. “There are a number of operating environments, with the leading one being the Software Communications Architecture (SCA) Operating Environment (OE) standard, provided by a number of vendors for use in the armed forces’ Joint Tactical Radio System (JTRS) or commercial systems. In the radios that support SCA, the actual physical layer becomes transportable if it is also SCA-compliant. What we’ve been talking about internally is how to abstract test to create SCA-compliant test algorithms that would go into our instruments (assuming they were SCA-compliant), as well as user hardware that is also SCA-compliant, so that you can actually build test right into your product.”

While not a commercial product, it fits well with how Agilent’s tools are architected, he said. For instance, SystemVue, Agilent’s ESL tool, can be used to architect physical-layer signal processing and analog RF algorithms, either as arbitrary waveforms or as an executable hardware description in VHDL, Verilog or C++, and export them to a piece of hardware.

As such, it can be used for design, or to define the test algorithms, and has the potential to save engineering teams a tremendous amount of time, he believes.

“If you look at traditional IF/RF designers, whether they are designing MMICs, RFICs or just standard RF boards, they typically design with continuous wave (CW) metrics in mind – what’s the gain, what’s the noise figure, what’s the output IP3 – but typically they don’t think about the higher-level system metrics like what the error vector magnitude (EVM) is, what the ACPR is, what the code-domain power is, what the cross-domain leakage is between code-domain systems. They don’t necessarily think in high-level time-domain metrics, so by providing these virtual test environments in a simulation environment you can actually connect your analog and RF processing to a test harness that does exactly that. The types of things you would typically do once you prototype hardware, you’re actually doing in simulation while you are designing, while it is still cheap to make changes. A change is a tweak to a parameter, not a respin of a mask,” he added.
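
Error vector magnitude, one of the system-level metrics Ditore mentions, is straightforward to compute in such a virtual test harness. Below is a minimal sketch, assuming a QPSK constellation and an illustrative noise level; neither comes from Agilent’s tools.

```python
# RMS error vector magnitude (EVM): compare the measured constellation
# against ideal reference symbols; the ratio of error power to reference
# power, expressed in percent, is the standard EVM figure.
import numpy as np

def evm_rms_percent(measured: np.ndarray, reference: np.ndarray) -> float:
    err_power = np.mean(np.abs(measured - reference) ** 2)
    ref_power = np.mean(np.abs(reference) ** 2)
    return 100.0 * np.sqrt(err_power / ref_power)

rng = np.random.default_rng(0)
# Assumed QPSK reference symbols with unit average power:
ref = (rng.choice([-1, 1], 1000) + 1j * rng.choice([-1, 1], 1000)) / np.sqrt(2)
# "Measured" symbols: reference plus illustrative additive noise:
meas = ref + 0.05 * (rng.standard_normal(1000) + 1j * rng.standard_normal(1000))
print(f"EVM ~ {evm_rms_percent(meas, ref):.1f}%")  # ~7% at this noise level
```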

What’s next?
For customers working intensely on I/O engineering for add-in cards and on optimizing performance in the end system, industry sources believe there will be more integration of test tools with pre-silicon design engineering teams, allowing feedback earlier in the design process. This is expected to materialize through tighter partnerships between EDA and traditional tester companies in the not-too-distant future. As the saying goes, stay tuned.


