Module Testing Adds New Challenges

The technology of choice is shaping up as system-level functional test.


System-in-package (SiP) and other advanced packaging technologies are putting more components together in tighter spaces than previously seen. Often these packages are contained in a module, which is something more than a chip package and a great deal smaller than most printed circuit boards.

Testing these modules often requires system-level test. These modules typically will be inserted into small form factors, and they will serve as the electronics system for such products as Internet of Things devices, smartphones, and wearable gadgets.

Functional testing of memory modules—DRAMs contained in single in-line memory modules (SIMMs) and dual in-line memory modules (DIMMs)—has been going on for years. There are SRAM-based memory modules, as well. But testing memory chips and memory modules is relatively uncomplicated compared with testing heterogeneous integrated modules that combine multiple different chips.

An SiP implementation for IoT may involve a module that contains sensors, some type of processor (an application processor or a microcontroller), radio-frequency chips for wireless connectivity, memory, and a power management IC. That’s heterogeneous integration, and it’s a challenge to test.

The industry is turning out power semiconductor modules (including rectifier modules, Schottky modules, insulated-gate bipolar transistor (IGBT) modules, MOSFET modules, and hybrid IGBT/MOSFET modules), radio-frequency modules, transceiver modules, and photovoltaic solar modules (including flexible PV modules), among other module types. Work is heating up on display and projection modules containing micro-scale light-emitting diodes, as well.

“The whole concept behind ensuring maximum quality in multi-chip packages is about the smart pairing, how quality is impacted for the key device, based on the continuing quality factors of the device in question,” said David Park, vice president of worldwide marketing for Optimal+. “The last thing you want to have in a multi-chip package is to have a megabuck part on there, but the whole MCP test could fail because the $2 part failed. The challenge in MCPs is ensuring that the performance of one part doesn’t too negatively impact the performance of the overall device. The only way to do that is to use big data analytics.”

Park noted that one of the key concerns is what effect multiple parts in the module have on the overall quality of device. “You want to screen out the parts with a negative impact on the final yield. What are the right kinds of tests and parametrics to make these types of devices together, with these quality specs and these parametric performance specs? Then the overall end product is going to work out great. You bring in other types of devices, they’re still good devices, but they just don’t work well with the other parts on the board. We spend a lot of time trying to do this randomly. Once we identify the signatures that are important, that automatically helps them sort and pair the right devices for their MCP. We want to get to the bottom of testing, not only the components of the SiP or the MCP, but then the final SiP in itself. Vastly downstream, that information will have value to whoever is aggregating all these things onto a populated board or a system or whatever it may be. It’s taking advantage of the manufacturing test equipment all these companies have anyway, which are already generating tests. Leveraging that benefits not only them. It benefits the entire supply chain. That’s the big picture.”
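The smart-pairing idea Park describes can be sketched in code. The following is an illustrative Python sketch, not Optimal+'s actual analytics: the parameter names, spec limits, and greedy pairing strategy are all invented for the example. Inexpensive companion dies are screened on their parametric signatures before being matched with expensive main dies, so a $2 part with a bad signature never ends up in a megabuck package.

```python
# Illustrative pre-assembly screen for a multi-chip package (MCP).
# Parameter names and limits are hypothetical, for demonstration only.

def within_spec(die, limits):
    """True if every measured parameter falls inside its (lo, hi) window."""
    return all(lo <= die[param] <= hi for param, (lo, hi) in limits.items())

def pair_for_mcp(main_dies, companion_dies, limits):
    """Greedily pair each main die with a companion die that passed
    the parametric screen."""
    good_companions = [d for d in companion_dies if within_spec(d, limits)]
    return list(zip(main_dies, good_companions))

# Hypothetical parametric windows for the cheap companion die.
limits = {"leakage_ua": (0.0, 2.0), "vth_mv": (300, 500)}

companions = [
    {"id": "C1", "leakage_ua": 1.1, "vth_mv": 410},
    {"id": "C2", "leakage_ua": 9.8, "vth_mv": 405},  # leaky outlier: screened out
    {"id": "C3", "leakage_ua": 1.3, "vth_mv": 350},
]
mains = [{"id": "M1"}, {"id": "M2"}]

pairs = pair_for_mcp(mains, companions, limits)
print([(m["id"], c["id"]) for m, c in pairs])  # → [('M1', 'C1'), ('M2', 'C3')]
```

In practice the "signatures" Park mentions would come from big-data analytics over production test results rather than fixed windows, but the screening-before-pairing structure is the same.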

More than a PCB
Module testing is different from testing a PCB.

“The module testing trend kind of fits in between the idea of chipsets as conventionally known or PCB test,” said Luke Schreier, director of automated test marketing at National Instruments. “Both can be viewed as kind of a system, in the types of ways it will interact with them. On the chip side, it was more focused around blasting vectors at it and checking for the response. On the board level, it was very much more geared toward, ‘Does this thing work as assembled to accomplish the functional task?’ While there may have been individual tests at some of the subassemblies or pieces, even like in-circuit test to check for specific components, the idea typically was very much more system-level test, functional test. This module concept sits somewhere in between, where you can’t get away with the typical ways of testing a chip. You want to be able to capture any defects that may come from the interconnections between different elements of a system-in-package, or you want to be able to capture defects that may come from firmware, possibly running on a processor inside it, and the only way you can feel you can do that is putting it in its native environment when you perform the test.”

And that begins to look more like board test.

“It’s kind of a place in between the two worlds right now, and trying to find its way a little bit from which ATE paradigm it’s trying to borrow from,” Schreier said. “It starts from the perspective that there are elements for which you might still want to be able to use conventional digital functional test, using vectors or being able to assess what you can, before just immediately going to a full-scale system-level approach. On the other hand, it acknowledges the limitations of the insight that you would get from the pure chip side. On the board side, the remaining goal is to capture more of the end use of the product, like a customer would expect—or like a customer will experience from using it. In that way, it typically has a lot more interactions with the firmware or software environments than just the individual functions of an IC. All the pieces have to work together if the customer experience is going to be able to meet expectations. So it starts to have higher levels of the stack involved, to be able to really understand whether the device is working as you’d expect. Sometimes, that also might mean running it for longer periods of time. That’s where some of the interactions between burn-in and system-level test come from at the module level, maybe how that lineage has crossed over each other in the past few years.”
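The hybrid flow Schreier describes, conventional vector-based checks first and a full system-level functional pass only for parts that survive, can be sketched as below. The device interface (`apply`, `boot`, `run_selftest`) is hypothetical; on a real tester the DUT handle and response model would come from the ATE vendor's API.

```python
# Hedged sketch of a hybrid module-test flow: cheap structural (vector)
# checks first, then an expensive system-level functional check.
# The DUT interface and toy response model are invented for the example.

def vector_test(dut, patterns):
    """Structural check: drive stimulus vectors, compare responses."""
    return all(dut.apply(stim) == expect for stim, expect in patterns)

def functional_test(dut):
    """System-level check: boot the firmware and exercise it as a user would."""
    dut.boot()
    return dut.run_selftest() == "PASS"

def module_test(dut, patterns):
    if not vector_test(dut, patterns):
        return "FAIL_STRUCTURAL"
    if not functional_test(dut):
        return "FAIL_FUNCTIONAL"
    return "PASS"

class FakeDut:
    """Stand-in for a real device handle (illustrative only)."""
    def apply(self, stim):
        return stim ^ 0xFF          # toy response: device inverts the stimulus
    def boot(self):
        pass
    def run_selftest(self):
        return "PASS"

patterns = [(0x00, 0xFF), (0x0F, 0xF0)]
print(module_test(FakeDut(), patterns))  # → PASS
```

Ordering the stages this way keeps the slow, firmware-level check off the critical path for parts that already fail at the vector level.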

But not all modules are the same, either.

“One thing that may be a little bit more similar to the chip is if it’s an RF type of chip, you need to actually perform some final test at the system level,” said Joey Tun, NI’s principal market development manager for semiconductor test. “When you put a module together, you’re talking to it more like how you would a complete assembly. We [at NI] tend to be more on the RF, analog side of things. So, for example, we have experience with Wi-Fi modules, we have experience with some of the IoT modules that may have both Wi-Fi and Bluetooth integrated. Those are the type of modules that we normally work with.”

This isn’t always intuitive, though.

“A lot of the test people shy away from shifting to module test or system-level test because the mechanics of dealing with it aren’t just the typical handling or interfacing environment of single chips,” said Schreier. “They could be more complicated assemblies. Sometimes, system-level test systems end up looking quite elaborate mechanically, and they take up a significant amount of floor space to accommodate all the robotics or other capabilities necessary to do the test. It can start to look more like board testing. That’s certainly true where the pick-and-place or the kind of gantry-like structures that exist for putting boards on trays, to be able to be tested in parallel, starts to look quite a bit different sometimes from a typical chip ATE environment. Testing times take quite a bit longer. On some levels, it looks like neither chip nor board testing immediately, but kind of a hybrid where you’re still trying to achieve massive parallelism, but not at the same speed that you find in the chip ATE world natively.”

Schreier noted that NI has “a good working relationship with Astronics,” which is a proponent of implementing system-level test for today’s electronics. “We’re collaborating with them.”

Tun said module testing is mostly driven by the end-market application involved, such as IoT devices and smartwatches.

From a test perspective, SiP, modules, and package-on-package technology are nearly the same, Tun noted. Module testing can be done at the device protocol level. “You need the protocol specific to that module. It’s not like digital. It’s sometimes a USB protocol.” The test flow can also vary with the particulars of the device under test.
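Tun's point about protocol-level test can be illustrated with a minimal command/response exchange. The AT-style command set below borrows from common Wi-Fi module firmware but is purely illustrative; real modules define their own protocols (USB, SPI, or vendor-specific), which is exactly why the test must be specific to the module.

```python
# Illustrative protocol-level module check. The commands and expected
# responses are hypothetical; a real test would use the module's own
# documented command set over its actual link (serial, USB, SPI, ...).

def check_module(send):
    """Run a minimal protocol-level test. `send` is a callable that
    transmits one command over the link and returns the reply.
    Returns the list of commands whose replies were wrong."""
    checks = [
        ("AT", "OK"),                  # basic liveness
        ("AT+GMR", "v1.0"),            # firmware version
        ("AT+CWMODE?", "+CWMODE:1"),   # radio configured as station
    ]
    return [cmd for cmd, want in checks if send(cmd) != want]

# Fake link standing in for a serial port, for demonstration.
responses = {"AT": "OK", "AT+GMR": "v1.0", "AT+CWMODE?": "+CWMODE:1"}
print(check_module(lambda cmd: responses.get(cmd, "ERROR")))  # → []
```

A dead or misconfigured module would instead return the list of failing commands, which the test program can log against the device's serial number.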

“Market opportunities are on the rise,” Tun concluded, pointing to IoT modules and the RF space. “These are the primary market opportunities.”

Module testing could be coming soon to a test floor near you.



  • Michael Laisne

    Hi Jeff, don’t forget IEEE P1838, which is standardizing techniques for SiP testing.