Signals In The Noise: Tackling High-Frequency IC Test

Millimeter-wave frequencies require new test approaches and equipment; balancing precision with cost-efficiency is the challenge.

The need for high-frequency semiconductor devices is surging, fueled by growing demand for advanced telecommunications, faster sensors, and increasingly autonomous vehicles.

The advent of millimeter-wave communication in 5G and 6G is pushing manufacturers to develop chips capable of handling frequencies that were once considered out of reach. However, while these technologies promise faster data transfer rates, greater bandwidth, and new possibilities across a range of industries, testing and validating these devices presents a host of new challenges.

“When you hit high frequencies, connectors become a problem,” says Marcus DaSilva, an NI fellow specializing in RF measurements. “A 1mm coaxial connector can handle about 110 GHz, but beyond that you’re in rarefied territory. The complexity is further compounded by the need for precision in every step of a signal path, especially when dealing with phased array antennas and waveguides.”

As frequencies rise, maintaining measurement accuracy also becomes more difficult. Traditional test methods, refined over decades, struggle to keep pace with the intricate requirements of millimeter-wave frequencies and sub-5nm node architectures. At these scales, issues such as signal integrity, noise isolation, and calibration precision become more acute, making it difficult to ensure reliable performance.

“Signal integrity is going to be difficult at the higher frequencies,” says Adrian Kwan, senior business manager at Advantest. “The higher you go, there are more losses, reflections, and noise. Without precise calibration, you risk getting data that’s as noisy as the signals you’re trying to measure.”

Many companies also find that the standard general-purpose I/Os (GPIOs) traditionally used for testing are no longer sufficient for high frequencies. Instead, advanced testing methods like over-the-air (OTA) testing for wireless applications are being employed to ensure precise fault detection. High-speed serial interfaces, such as PCIe and SerDes, are needed to test at the high functional frequencies of these devices. This shift complicates the testing process, as it requires isolating high-frequency interfaces from the rest of the chip during testing to ensure accuracy.

“As more and more designs start to use high-frequency functional-based protocols, the use of standard GPIOs for testing the design is on a steep decline,” says Vidya Neerkundar, product manager for Tessent at Siemens EDA. “In some designs, the use of GPIO for functional purposes is approaching zero. Therefore, for testing purposes, the high-frequency functional interfaces need to be used.”

In response to these complexities, the industry is exploring innovative solutions, from leveraging AI and machine learning for data analysis to experimenting with new testing technologies. The focus is increasingly on achieving a balance between rigorous testing standards and economic feasibility. Yet, as demand for higher-frequency devices grows, especially in critical applications like aerospace and automotive, the need for accurate, scalable, and efficient testing becomes more critical.

“The big challenge we always have when developing new technologies is how to make testing affordable,” says David Vondran, wireless product strategist at Teradyne. “It’s very challenging to test any high-frequency device from a technical standpoint, let alone automate and commercialize it for widespread distribution.”

Technical challenges
Testing high-frequency semiconductors, especially those operating at millimeter-wave frequencies, presents a set of unique technical challenges. As the industry pushes beyond 50 GHz, the complexity of ensuring signal integrity, managing noise, and maintaining precise calibration increases exponentially. Each of these factors becomes more sensitive to even the smallest variations, making it difficult to achieve accurate and repeatable results.

“Testing high-frequency devices requires precise control over signal integrity to avoid escapes during the testing process,” says Vondran. “When results deviate significantly from the expected statistical distribution, it can indicate underlying issues, such as calibration errors or misaligned test setups. If the data spread is wider than expected, it means you’re likely failing more units, or even rejecting good ones unnecessarily. Understanding these variances is key to adjusting testing parameters and improving overall test accuracy for high-frequency devices.”
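
In practice, that statistical screen can be as simple as tracking how much margin a parameter’s distribution keeps to its test limits. The sketch below is purely illustrative, with hypothetical measurements and limits rather than any vendor’s production flow, but it shows how a widening spread translates directly into expected fallout:

    import numpy as np
    from scipy import stats

    # Hypothetical repeated measurements of one parameter (output power, dBm) across units.
    measurements = np.array([12.1, 12.3, 11.9, 12.0, 12.4, 11.7, 12.2, 12.6, 11.8, 12.1])
    lower_limit, upper_limit = 11.5, 12.8   # hypothetical test limits (dBm)

    mu, sigma = measurements.mean(), measurements.std(ddof=1)

    # Process capability: sigmas of margin to the nearest limit.
    cpk = min(upper_limit - mu, mu - lower_limit) / (3 * sigma)

    # Expected fraction outside limits if the population is roughly normal.
    fallout = stats.norm.cdf(lower_limit, mu, sigma) + stats.norm.sf(upper_limit, mu, sigma)

    print(f"mean={mu:.2f} dBm, sigma={sigma:.2f} dB, Cpk={cpk:.2f}, expected fallout={fallout:.2%}")

    # A Cpk sliding well below ~1.3, or fallout far above the historical baseline for the part,
    # is the cue to investigate calibration and setup drift before trusting the pass/fail results.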

In addition to signal integrity, the testing environment itself becomes more complex as frequencies rise. Factors like electromagnetic interference (EMI) and thermal variations can significantly impact the accuracy of measurements. Even small temperature changes can affect the performance of devices and the reliability of test results. Proper calibration is crucial to mitigate these issues, but it requires specialized tools and expertise that are not always available.

“Higher frequencies produce higher thermal variations that will affect performance, accuracy, and stability,” says Advantest’s Kwan. “There are strategies to mitigate this, like using active thermal control units, specialized thermal management materials, or even liquid cooling. It’s critical to manage these variations to ensure accurate test results.”

Additionally, the sensitivity of testing equipment itself becomes an issue. As signal frequencies increase, the ability of testing devices to accurately capture data and filter out unwanted noise is challenged, especially when testing systems need to handle multiple connections simultaneously. This issue is especially pronounced in applications like phased-array antennas, where precise alignment and measurement are crucial to ensure performance.

“As you move to higher frequencies, one of the key lessons we’ve learned is the importance of proximity between the antennas and the RFIC,” says Vondran. “Higher frequencies dissipate quickly, leading to signal loss and reducing the link budget. A major takeaway from 5G has been the critical role of integration between the antenna, substrate, and RFIC, which has become a standard form factor for approaching 6G. This integration also creates new testing opportunities, allowing for over-the-air testing rather than relying solely on traditional methods.”
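
The physics behind that proximity argument is captured by the Friis free-space path loss formula, in which loss grows with the square of frequency. A short, purely illustrative calculation (the path length and frequencies below are arbitrary examples, not a specific test setup) shows why link budgets tighten so quickly at millimeter-wave and sub-terahertz carriers:

    import math

    C = 3.0e8  # speed of light, m/s

    def free_space_path_loss_db(freq_hz: float, distance_m: float) -> float:
        """Friis free-space path loss (dB) between isotropic antennas."""
        return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

    # Loss over a 10 cm over-the-air path at three example carrier frequencies.
    for f_ghz in (6, 28, 140):
        loss = free_space_path_loss_db(f_ghz * 1e9, 0.10)
        print(f"{f_ghz:>4} GHz over 10 cm: {loss:5.1f} dB")

Moving from 6 GHz to 140 GHz adds roughly 27 dB of loss over the same 10 cm path, which is why tight integration of antenna, substrate, and RFIC, and correspondingly short OTA test distances, matter so much.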

Small firm disadvantages
Boutique design houses face additional challenges in this high-frequency landscape, where the availability and cost of advanced testing equipment constrain their operations. Unlike larger players with access to cutting-edge automatic test equipment (ATE) and in-house testing capabilities, these specialized companies often can’t justify the investment in high-end tools. This can limit their ability to thoroughly validate high-frequency designs and push them toward creative workarounds, such as customized probe cards or in-house calibration methods — approaches that, while cost-effective, may not always deliver the same level of precision or repeatability as the more sophisticated systems available to their larger competitors.

“As a boutique ASIC design house, cost and availability of testers are critical challenges,” says Alan Wong, product manager at Ensilica. “We are a lot more careful in the design of the tests since we don’t have access to all the resources at larger companies.”

Ensilica’s approach includes building specialized probe cards that integrate RF signal analysis directly into testing setups. This enables more affordable digital testers without relying solely on high-end RF modules, making high-frequency testing more cost-effective. The company also incorporates temperature sensors into its probe cards, which allows it to monitor both the probe card and the die temperature during testing, aiding in the management of thermal variations.

“Our RF test solutions are always very customized,” explains Wong. “We develop dedicated probe cards that handle the RF signal analysis and include temperature monitoring, making it possible to use standard digital testers while keeping the overall testing cost manageable.”
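
A thermal guard of that kind can be sketched in a few lines of Python. The sensor-read functions below are hypothetical stand-ins rather than Ensilica’s actual probe-card firmware, and the target and window values are arbitrary, but the pattern of holding a touchdown until both the card and the die settle is the essence of the approach:

    import random
    import time

    # Hypothetical sensor interfaces: stand-ins that simulate readings instead of
    # talking to real probe-card hardware.
    def read_probe_card_temp_c() -> float:
        return 85.0 + random.uniform(-2.0, 2.0)

    def read_die_temp_c() -> float:
        return 85.0 + random.uniform(-2.0, 2.0)

    TARGET_C = 85.0   # illustrative soak temperature
    WINDOW_C = 1.5    # allowed drift before a touchdown is held

    def wait_for_thermal_stability(timeout_s: float = 60.0, poll_s: float = 1.0) -> bool:
        """Hold the next touchdown until card and die temperatures sit inside the window."""
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            card, die = read_probe_card_temp_c(), read_die_temp_c()
            if abs(card - TARGET_C) <= WINDOW_C and abs(die - TARGET_C) <= WINDOW_C:
                return True
            time.sleep(poll_s)
        return False  # flag the site rather than log a measurement taken while drifting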

Repeatability
A critical factor in high-frequency semiconductor testing is repeatability, as it directly impacts the reliability and accuracy of test results. Achieving repeatable results means that a test, when conducted multiple times under the same conditions, produces consistent outcomes. This is especially challenging at millimeter-wave frequencies, where even small variations in test setup, equipment, or environmental conditions can lead to significant discrepancies. High-frequency signals are more susceptible to losses, reflections, and crosstalk, making the stability of the testing environment crucial for accurate measurements.

“Repeatability is a key challenge when testing high-frequency devices,” says Advantest’s Kwan. “You need to ensure that every time you conduct a test, the conditions are as consistent as possible. Any deviation can introduce noise or errors that weren’t present in the initial tests.”

One factor that significantly affects repeatability is the physical condition of the testing hardware, such as probe cards and connectors. Ensuring consistent probe contact with the wafer is necessary for reliable measurements.

“We’ve done a lot of work on the mechanical aspects of our probe cards, like measuring indentation depths and cleaning the probes,” says Ensilica’s Wong. “Once we established a consistent approach, we saw much better repeatability in our results.”
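
A first-order repeatability check along those lines is simple: repeat the same measurement across several touchdowns per site and watch the per-site spread. The readings below are invented for illustration, and a production flow would run a full gauge R&R study rather than a bare standard deviation, but the principle is the same:

    import numpy as np

    # Hypothetical S21 insertion-loss readings (dB): rows are probe sites,
    # columns are repeated touchdowns on the same pads.
    touchdowns = np.array([
        [-1.52, -1.55, -1.53, -1.54],
        [-1.49, -1.50, -1.48, -1.51],
        [-1.60, -1.72, -1.58, -1.61],   # one site spreading out: worn or dirty probes?
    ])

    means = touchdowns.mean(axis=1)
    stds = touchdowns.std(axis=1, ddof=1)

    for site, (m, s) in enumerate(zip(means, stds)):
        flag = "  <-- check probe wear/cleaning" if s > 0.03 else ""
        print(f"site {site}: mean {m:6.2f} dB, repeatability sigma {s:.3f} dB{flag}")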

Calibration
The calibration of test equipment also plays a pivotal role in repeatability at frequencies beyond 50 GHz, where even minor calibration errors can lead to data that varies significantly between tests. One of the most challenging aspects of calibration at these frequencies is managing the transitions between different testing components, such as from the probe head to the load board, or through various connectors.

Each transition introduces the potential for signal loss, reflections, or impedance mismatches, all of which can distort the results if not properly calibrated. Engineers must account for these transitions through meticulous calibration procedures to ensure that the data being captured accurately reflects the device’s performance.
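
One common way to handle those transitions is de-embedding: characterize each fixture segment on its own, then mathematically strip its contribution out of the raw measurement. The sketch below uses the open-source scikit-rf package with hypothetical Touchstone files; production calibration methods such as TRL or LRRM are considerably more involved, but the cascade arithmetic is the core idea:

    import skrf as rf

    # Hypothetical Touchstone files: the input and output fixture halves (probe head,
    # launch, load-board traces) characterized separately, plus the raw VNA measurement
    # of fixture + DUT.
    left_fixture = rf.Network("input_transition.s2p")
    right_fixture = rf.Network("output_transition.s2p")
    measured = rf.Network("fixture_plus_dut.s2p")

    # Cascade the inverse fixture networks onto the measurement so that only the
    # device's own response remains.
    dut = left_fixture.inv ** measured ** right_fixture.inv

    print(dut.s21.s_db[:, 0, 0])   # corrected insertion loss vs. frequency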

“One of the key aspects when testing at high frequencies is ensuring measurement stability over time and temperature,” says Charles Schroeder, an NI Fellow. “Even minor shifts can affect your measurements, especially when you’re dealing with such short wavelengths. It’s not just about calibrating once. It’s about maintaining that calibration consistently through multiple test cycles.”

These challenges mean that accurate testing at high frequencies requires both technological capability and strict control over every aspect of the testing process. From the design of the probe cards to the calibration routines and environmental conditions, every variable must be carefully managed to ensure that results remain consistent across tests.

“Repeatability of connections and signal paths is a problem that hasn’t been completely solved,” says NI’s DaSilva. “Even with advanced test setups, when you’re dealing with millimeter-wave frequencies or higher, small variations in temperature, connections, or even how a probe touches a wafer can change your measurements. It’s one of those challenges that we’re constantly working to improve, but it’s difficult to eliminate entirely.”

The role of AI/ML
Beyond the specific hardware challenges, the sheer volume of data generated by testing high-frequency devices presents another obstacle. As devices transmit larger amounts of data across wider bandwidths, the need for real-time data processing becomes critical. This is particularly true for applications like automotive radar and advanced communication systems, where latency can directly impact the functionality of end products. Advanced data analysis techniques, including artificial intelligence/machine learning, increasingly are being explored as potential solutions, but they must be tailored to the unique demands of high-frequency testing.

“AI can really be effective in terms of managing the data processing,” says Kwan. “It’s not just about collecting the data, but also processing it in real-time at the edge and identifying patterns, such as defects or statistical anomalies. This can significantly speed up the analysis phase and even help with making real-time adjustments during the testing process.”
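
A minimal version of that anomaly-screening idea can be assembled from off-the-shelf tools. The sketch below trains an isolation forest on synthetic parametric test vectors and flags statistical outliers; production systems are far more elaborate, and every number here is invented purely for illustration:

    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)

    # Synthetic per-device test vectors: [gain (dB), noise figure (dB), output power (dBm)]
    normal = rng.normal(loc=[22.0, 3.5, 14.0], scale=[0.3, 0.1, 0.4], size=(500, 3))
    drifted = rng.normal(loc=[21.0, 4.2, 12.5], scale=[0.3, 0.1, 0.4], size=(10, 3))
    results = np.vstack([normal, drifted])

    # Fit on the bulk of the data, then flag statistical outliers as devices stream through.
    model = IsolationForest(contamination=0.02, random_state=0).fit(results)
    flags = model.predict(results)   # -1 = anomaly, 1 = normal

    print(f"flagged {np.sum(flags == -1)} of {len(results)} devices for review")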

However, AI/ML’s role extends beyond the testing phase, providing a means to monitor device performance throughout its lifecycle. High-frequency semiconductors, particularly those used in critical applications like aerospace and automotive, require consistent reliability. By embedding sensors within chips, manufacturers can use AI/ML to track operational data, predict potential failures, and adjust maintenance schedules before issues arise. This approach to predictive maintenance can prolong the life of high-frequency devices, ensuring that they perform reliably in the field.

“Testing throughout the life of the device is a must in order to prolong the life expectancy of designs/devices,” says Siemens’ Neerkundar. “RAS (reliability, availability and serviceability) is going to become not just essential, but a ‘must-do’ to track throughout the life of the chip and product, from manufacturing through repeat testing at the system level. With lower technology nodes, coupled with the devices running at high-frequency, the need for logical redundancy and plans for repair schemes are key to paving the way for increased life expectancy.”

AI/ML also offers the promise of optimizing testing protocols. By analyzing large datasets, AI can determine which tests are redundant and can be safely omitted, helping firms reduce both testing time and costs. This ability to streamline testing is particularly valuable for smaller players that need to maximize efficiency without compromising the quality of their test results.

“We generate a lot of data, and once we have that data, we use machine learning to find correlations between different tests,” says Wong. “It helps us decide which tests are redundant and allows us to reduce test time while maintaining high reliability.”
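
A simplified stand-in for that kind of analysis appears below. Real programs use far larger datasets and models more sophisticated than a plain Pearson correlation, and the test names, values, and threshold here are hypothetical, but the screen for near-duplicate tests works the same way:

    import pandas as pd

    # Hypothetical per-device results for a handful of parametric tests.
    df = pd.DataFrame({
        "gain_24ghz": [21.9, 22.1, 22.0, 21.8, 22.3, 22.2],
        "gain_26ghz": [21.7, 21.9, 21.8, 21.6, 22.1, 22.0],   # tracks gain_24ghz closely
        "nf_24ghz":   [3.4, 3.5, 3.6, 3.4, 3.5, 3.6],
        "pout_sat":   [14.1, 13.9, 14.0, 14.2, 13.8, 14.0],
    })

    corr = df.corr().abs()
    threshold = 0.95

    # List test pairs so strongly correlated that one of them is a candidate to retire.
    for i, a in enumerate(corr.columns):
        for b in corr.columns[i + 1:]:
            if corr.loc[a, b] > threshold:
                print(f"{a} and {b} correlate at {corr.loc[a, b]:.3f}; consider retiring one")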

Despite these advantages, implementing AI/ML in high-frequency testing comes with its own challenges. The need for precision at millimeter-wave frequencies means that AI must be carefully integrated to avoid misinterpreting data or missing subtle variations that could indicate underlying issues. While AI/ML offers promising pathways for improving the efficiency and accuracy of high-frequency testing, these techniques remain tools that complement, rather than replace, human expertise. High-frequency testing remains a complex and nuanced process that still requires the experience and knowledge of skilled engineers to make the most of the data.

Balancing cost and test
Testing at millimeter-wave frequencies requires advanced equipment and methodologies, which drive up the overall cost. Companies working with high-frequency applications face the challenge of justifying these costs while ensuring they don’t compromise on the accuracy and reliability of their tests.

“The economic aspect of testing is often overlooked,” says Schroeder. “In today’s economic climate, companies are cost conscious as they focus on maintaining or increasing their profitability. As a test and measurement company, our job is to give our customers options that allow them to balance measurement performance and cost. It’s easy to say we need the best equipment and techniques, but balancing that with the cost is where it gets tricky. You have to ask, ‘What’s the return on investment for each incremental improvement in precision?’”

One of the primary cost drivers is the need for specialized test equipment capable of handling frequencies beyond 50 GHz. ATE must be highly customized for millimeter-wave work, with upgrades in both software and hardware that can quickly add up. The financial investment for such high-end tools is significant, making it particularly challenging for boutique firms to justify the expense.

“Automatic Test Equipment that can operate at high frequencies will continue to be important for testing 6G devices at wafer and package levels,” says Neerkundar. “Additionally, many customers are exploring the use of system-level test (SLT) platforms to test full systems at fast functional frequencies. However, testing with advanced test equipment alone may not be sufficient where chips have complex digital content.”

For many companies, achieving this balance means exploring alternative testing strategies, such as SLT platforms and conducting tests earlier in the design process to identify potential issues before they become even more costly to address. SLT, in particular, enables testing at the system level rather than just the chip level, providing a more comprehensive evaluation of device performance under real-world conditions. This helps to ensure reliability in the field, but the higher costs of completing packaging and assembly make the potential loss for scrapped parts that much greater.

“The cost can be prohibitive, especially for smaller firms that don’t have the same budget as the larger players,” says Teradyne’s Vondran, returning to his earlier point that the hardest part of any new test technology is making it affordable enough to automate and commercialize for widespread distribution.

The decision-making process for testing investment isn’t just about choosing the right equipment. Determining the appropriate level of testing for a given application is likewise critical. Some companies are experimenting with over-the-air (OTA) testing to validate wireless communication capabilities, while others are focusing on advanced probe card setups that allow for precise, on-wafer measurements. These methods each have their advantages and cost considerations, which need to be evaluated based on the specific requirements of the high-frequency application.

“We’ve had to make a lot of decisions around how to balance cost and functionality in our test setups,” says Wong. “We do a lot of upfront simulations, especially in the electromagnetic and thermal domains. It helps us understand what to expect before we even get to the testing phase, which allows us to focus our resources on validating the design rather than starting from scratch with costly trial-and-error testing. For us, it’s about finding that sweet spot where we can do enough testing to ensure reliability without breaking the bank on test equipment.”

6G promises and problems
The advent of 6G technology promises to unlock unprecedented data transfer speeds, connectivity, and new applications that could redefine industries ranging from telecommunications to autonomous systems and beyond. While 5G was a significant leap, 6G is expected to deliver as much as a 10X improvement in performance, which can enable new use cases like ultra-reliable low-latency communication (URLLC), enhanced mobile broadband (eMBB), and even tactile internet applications where real-time feedback is critical. This new generation of connectivity could pave the way for advances in areas such as remote surgery, immersive AR/VR experiences, and high-precision industrial automation.

“The promise of 6G is immense, with the potential to redefine not just telecommunications but also how industries interact with data in real time,” says DaSilva. “But achieving those speeds and that level of connectivity means we’re pushing into frequency ranges that come with a whole new set of challenges.”

The technical demands of 6G will require operating in the terahertz frequency bands, where signals are significantly more susceptible to attenuation, reflections, and interference. Unlike previous generations, 6G’s reliance on these ultra-high frequencies means that even slight variations in signal paths or environmental conditions can dramatically affect performance. This presents a unique challenge for testing, where maintaining signal integrity and ensuring device performance becomes even more critical.

“The biggest challenge with 6G is the nature of the frequencies themselves,” says Kwan. “At these frequencies, signals don’t bend. They reflect, and that changes how we design and test devices. Maintaining a clean signal path through testing is critical, and that’s not easy when the wavelengths are so short.”

Testing at these frequencies also will require new methodologies, such as OTA testing, which can validate how devices perform in real-world conditions. This is especially important as 6G will heavily rely on beamforming and MIMO (multiple input, multiple output) technologies to maintain signal strength and direct signals precisely. However, these testing methods come with their own set of complexities, including the need for specialized equipment and the ability to replicate consistent testing environments.
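
The precision those beamforming schemes demand can be seen in a simple array-factor calculation. The sketch below models a hypothetical 16-element linear array and shows how residual per-element phase error, the kind of imperfection calibration is meant to remove, eats into main-beam gain; the element count, spacing, and error levels are illustrative only:

    import numpy as np

    rng = np.random.default_rng(1)

    N = 16                  # elements in a hypothetical uniform linear array
    d_over_lambda = 0.5     # half-wavelength element spacing
    steer_deg = 20.0        # intended beam direction

    n = np.arange(N)
    ideal_phases = -2 * np.pi * d_over_lambda * n * np.sin(np.radians(steer_deg))

    def array_gain_db(phases: np.ndarray, look_deg: float) -> float:
        """Normalized array-factor magnitude (dB) in the look direction."""
        steering = np.exp(1j * (2 * np.pi * d_over_lambda * n * np.sin(np.radians(look_deg)) + phases))
        return 20 * np.log10(abs(steering.sum()) / N)

    # Residual per-element phase errors (e.g. calibration left-overs) erode the main beam.
    for rms_err_deg in (0, 5, 15, 30):
        errors = np.radians(rng.normal(0, rms_err_deg, N))
        gain = array_gain_db(ideal_phases + errors, steer_deg)
        print(f"{rms_err_deg:>2} deg RMS phase error: main-beam gain {gain:6.2f} dB (ideal = 0 dB)")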

“Testing these functional interfaces independently to ensure structural integrity, irrespective of how they are designed for functional use, is key,” says Neerkundar. “In addition to I/Os, the memories and logic of high-frequency devices must be tested at or near functional frequencies using advanced timing-based fault models to ensure detection of delay defects.”

6G’s potential to transform industries also places pressure on the semiconductor industry to innovate rapidly. The race to be 6G-ready is not just about developing faster chips. It’s about ensuring that these devices can perform reliably across a wide range of environments and applications. This drives the need for early integration of testing into the design phase, as well as a focus on simulation to anticipate challenges before they occur.

“Fundamentally there has to be a compelling business model for millimeter wave technology,” says Teradyne’s Vondran. “It has to solve a problem in a way that you know is not as easy to solve with other kinds of technology, and that will be the pull that brings millimeter wave more to the forefront as opposed to being an exotic kind of technology today.”

Equipment cost is another factor that can’t be overlooked. Developing the infrastructure and testing capabilities to support 6G is a significant investment, particularly for smaller companies that may not have the deep pockets of larger industry players. For these firms, the cost-benefit analysis becomes even more crucial, balancing the need to stay competitive with the reality of limited resources.

“For smaller design houses like ours, the question will be how to make 6G testing feasible without stretching our budgets too thin,” says Ensilica’s Wong. “We’re relying on partnerships, custom solutions, and a lot of creative engineering to make it work, but the financial hurdle is significant.”

6G’s potential advantages go hand-in-hand with these technical and financial challenges. For the industry as a whole, succeeding in the transition to 6G means overcoming these barriers as well as ensuring that the new testing methodologies and tools are adaptable enough to keep up with future innovations.

“The faster the technology evolves, the more pressure it puts on testing to keep up,” says NI’s DaSilva. “You can’t cut corners on testing because the stakes are too high, but you also have to make sure that the testing process itself doesn’t become a bottleneck that delays your time to market.”

A need for standards
As the industry gears up for the adoption of 6G, testing practices must evolve to meet new regulatory requirements and industry standards. Standards bodies like 3GPP (the 3rd Generation Partnership Project) play a key role in defining the testing benchmarks for emerging wireless technologies, ensuring that devices meet specific performance and reliability criteria. These standards become especially important at higher frequencies, where maintaining compliance with safety regulations around electromagnetic emissions and ensuring interoperability between different devices and networks are critical.

“Standards would play a valuable role and [provide] guidance in testing high-frequency devices,” says Neerkundar. “The more protocols that come into play for these high-speed interfaces, the more it will help heterogeneous integration as well as planning for testing these interfaces seamlessly.”

Standards for OTA testing are particularly relevant for 6G, as they ensure that wireless devices can communicate effectively across a range of frequencies and real-world conditions. As new applications like autonomous vehicles and smart cities rely on consistent, high-speed connectivity, compliance with these standards becomes a key differentiator for semiconductor manufacturers. Being able to meet these requirements without adding excessive costs to the testing process is a balancing act that companies will need to navigate as 6G becomes a reality.

“We’ve seen some early experiments with OTA testing for 6G, and while it’s promising, it’s also clear that it’s not a one-size-fits-all solution,” says Wong. “It’s possible for parts to pass tests well within the performance parameters but still not function in a particular application. The challenge is ensuring that your test results correlate closely with how these devices will actually perform in the field. Without that, you risk misjudging a product’s capabilities.”

Conclusion
As the demand for faster data speeds and more reliable connectivity grows, so do the technical challenges associated with testing these advanced devices. From maintaining signal integrity and thermal stability to adapting testing methods for new frequency ranges, the industry must navigate a complex landscape. Each of these hurdles underscores the need for continuous innovation and the integration of new tools and methodologies. The shift from conventional testing approaches to those that incorporate AI/ML and advanced simulation marks a significant evolution in how companies approach these challenges.

Yet even with these technological advancements, the balance between precision and cost-efficiency remains a critical concern. The high costs of specialized testing equipment, along with the need for advanced expertise in managing these systems, create barriers that not every company can easily overcome. However, as firms adopt creative solutions, such as custom probe cards, over-the-air testing, and strategic industry partnerships, they can continue to drive progress. This collaborative approach ensures that high-frequency testing keeps pace with the rapid technological advancements on the horizon.
