Enhancing the connections between data sources to enable faster, more reliable decision-making.
Dr. Ming Zhang, vice president of Fabless Solutions at PDF Solutions, delivered the keynote at the TestConX 2025 conference in March. As he began his presentation, Ming borrowed the “learn, explore and share” line from Ira Feldman, the conference organizer, to set the tone of his talk. He promised to share what he and PDF Solutions have learned, both what’s useful and what’s not, as he described the new frontier of AI for test and the industry shifts driving it.
While challenges related to data complexity, model adaptability and security remain, advancements in AI modeling, connected data systems and adaptive testing strategies offer a vision of what’s possible. Semiconductor manufacturers that invest in AI will enhance their processes and position themselves as leaders in this new frontier, he opined.
Testing is the tie between design and manufacturing, and that tie is fraying under the weight of advanced packaging complexity and multi-vendor partnerships. Test engineers now face challenges including design complexity, process variability and supply chain security. AI could address these challenges by enhancing the connections between data sources and enabling faster, more reliable decision-making.
By acknowledging the interplay between data, modeling and infrastructure, stakeholders can unlock the full potential of AI for semiconductor testing, Ming believes. The path forward, he added, is one of learning, exploring and a shared commitment to overcoming challenges through innovation.
It’s not as easy as it seems, he acknowledged. Integrating AI into semiconductor testing presents unique challenges that span technical, operational and logistical domains. For example, semiconductor data is heterogeneous, spanning numerical results, parametric measurements and visual records. Inconsistencies across datasets increase the difficulty of applying generalized AI algorithms.
Model maintenance is a huge consideration: AI models in manufacturing require continuous monitoring and upkeep as physical conditions change, such as equipment calibration shifts and evolving process outputs. This necessitates real-time adaptation to ensure accuracy.
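As a rough illustration of what that kind of monitoring can look like (not something Ming presented, and with a made-up measurement, window sizes and threshold), a drift check might compare a recent window of parametric test results against a historical baseline and flag when a model should be reviewed or retrained:

```python
# Minimal drift-monitoring sketch; the Vdd-min example, window sizes and
# threshold are illustrative assumptions, not from the keynote.
import numpy as np

def drift_score(reference: np.ndarray, recent: np.ndarray) -> float:
    """z-statistic of the recent window's mean against the reference window."""
    ref_mean, ref_std = reference.mean(), reference.std(ddof=1)
    return abs(recent.mean() - ref_mean) / (ref_std / np.sqrt(len(recent)) + 1e-12)

def needs_recalibration(reference, recent, threshold=3.0) -> bool:
    """Flag the model for review when a monitored parameter drifts significantly."""
    return drift_score(np.asarray(reference), np.asarray(recent)) > threshold

rng = np.random.default_rng(0)
baseline = rng.normal(0.62, 0.01, size=5000)   # historical Vdd-min readings (V)
today = rng.normal(0.64, 0.01, size=200)       # after an equipment calibration shift
print(needs_recalibration(baseline, today))    # True -> trigger retraining or review
```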
Not to be overlooked are deployment constraints, which vary with requirements. Edge computing offers control at the equipment level, while server- and cloud-based deployments can support more complex analytics at the cost of added latency.
Security sensitivity may be the most important consideration today given that testing data is inherently sensitive, encompassing yield, cost and coverage. Maintaining security while leveraging AI insights is a balancing act.
That’s not to say there aren’t opportunities for AI in semiconductor testing, Ming reasoned. AI unlocks enormous potential across several testing applications, starting with adaptive testing. It enables dynamic testing strategies based on historical component data, optimizing test coverage and cost. For instance, AI can ensure higher-quality testing for critical components while minimizing redundant tests for less demanding applications.
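To make the idea concrete, a simple adaptive-test policy might pick a test flow from historical yield data and the criticality of the end application; the flow names and thresholds below are hypothetical, offered only as a sketch rather than PDF Solutions’ actual rules:

```python
# Hypothetical adaptive-test policy sketch; thresholds and flow names are
# illustrative assumptions only.
def select_test_flow(historical_fail_rate: float, is_critical: bool) -> str:
    """Choose test coverage from prior yield data and application criticality."""
    if is_critical or historical_fail_rate > 0.02:
        return "full"      # complete program, including margin and stress tests
    if historical_fail_rate > 0.005:
        return "standard"  # default coverage
    return "reduced"       # trim redundant tests for historically clean material

print(select_test_flow(0.001, is_critical=False))  # reduced
print(select_test_flow(0.001, is_critical=True))   # full
```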
An intelligent system binning tool that groups chiplets with similar characteristics could maximize economic value. This approach will become increasingly important as chip designs shift toward multi-die configurations.
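A toy version of such a binning step could cluster chiplets on their parametric signatures so that well-matched die end up in the same package; the features, values and bin count below are invented purely for illustration:

```python
# Illustrative chiplet-binning sketch using k-means; feature names, values and
# the number of bins are assumptions, not a real product flow.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Each row: [max frequency (GHz), leakage current (mA)] for one tested chiplet.
chiplets = np.vstack([
    rng.normal([3.2, 10.0], [0.05, 0.5], size=(50, 2)),   # fast but leaky
    rng.normal([2.8, 6.0],  [0.05, 0.5], size=(50, 2)),   # slower, low leakage
])

bins = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(chiplets)
# Pairing chiplets from the same bin keeps a multi-die package from being
# limited by one badly mismatched die.
print(np.bincount(bins))
```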
Predictive binning and burn-in reduction could pay off as well. Predictive algorithms will identify potential failures earlier in the chip’s lifecycle, reducing costs associated with downstream testing and packaging. This will be especially relevant for optimizing system-level testing (SLT) in complex AI systems.
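One way to picture that kind of predictive screening (again a hypothetical sketch, not the method described in the keynote) is a classifier trained on earlier test-insertion data that estimates each unit’s risk of failing downstream, so only high-risk units are routed to burn-in or SLT:

```python
# Hypothetical predictive-screening sketch; the synthetic data, features and
# the 0.2 risk cutoff are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 4))                         # wafer-sort parametric features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 1.5).astype(int)

model = LogisticRegression().fit(X[:800], y[:800])     # train on history with known outcomes
risk = model.predict_proba(X[800:])[:, 1]              # predicted downstream-failure risk
to_burn_in = risk > 0.2                                # only high-risk units get burn-in
print(f"{to_burn_in.mean():.0%} of new units routed to burn-in")
```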
In another example from Ming, connected data systems could integrate information from wafer fabs, design offices and assembly lines. These systems would enable AI to provide actionable insights, ensuring high-quality output while supporting long-term reliability and security.
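In data terms, the simplest version of that connection is joining records from each stage on a shared unit identity; the tables and column names below are invented solely to illustrate the idea:

```python
# Toy sketch of a connected data view; tables, columns and values are
# hypothetical, invented only to illustrate cross-stage joins.
import pandas as pd

fab = pd.DataFrame({"die_id": ["A1", "A2"], "leakage_mA": [6.1, 9.8]})
sort = pd.DataFrame({"die_id": ["A1", "A2"], "sort_bin": [1, 2]})
assembly = pd.DataFrame({"die_id": ["A1", "A2"], "package_lot": ["P7", "P7"]})

# One joined view lets downstream models see fab, test and assembly context together.
unit_view = fab.merge(sort, on="die_id").merge(assembly, on="die_id")
print(unit_view)
```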
Toward the end of his keynote, Ming pointed out the need to address the challenge of small data. Unlike environments with vast datasets, such as internet search, semiconductor testing often works with small, highly variable data sets, which calls for adaptable AI strategies that still require further exploration. One could be real-time monitoring: continuous tracking of data health and model performance for timely adjustments. Another could be transfer learning, which uses insights from one product or factory to accelerate AI deployment in new scenarios. A third could be partitioned architectures that separate data management, model development and automation.
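As a rough sketch of the transfer-learning idea under small-data constraints (with synthetic data and a generic linear model standing in for whatever would be used in practice), a model warm-started on a data-rich product can be adapted with the handful of labeled units available for a new one:

```python
# Illustrative transfer-learning sketch; the products, features and sample
# sizes are synthetic assumptions, not real factory data.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(3)
w = np.array([1.0, 0.5, 0.0, 0.0, 0.0, 0.0])
Xs = rng.normal(size=(5000, 6))              # mature "source" product: plenty of data
ys = (Xs @ w > 0).astype(int)
Xt = rng.normal(size=(80, 6)) + 0.3          # new "target" product: small, shifted data
yt = (Xt @ w > 0.3).astype(int)

clf = SGDClassifier(loss="log_loss", random_state=0)
clf.partial_fit(Xs, ys, classes=[0, 1])      # learn the source product first
for _ in range(20):                          # then adapt on the small target set
    clf.partial_fit(Xt, yt)
print(f"accuracy on the new product: {clf.score(Xt, yt):.2f}")
```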
Generative AI offers an exciting avenue for optimizing routine workflows in semiconductor testing, he added. By automating rule creation and calibration tasks on test floors, AI could reduce human intervention while maintaining precision. He conceded, though, that workflows often require a balance between AI-driven decisions and expert oversight for reliable, accurate system performance.
Semiconductor testing is not a siloed discipline, Ming cautioned. It encompasses the entire lifecycle of a product from design through manufacturing and in-field operations. AI solutions designed for testing must integrate across these stages to provide holistic benefits. Improved connectivity and predictive modeling will enable manufacturers to deliver products faster with higher quality and lower costs.