Next-generation optical inspection is about more than just sensitivity. It’s about reliably seeing through complexity.
As semiconductor devices continue advancing into more sophisticated packaging schemes, traditional optical inspection technologies are brushing up against physical and computational boundaries.
The growing reliance on 2.5D and 3D integration, hybrid bonding, and wafer-level processes has made it much harder to detect defects consistently and early enough to protect yields. While optical inspection remains central to process control, it is evolving in ways that are challenging long-held assumptions about where and how it can be applied.
For years, the approach to optical inspection followed a predictable pattern — push sensitivity and resolution to keep up with shrinking geometries. These efforts have delivered measurable progress, but they also have introduced new complications that go far beyond simple scaling challenges.
“When feature sizes shrink by half, you need four times the number of pixels in the sensor to cover the same area at the same resolution,” said John Hoffman, director of R&D at Nordson Test & Inspection. “That scaling puts constant pressure on camera technology and data processing.”
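Hoffman's scaling point can be checked with simple arithmetic. The sketch below uses illustrative numbers (field size, sampling of 4 pixels per feature) that are assumptions for the example, not vendor specifications:

```python
def required_pixels(field_um: float, feature_um: float, px_per_feature: int = 4) -> int:
    """Total sensor pixels needed to resolve features of a given size
    across a square field of view, sampling px_per_feature pixels per feature."""
    pixel_um = feature_um / px_per_feature      # required pixel size on the wafer
    per_axis = field_um / pixel_um              # pixels along one axis
    return int(per_axis) ** 2                   # total pixels for the square field

base = required_pixels(field_um=1000, feature_um=10)  # 10 um features
fine = required_pixels(field_um=1000, feature_um=5)   # features shrink by half
print(fine // base)  # -> 4: halving the feature size quadruples the pixel count
```

The same 4x factor carries straight through to data rates, which is why camera bandwidth and downstream processing scale together.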
Additionally, the shift toward 3D-ICs and other advanced packaging has created inspection scenarios that traditional approaches were not designed to handle. High-resolution optics and advanced illumination often generate overwhelming nuisance counts, while thinner wafers and taller stacks create warpage and depth-of-focus problems that undermine measurement stability.
“In advanced packaging, the challenge isn’t just about resolution,” said Damon Tsai, head of product marketing for inspection at Onto Innovation. “With multiple dies stacked together, sometimes the real question is whether you can see the relevant structures at all.”
These constraints have forced many manufacturers to rethink inspection handling and fixturing. Equipment suppliers and research groups have turned to multi-channel lighting, infrared and laser-based sensing, and more advanced signal processing to address the visibility and measurement challenges of next-generation devices.
From resolution to usability
For most of the last two decades, improvements in optical inspection have centered on capturing ever-finer details. As nodes shrank and features approached single-digit microns, investments focused on increasing lateral resolution, shrinking pixel sizes, and extending depth of field. The assumption was that better optics and more precise illumination would naturally yield better defect detection and higher yields. In many respects, this approach worked. Sensitivity thresholds improved significantly, and optical systems became more capable of measuring smaller features with consistent repeatability.
However, the shift to 3D integration, hybrid bonding, and wafer-level processes has revealed a more complicated reality. In these environments, simply seeing smaller details is often not enough. High-resolution optics can magnify irrelevant background textures or highlight nuisance particles that are not functionally significant. When devices are stacked 12 or more layers high, or when redistribution layers become more intricate, optical inspection systems can return tens of thousands of potential defect signals per wafer. Reviewing, classifying, and sorting this volume of data can slow production and undermine the productivity gains that advanced packaging is meant to deliver.
“In many cases, maximizing sensitivity also amplifies background noise and increases the number of nuisance detections,” said Tsai. “That tradeoff has made filtering and classification strategies more important.”
Even when optical clarity improves, usability depends on how efficiently data can be interpreted. The problem is not only collecting more images or higher-resolution scans, but also deciding which information is relevant to process control and which signals can be ignored. As inspection systems generate larger datasets, the burden shifts to classification and correlation rather than simply detection.
“You have millions of data points inside every chip,” said Marc Jacobs, senior director of solutions architecture at PDF Solutions. “When you analyze inspection data in a personalized way, you’re not just applying limits. You’re comparing expected values to what’s actually measured. That’s how you improve detection of anomalies.”
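The expected-versus-measured comparison Jacobs describes can be sketched as a residual check. This is a minimal illustration, not PDF Solutions' method; it uses a robust median/MAD score (rather than mean and standard deviation) so a single large excursion does not inflate the spread and mask itself:

```python
import statistics

def flag_anomalies(expected, measured, z_limit=3.5):
    """Flag sites whose measured-vs-expected residual is a robust outlier.

    Scores each residual against the median and the median absolute
    deviation (MAD), so one big excursion can't hide by widening the spread.
    """
    residuals = [m - e for m, e in zip(measured, expected)]
    med = statistics.median(residuals)
    mad = statistics.median(abs(r - med) for r in residuals)
    if mad == 0:
        # Degenerate case: most residuals identical; flag anything different.
        return [i for i, r in enumerate(residuals) if r != med]
    return [i for i, r in enumerate(residuals)
            if 0.6745 * abs(r - med) / mad > z_limit]

expected = [10.0] * 8
measured = [10.02, 9.99, 10.03, 10.0, 12.0, 10.01, 9.98, 10.02]
print(flag_anomalies(expected, measured))  # -> [4]
```

The point of the comparison is exactly what Jacobs describes: limits alone would pass every site here, but the personalized expected value makes the one drifting site stand out.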
The growing reliance on AI and machine learning is one response to this problem. But AI alone does not resolve the underlying challenge that detection thresholds and inspection strategies have to be tuned for different packaging scenarios. A defect that is critical in a fan-out wafer might be acceptable in a less-dense assembly, or vice versa. These contextual factors often require closer collaboration between process engineers and inspection teams to define what counts as a yield-limiting defect. For many manufacturers, this shift from maximum resolution to maximum usability has become the defining test of inspection strategy.
Warpage, thin wafers, and hybrid bonding challenges
The move toward thinner substrates and taller device stacks has introduced an array of mechanical challenges that complicate optical inspection in ways that front-end processes rarely encounter. In advanced packaging, warpage is no longer an occasional nuisance, but a persistent factor that can distort measurements, obscure features, and cause reproducibility issues from one wafer to the next. As packaging density increases and devices migrate to 2.5D and 3D formats, the mechanical stability of the wafer or panel becomes as important as the optical performance of the inspection tool itself.
Inspecting these thin, often flexible structures requires either flattening the wafer or adjusting the optics dynamically to account for uneven topography. Both strategies introduce tradeoffs. Flattening can create mechanical stress or breakage, while dynamic focus tracking demands sophisticated sensing and control loops that can slow throughput or reduce measurement consistency. In some cases, the only viable solution is to combine better mechanical fixturing with advanced image correction algorithms to stabilize data collection.
“As packaging structures grow more complex, it’s no longer enough to just focus and measure,” said Samuel Lesko, head of applications development at Bruker. “You need to remove warpage to the best extent, creating a reference state that stays reproducibly flat, or metrology results won’t be meaningful.”
Engineers have experimented with new materials and vacuum chuck designs to improve handling. These mechanical adaptations are often the first line of defense against distortion, especially as wafers become thinner and more prone to bending under their own weight. A uniform support surface can reduce localized stress points that create measurement errors or even lead to breakage. But even with careful engineering, warpage remains a moving target that requires constant monitoring and compensation.
“Warpage is an issue, and we have to be able to track it,” said Hoffman. “That usually means using some external rough estimate sensor to measure how the surface is deformed, and then adjusting the position of the sensors so it can be measured.”
Ceramic vacuum chucks, for example, can distribute pressure more evenly across thin substrates and reduce localized deformation. But even with careful fixturing, wafer warpage can exceed 100 microns across a single die, which far surpasses the depth of focus range in most conventional optical systems. The result is frequent defocus, blurred measurement edges, and unreliable overlay data.
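The mismatch between warpage and depth of focus translates directly into extra acquisition work. A rough back-of-the-envelope sketch, assuming illustrative numbers for warpage, depth of focus, and stitching overlap:

```python
import math

def focus_steps(warpage_um: float, dof_um: float, overlap: float = 0.2) -> int:
    """Number of focus positions needed to cover a warped field.

    Each focus step usefully covers the depth of focus minus an overlap
    margin so adjacent height slices can be stitched together.
    """
    effective = dof_um * (1.0 - overlap)
    return max(1, math.ceil(warpage_um / effective))

# 100 um of warpage against a few microns of depth of focus:
print(focus_steps(warpage_um=100, dof_um=5))  # -> 25 focus positions
```

Every additional focus position costs acquisition time, which is why mechanical flattening and focus-decoupled techniques such as interferometry are attractive alternatives to brute-force z-stacking.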
Decoupling the optical signal from the limitations of the objective lens has become an active area of research. Approaches such as white-light interferometry are being developed to help maintain vertical resolution even when the surface is uneven or tilted. These techniques can compensate for certain types of distortion by separating topographical information from variations in focus. However, they still rely on a predictable baseline shape and known boundary conditions to produce consistent results.
“White Light Interferometry shrinks the depth of field below 2µm independently from optical magnification, maintaining vertical resolution even when surfaces are transparent and uneven or warped,” said Lesko. “That separation is what helps the system distinguish genuine topography from optical artifacts.”
Hybrid bonding introduces yet another layer of complexity. Unlike conventional solder bumps or microbumps, hybrid bonded interfaces can harbor voids and disbonds that are difficult to detect with optical techniques alone. These voids may be only a few nanometers deep, but they can lead to electrical discontinuity or long-term reliability failures. In some cases, optical inspection is combined with acoustic methods or atomic force microscopy to verify bonding quality. The need for nanometer-scale characterization has driven equipment makers to explore hybrid metrology strategies that integrate multiple sensing modes in a single platform.
“Mobilizing atomic force microscopy and optical profiling has become more important as bonding pitches shrink and interfaces require nanometer-scale characterization,” said Lesko. “Consolidating information from both techniques gives you the ability to screen defects and to get extensive information on the defective pad and the interface quality.”
Fig. 1: Common process defects of hybrid bonding. Source: Onto Innovation
This balancing act between physical flatness, optical clarity, and hybrid bonding verification remains one of the most persistent challenges in inspection for advanced packaging. Even incremental improvements in fixturing, signal processing, and multi-modal sensing can make a measurable difference in yield and reliability, which is why so many development efforts now focus as much on stabilizing the measurement environment as on increasing resolution.
Multiple illumination and multi-channel approaches
As inspection challenges have multiplied, relying on a single illumination source has proven inadequate for many advanced packaging processes. Traditional brightfield or darkfield lighting often struggles to differentiate between contamination, structural defects, and background roughness in multi-layer stacks. The result can be inconsistent detection or high rates of false positives, especially when the process introduces a mix of organic residues, thin-film voids, and varied surface reflectivity.
To address this, equipment makers have turned to multi-channel illumination strategies that combine different wavelengths, angles, and polarization states. Infrared imaging has become more common because it can penetrate silicon to reveal buried interfaces, although it still has limits when metals are involved. Laser-based illumination has emerged as another option for highlighting residues that would otherwise blend into textured backgrounds. For example, certain organic films left behind during copper-to-copper bonding can remain invisible under conventional lighting, but become detectable with narrow-band laser sources tuned to increase contrast.
“If you use traditional brightfield, the light comes from the top and you see through everything,” said Onto’s Tsai. “When you add multiple angles of illumination, you can reduce the pre-layer noise and make specific features stand out.”
Beyond wavelength, the angle of illumination plays a critical role in reducing nuisance signals. Oblique or multi-angle lighting can suppress reflections from deeper layers or enhance contrast on specific surfaces. In practice, these configurations often require careful balancing to avoid introducing artifacts or degrading measurement consistency. Different illumination channels also need to be matched with corresponding detection optics and filters to maintain adequate signal quality.
While hardware improvements help expand what can be imaged, the complexity of the resulting datasets has grown just as quickly. Multi-channel inspection often generates several times more raw data per wafer than conventional brightfield inspection. This added burden has made embedded signal processing and machine learning essential to separate critical defect signatures from benign variations.
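The data growth is easy to quantify. The numbers below (fields per wafer, sensor size, bit depth) are illustrative assumptions, but they show why adding illumination channels multiplies raw volume linearly:

```python
def raw_gb_per_wafer(fields: int, px_x: int, px_y: int,
                     channels: int, bytes_per_px: int = 2) -> float:
    """Raw image data per wafer, in GB, for a multi-channel scan."""
    return fields * px_x * px_y * channels * bytes_per_px / 1e9

single = raw_gb_per_wafer(fields=2000, px_x=4096, px_y=4096, channels=1)
multi = raw_gb_per_wafer(fields=2000, px_x=4096, px_y=4096, channels=5)
print(round(single, 1), round(multi, 1))  # -> 67.1 335.5
```

At hundreds of gigabytes per wafer, shipping everything to an offline server stops being practical, which is the economic argument for the embedded processing discussed below.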
One of the ongoing debates is whether adding more illumination modes will yield diminishing returns. Each additional channel increases cost, calibration time, and the risk of inconsistent results between systems. For now, most engineers working in hybrid bonding and high-density packaging agree that multiple illumination modes have become a necessity rather than a luxury, even if the tradeoffs are still being worked out.
“There’s always been this tension of having an algorithm that can solve a problem but takes too much compute to run at scale,” said Nordson’s Hoffman. “The same applies to illumination: The more channels you add, the more you have to think about data volume and consistency.”
AI and machine learning take control
The steady rise of multi-channel inspection has created a new problem — far more data than any human team can review in a practical timeframe. Even a modest increase in sensitivity or illumination complexity can result in orders of magnitude more raw images and feature maps to classify. As a result, machine learning is no longer a specialized add-on. It has become an indispensable mechanism for turning high-resolution optical signals into actionable information.
The first wave of AI in inspection focused mainly on image classification, applying trained models to identify known defect types and classify them into categories. This approach helped reduce nuisance counts and accelerated the process of verifying whether a flagged feature was truly relevant to yield. Over time, these models have become more sophisticated, learning not only what a defect looks like but how it behaves in different process steps and material stacks.
“Inspection workflows used to require extensive manual recipe setup and conversion from CAD data,” said Charlie Zhu, vice president of research and development at Nordson. “Now we’re using AI to automatically generate those recipes, which cuts setup time and reduces errors.”
A growing number of inspection systems have moved away from centralized computing clusters that process data after acquisition. Instead, some tools embed AI models locally, allowing immediate classification and filtering. This hardware-accelerated approach can improve throughput by eliminating the delays of transferring massive datasets to external servers. However, it also raises questions about transparency and reproducibility. Neural networks are notoriously opaque, and some engineers remain concerned about relying on decision-making logic that is difficult to verify.
“In many cases, traditional algorithms still work better for certain precise measurements because they operate at native resolution,” Zhu said. “Deep learning models often downscale and then rescale data, which can lose some positional accuracy.”
The other challenge with AI is the risk of overfitting models to specific product types or process conditions. When new materials or structures are introduced, training data may no longer represent the variability of production.
“There’s no single model that works everywhere,” said Hoffman. “Even if you train it on a lot of data, it can drift or overfit when the product mix changes or when customers introduce something you haven’t seen before.”
For this reason, many companies have adopted hybrid workflows that combine deterministic algorithms with machine learning filters. This layered approach allows teams to preserve high-confidence measurements while still benefiting from AI’s ability to flag unexpected anomalies or recognize subtle defect patterns.
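The layered approach can be sketched as a simple triage pipeline. This is a generic illustration, not any vendor's workflow; the size threshold and score cutoff are hypothetical, and `ml_score` stands in for a trained classifier:

```python
def triage(candidates, size_limit_um=2.0, ml_score=None, keep_above=0.5):
    """Layered defect triage: deterministic rule first, ML filter on the rest.

    candidates: dicts with at least a 'size_um' field.
    ml_score:   stand-in callable returning a defect probability in [0, 1].
    Returns (confirmed defects, candidates kept for review); everything
    else is dropped as nuisance.
    """
    confirmed, review = [], []
    for c in candidates:
        if c["size_um"] >= size_limit_um:
            confirmed.append(c)          # deterministic: always yield-relevant
        elif ml_score and ml_score(c) >= keep_above:
            review.append(c)             # ML: subtle pattern worth a look
    return confirmed, review
```

A usage example with a toy scorer: large defects bypass the model entirely, so the high-confidence path stays fully auditable even if the ML layer changes.

```python
candidates = [{"size_um": 3.0, "score": 0.2},
              {"size_um": 0.5, "score": 0.9},
              {"size_um": 0.4, "score": 0.1}]
confirmed, review = triage(candidates, ml_score=lambda c: c["score"])
print(len(confirmed), len(review))  # -> 1 1
```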
Despite these limitations, AI-enabled inspection already is proving essential in high-volume advanced packaging environments. The speed and scale of data classification simply would not be sustainable with traditional rule-based logic alone. Most engineers expect that as processes continue to evolve, the balance between deterministic and data-driven inspection will remain a moving target.
Integrating metrology data and process feedback
While improvements in optics and machine learning have expanded what can be detected, there is still a gap between identifying a defect and understanding what it means in context. Inspection data alone does not always reveal whether a deviation will impact device reliability, especially in high-mix production environments where process windows shift frequently. For this reason, companies have invested heavily in connecting inspection systems with process control and design data to build more predictive models of yield.
“Embedding AI directly inside the inspection tool reduces the need to transfer large datasets to external servers for analysis,” said Tsai. “This approach helps maintain throughput while still giving you enough information to feed process control systems.”
Traditionally, inspection outputs were treated as separate records, archived for traceability or used in retrospective failure analysis. Increasingly, manufacturers expect inspection results to feed directly into closed-loop process adjustments. By comparing measurements to statistical process control (SPC) models and design intent, teams can adjust deposition rates, etch profiles, or bonding parameters in near real-time. In some advanced workflows, inspection data also informs adaptive recipe generation, where new process parameters are calculated dynamically based on recent metrology trends.
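A closed-loop adjustment of this kind can be sketched in a few lines. The control limits, gain, and measurements here are illustrative assumptions, not real tool values; the structure (act only outside SPC limits, and apply a damped correction) is the point:

```python
def closed_loop_update(target: float, measured: list, sigma: float,
                       gain: float = 0.5) -> float:
    """Return a process-parameter offset from recent metrology (a sketch).

    If the mean of recent measurements drifts outside +/-3 sigma control
    limits, feed back a damped correction toward target; otherwise leave
    the recipe alone to avoid chasing noise.
    """
    mean = sum(measured) / len(measured)
    drift = mean - target
    if abs(drift) <= 3 * sigma:
        return 0.0                  # inside SPC limits: no action
    return -gain * drift            # damped correction, not a full snap-back

# In control: no adjustment. Out of control: partial correction.
print(closed_loop_update(100.0, [100.5, 99.8, 100.2], sigma=1.0))
print(round(closed_loop_update(100.0, [104.2, 104.5, 103.9], sigma=1.0), 2))
```

The deliberately partial gain mirrors standard run-to-run control practice: overcorrecting on a single noisy sample can destabilize the process as badly as not correcting at all.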
“Machine learning has advanced to the point where models can carry over knowledge from earlier designs, even when customers are moving quickly between process nodes,” said PDF’s Jacobs. “That helps reduce sample requirements and shortens learning cycles.”
“Machine learning definitely helps reduce the amount of false alerts in manufacturing,” said David Park, vice president of marketing at Tignis. “You’re never going to take them to zero, but you can significantly cut down the time wasted on chasing things that don’t matter, and help people get to the real root cause much more quickly.”
This integration is particularly important in advanced packaging, where known good die strategies only succeed if the definition of “good” remains current. As new materials are introduced and interconnect geometries evolve, reference libraries built on older processes may not reflect actual variability. For this reason, many companies rely on hybrid approaches that combine deterministic SPC models with continuous learning systems that adapt to subtle process shifts.
Linking inspection data with design information also helps improve sampling strategies. Rather than inspecting wafers randomly or at fixed intervals, manufacturers can target high-risk zones predicted by simulation. This selective sampling approach reduces inspection overhead without sacrificing sensitivity. However, it requires careful alignment of design, metrology, and production workflows, a challenge that has kept many organizations from fully automating their feedback loops.
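Risk-targeted sampling reduces, at its core, to ranking sites by a predicted risk score and spending a fixed inspection budget on the worst ones. A minimal sketch, with hypothetical die names and scores standing in for simulation output:

```python
def select_sites(risk_by_site: dict, budget: int) -> list:
    """Pick the highest-risk inspection sites under a fixed site budget."""
    ranked = sorted(risk_by_site, key=risk_by_site.get, reverse=True)
    return ranked[:budget]

# Scores would come from simulation or prior-layer metrology in practice.
risk = {"die_03": 0.82, "die_17": 0.11, "die_21": 0.64, "die_40": 0.05}
print(select_sites(risk, budget=2))  # -> ['die_03', 'die_21']
```

The hard part is not the ranking but keeping the risk scores current, which is exactly the design/metrology/production alignment problem described above.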
Even among companies making progress in this area, questions remain about data governance and model transparency. Machine learning models trained on proprietary process information can be difficult to audit, and integrating datasets from multiple vendors introduces compatibility and security concerns. For most teams, the practical solution has been to build modular systems that allow deterministic rules and machine learning outputs to coexist without relying entirely on either.
Conclusion: The road ahead
As advanced packaging continues to evolve, optical inspection is likely to face an expanding set of technical and operational challenges. New process variations, such as wafer-to-wafer hybrid bonding, backside power delivery, bi-facial processing, and heterogeneous materials integration, will push traditional inspection architectures past their current limits. Each of these innovations introduces more variability that can obscure defects and create additional demands on measurement reproducibility. The push toward thinner substrates and taller stack counts will only intensify these pressures.
Warpage, surface deformation, and inconsistent reflectivity are expected to become more difficult to control as line widths shrink and interconnect density increases. At the same time, as AI models are applied to more varied defect scenarios, manufacturers will need clear frameworks for validation and governance. Overfitting, drift, and lack of transparency will remain persistent risks as inspection moves beyond detection into predictive analytics.
Despite these obstacles, the future of optical inspection is likely to be defined by new opportunities rather than incremental improvements. Advances in multi-channel imaging, embedded machine learning, and tighter integration with design workflows are transforming inspection from a static checkpoint into a dynamic yield optimization tool. The next decade will be less about seeing the smallest possible features in isolation and more about connecting diverse data sources to create a clearer, more actionable picture of the entire process.
Related Reading
Advanced Packaging Fundamentals for Semiconductor Engineers
New SE eBook examines the next phase of semiconductor design, testing, and manufacturing.
E-Beam Inspection Proves Essential For Advanced Nodes
Throughput remains an issue. A solution will require a combination of technologies.