The move to 10/7nm and beyond creates the need for more precise metrology and alignment between different chip layers.
The overlay metrology equipment market is heating up at advanced nodes as the number of masking layers grows and the features that need to be aligned continue to shrink.
Both ASML and KLA-Tencor recently introduced new overlay metrology systems, seeking to address the increasing precision required for lines, cuts and other features on each layer. At 10/7nm, there may be 80 or more masking layers, versus 40 at 28nm. And if those layers are not precisely measured, the features being patterned, deposited and etched may not line up from one layer to the next.
The job of the overlay metrology system is to detect unwanted shifts in position between the layers, as well as process variations. This is a critical measurement, as overlay mishaps can impact the performance and yield of a chip, as well as its reliability in the field. But at each new node, it also becomes much more difficult.
“The big challenges come from the tight requirements and new sources of overlay errors,” said Harry Levinson, senior fellow and senior director of technology research at GlobalFoundries. “We need to make improvements for different contributors and determine that we have indeed made improvements for individual contributors. This means measuring precisely to an angstrom or two.”
As a point of reference, an angstrom is equal to 0.1nm. And this is where both ASML and KLA-Tencor see new opportunities cropping up.
“While it may seem that ASML and KLA-Tencor are on a collision course, they are actually approaching the overlay market from different directions,” said Bob Johnson, an analyst with Gartner. “ASML is using it in conjunction with its services model to help clients improve their lithographic processes and make their tools yield well at the leading edge. KLA-Tencor is using overlay as part of a suite of products that address all of the process issues which affect litho.”
In 2016, the overlay metrology equipment market was a $350 million business, according to Gartner. It is expected to grow about 12% in 2017, according to the firm. KLA-Tencor is the leader in terms of share with about 70% of the market. ASML and a few others compete in the segment.
Why is overlay critical?
In chip manufacturing, the goal is to align the various layers of a wafer in a precise manner, which represents good overlay. “For example, a transistor gate on one layer needs to be connected through a contact in another layer and to an interconnect wire in another layer. They all have to be lined up on top of each other,” said Aki Fujimura, chief executive of D2S. “Since (designers) know that overlay is not perfect, there is tolerance built into the design rules both geometrically and in the performance characteristics. Transistor performance, in particular, is designed to rely on critical dimension widths, but is tolerant to minor variations in edge placement. To scale standard cells smaller with each node, overlay accuracy needs to scale along with the feature sizes.”
Obtaining good overlay starts with lithography. The goal of a scanner is not only to print tiny features at fine resolutions, but also to place them precisely. Overlay is the ability of a lithography system to print accurate features on each layer exactly where they’re supposed to be. To accomplish that, tiny alignment marks are placed on both a wafer and a photomask. Then, in a lithography system, a wafer stage and a reticle stage align the appropriate marks with one another.
The scanner then prints a feature. This process is repeated 100 times or more to expose one mask layer on a wafer.
Lithography systems are capable of printing features with a certain overlay accuracy spec. One spec, dubbed single-machine overlay (SMO), involves the overlay accuracy between two layers printed on the same system, according to ASML.
At each node, chipmakers require better overlay accuracy. At 130nm, the SMO for a lithography tool was 40nm. In comparison, the latest 193nm immersion systems have an SMO of 2nm and below—at least on paper. But in reality, 193nm lithography reached its limit at 80nm pitch (40nm half-pitch). So starting at 22nm/20nm, chipmakers began using 193nm immersion with multiple patterning.
“For any given wafer layer for critical dimensions, particularly with 193i lithography, multiple masks are used to expose the patterns,” D2S’ Fujimura said. “The problem is exacerbated with more masks per wafer layer, like with quadruple patterning that soon will be required without EUV.”
There are other issues. “Overlay has always been a massive problem. However, in current advanced technology, we’ve made the problem even more challenging with multi-patterning,” said David Fried, vice president of computational products at Coventor, a Lam Research Company. “Many of the most difficult overlay challenges exist where layers (masks) need to align to self-aligned quadruple patterning (SAQP) with prior levels. The pitches of SAQP layers are getting very small, so aligning an edge of a later level to those pitches is really challenging. Also, SAQP layers present shapes that are not necessarily perfectly aligned to the design shape, due to effects like pitch-walk. So aligning a single mask to a SAQP layer may drive different errors to different shapes. It’s very complex, and getting significantly more difficult as we scale from 7nm to 5nm.”
The other effect related to overlay is stress and in-plane distortion. “Structures are getting more 3D, and the material systems are more complex,” Fried said. “These lead to more complex stress/strain effects on the wafer. This is especially true in the massive stacks used in advanced flows. In-plane distortion can cause different areas of the die to physically shift in different directions, making subsequent mask alignment even more challenging. There are improvements to be made here in materials and integration, but also in lithography and in-line process controls.”
Multiple patterning significantly increases the complexity of creating a chip. For example, a 28nm device has 40 to 50 mask layers. In comparison, using immersion/multi-patterning, a 14nm/10nm device has 60 layers, with 7nm expected to jump to 80 to 85. A 5nm device could have 100 layers.
A device with 100 layers is perhaps impractical. Aligning these layers with good overlay is challenging and expensive.
Fig. 1: Chips with multiple layers. Source: Applied Materials
So at 7nm and/or 5nm, the industry hopes to insert EUV. In theory, EUV reduces the number of layers in devices, but it also presents some overlay challenges. “With EUVL, we have contributions from mask non-flatness and mask 3D effects, which were not significant in optical lithography,” GlobalFoundries’ Levinson said.
Once EUV is inserted, the EUV scanners will likely be used for the metal one/two (M1/M2) layers on a device. On the same device, a chipmaker will use 193nm immersion systems for other layers.
So in a mix-and-match environment, it’s critical that the systems have a precise cross-matching overlay. That’s just the tip of the iceberg in terms of the overlay challenges at advanced nodes.
Getting over overlay
Traditionally, after the lithography tool has patterned the features on a wafer, the substrate is sent to the overlay metrology tool for evaluation. Overlay metrology involves taking a multitude of measurements between two (or more) different layers. In simple terms, the overlay metrology system captures the combined effects of tool alignment accuracy, wafer and mask distortions, and process variation.
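To give a sense of how those raw measurements get used, here is a minimal sketch of the kind of decomposition overlay analysis commonly performs: measured (x, y) offsets at a handful of sites are fitted to a simple linear model of translation, scaling and rotation. The model form, variable names and synthetic numbers are illustrative only, not any vendor's production algorithm.

```python
# Minimal sketch: fit measured overlay offsets to a simple linear model
# (translation, scaling, rotation) using least squares. The model form and the
# synthetic data are illustrative only, not any vendor's production algorithm.
import numpy as np

# Measurement sites across the wafer (mm) and measured offsets (nm), synthetic.
x = np.array([-100.0, -50.0, 0.0, 50.0, 100.0, 0.0, 0.0])
y = np.array([0.0, 0.0, 0.0, 0.0, 0.0, -100.0, 100.0])
dx = np.array([1.8, 1.4, 1.0, 0.6, 0.2, 2.1, -0.1])   # measured x-overlay (nm)
dy = np.array([-0.5, -0.2, 0.1, 0.4, 0.7, -0.9, 1.1])  # measured y-overlay (nm)

# Linear model: dx = Tx + Mx*x - R*y,  dy = Ty + My*y + R*x
# Unknowns: Tx, Ty (translation), Mx, My (scaling), R (rotation).
A = np.zeros((2 * len(x), 5))
A[0::2, 0] = 1.0    # Tx term in the dx equations
A[0::2, 2] = x      # Mx*x
A[0::2, 4] = -y     # -R*y
A[1::2, 1] = 1.0    # Ty term in the dy equations
A[1::2, 3] = y      # My*y
A[1::2, 4] = x      # +R*x
b = np.ravel(np.column_stack((dx, dy)))  # interleave dx, dy per site

coeff, *_ = np.linalg.lstsq(A, b, rcond=None)
Tx, Ty, Mx, My, R = coeff
residuals = b - A @ coeff  # what remains after the correctable terms are removed
print(f"translation ({Tx:.2f}, {Ty:.2f}) nm, scale ({Mx:.4f}, {My:.4f}) nm/mm, "
      f"rotation {R:.4f} nm/mm, max residual {np.max(np.abs(residuals)):.2f} nm")
```

The translation, scaling and rotation terms are the kind of systematic contributors a scanner can typically correct on subsequent wafers; what remains in the residuals is the harder, non-correctable part of the error.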
Metrologists must find overlay problems before a wafer moves to the next steps in the fab. If a problem is detected by the overlay tool after the lithography step, the wafer can be re-worked. But if a problem isn’t detected, the wafer moves on, and if the problem is found later, the wafer is typically scrapped. The worst-case scenario is that a faulty device makes it into production.
Overlay metrology is important for other reasons. Eventually, overlay data is combined with the critical dimension (CD) measurements on the device. The numbers are crunched, resulting in a key figure that represents edge placement error (EPE). EPE is the difference between the intended and the printed features of an IC layout.
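As a rough illustration of how overlay and CD numbers combine into EPE, the sketch below uses a common first-order approximation, treating the edge error as the overlay error plus half of the CD error. Real EPE budgets include additional terms (line-edge roughness, etch bias and so on) and are not this simple.

```python
# Simplified, first-order EPE illustration: an edge moves if the feature is
# misplaced (overlay) or mis-sized (half of the CD error lands on each edge).
# Real EPE budgets include more terms; this is only a sketch.
def edge_placement_error(overlay_nm: float, cd_error_nm: float) -> float:
    return abs(overlay_nm) + abs(cd_error_nm) / 2.0

# Example: 2nm of overlay error and 1.5nm of CD error on a feature
print(edge_placement_error(2.0, 1.5))  # -> 2.75 (nm, worst case for one edge)
```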
So, overlay metrology is critical, but it is becoming more challenging at each node. The measurement requirements are mind-boggling at 10nm/7nm and beyond. “A lot of logic companies are starting to look at risk production at 3.5nm on-product overlay. But the target would be to get to 2nm on-product overlay,” said Mark Wylie, product marketing director at KLA-Tencor, referring to the actual overlay on a device.
To put it in perspective, 2nm equates to ten atoms. “So we are really down to atomic-level control,” Wylie said.
In response, ASML and KLA-Tencor, the two main overlay metrology tool suppliers, are looking to solve the challenges, although they are taking slightly different approaches to the problem.
For years, KLA-Tencor sold overlay metrology tools based on image-based overlay (IBO) techniques. IBO uses built-in test patterns, which are located outside the chip for overlay measurements. “In image based, you see the layer that you’ve exposed and the layer underneath. And you look at it with an algorithm and a kernel to figure it out,” Wylie said. “The majority of layers will work with image-based overlay. For some layers, where you want to improve your total measurement uncertainty, then scatterometry-based solutions can be beneficial.”
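Conceptually, image-based overlay reduces to locating the previous-layer mark and the current-layer mark in the same image and reporting the shift between them. The sketch below, which assumes the two marks have already been segmented into masks, does this with simple intensity-weighted centroids; production IBO algorithms use far more robust kernels and sub-pixel estimation.

```python
# Minimal image-based-overlay sketch: compute the intensity-weighted centroid of
# the previous-layer mark and the current-layer mark, then report the shift.
# Assumes the two marks have already been segmented into separate masks; real
# IBO algorithms use far more sophisticated kernels and estimators.
import numpy as np

def centroid(image: np.ndarray, mask: np.ndarray) -> tuple[float, float]:
    ys, xs = np.nonzero(mask)
    weights = image[ys, xs].astype(float)
    return (np.average(xs, weights=weights), np.average(ys, weights=weights))

def image_based_overlay(image, prev_layer_mask, curr_layer_mask, nm_per_pixel):
    x0, y0 = centroid(image, prev_layer_mask)   # e.g. outer box (previous layer)
    x1, y1 = centroid(image, curr_layer_mask)   # e.g. inner box (current layer)
    return ((x1 - x0) * nm_per_pixel, (y1 - y0) * nm_per_pixel)
```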
In fact, at advanced nodes, the industry is moving from IBO to scatterometry, at least for some of the more complex layers. IBO is still used for other layers. Scatterometry is sometimes referred to as diffraction-based overlay (DBO). Used in the industry for years, DBO measures the changes in the intensity of light.
What’s the best solution between IBO and DBO? “A lot of it is an engineer’s preference. Some engineers are comfortable with scatterometry. Some are comfortable with image based,” Wylie said.
Regardless, KLA-Tencor recently rolled out a new overlay metrology tool based on scatterometry. The system, dubbed the ATL, is a standalone product that features a tunable laser technology.
Scatterometry enables precise on-product overlay measurements at high throughputs. However, the technology is more susceptible to process variations. And with scatterometry, the metrology system doesn’t look at the actual device. Instead, the measurements are taken on small objects called targets. Targets are pre-fabricated, diffraction-based structures. The target mimics the behavior of the device.
The targets are located outside the device in what’s called the scribe line, next to the actual die. To measure overlay, a film stack with gratings is created. A grating is patterned in the first layer, and then another grating is patterned on top of it in the next layer.
The tool shines a light through the stack, resulting in a diffraction pattern. “When there is an overlay error, it creates a distortion in your inspection feed,” Wylie said.
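A widely published way to turn those diffraction signals into an overlay number uses two target pads with small programmed offsets of +d and -d. If the asymmetry between the +1 and -1 diffraction orders is assumed to be linear in the total shift, the unknown sensitivity cancels and the overlay falls out of a simple ratio. The sketch below shows that textbook form of the calculation, not either vendor's proprietary implementation.

```python
# Textbook-style diffraction-based overlay sketch. Two target pads carry
# programmed offsets +d and -d. If the +1/-1 order asymmetry A is assumed to be
# linear in the total shift, A_plus = K*(OV + d) and A_minus = K*(OV - d),
# the unknown sensitivity K cancels and overlay follows from a ratio.
def dbo_overlay(a_plus: float, a_minus: float, d_nm: float) -> float:
    return d_nm * (a_plus + a_minus) / (a_plus - a_minus)

# Example with made-up asymmetry readings and a 20nm programmed offset
print(dbo_overlay(a_plus=0.115, a_minus=-0.085, d_nm=20.0))  # -> 3.0 nm
```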
All told, the overlay metrology tool detects a misalignment in the layers or what’s called an overlay error. A mishap with the scanner and mask can cause overlay errors, but they are not the only culprits. Films, processes and stress also contribute to errors. Errors can also crop up in the measurements themselves.
At advanced nodes, though, it becomes more challenging to detect misalignments and overlay errors. “Multi-patterning has really intensified. With that, it brings a lot of challenges with process integration schemes. Before, you used to see a lot of problems with lithography-related hotspots. Now, with all of the 3D patterning effects, you start to see more challenges in the process domains,” Wylie said.
To solve the problem, KLA-Tencor’s latest tool incorporates tunable laser technology with 1nm resolution and a real-time homing algorithm. With the tunable laser, the metrology tool sweeps through a range of wavelengths and finds the spectral region with the lowest inaccuracy. “That would be the region that you would set your wavelength at,” he said. “Then, you are not susceptible to process variations. So when you have overlay, you can measure it accurately.”
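In heavily simplified form, that setup step looks like a sweep: measure the same target at each candidate wavelength, score each measurement with an inaccuracy metric, and lock the recipe to the wavelength that scores best. In the sketch below, both the measurement and the scoring function are placeholders; KLA-Tencor's actual metric and homing algorithm are not public.

```python
# Heavily simplified recipe-setup sweep: measure a target at each candidate
# wavelength, score each measurement with an inaccuracy metric, and pick the
# wavelength with the lowest score. The measurement and scoring callables are
# placeholders, not KLA-Tencor's actual metric or homing algorithm.
from typing import Callable

def pick_measurement_wavelength(
    wavelengths_nm: list[float],
    measure: Callable[[float], dict],           # raw signals at one wavelength
    inaccuracy_metric: Callable[[dict], float],  # lower is better
) -> float:
    scores = {wl: inaccuracy_metric(measure(wl)) for wl in wavelengths_nm}
    return min(scores, key=scores.get)
```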
Then, with the homing function, the tool looks for a region that represents the optimal point to measure, which saves time and money. The ATL is part of KLA-Tencor’s patterning control strategy, dubbed the 5D Patterning Control Solution. Using the ATL and other tools, KLA-Tencor’s patterning control systems characterize and monitor the processes in and around the lithography module.
Other solutions
Meanwhile, ASML recently rolled out a new overlay metrology tool. The system, dubbed the YieldStar 375F, is part of what ASML calls its “Holistic Lithography” strategy.
This strategy is designed to improve the lithographic processes. As part of this multi-pronged strategy, ASML attempts to drive down overlay error on its scanner. The scanner itself incorporates several sensors, which collect data on and around the wafer.
Then, the data is fed into the metrology tool. And finally, the data is crunched in a computational unit. “We are not adding more data points to measure, but we are basically obtaining all the data that you have in and around the scanner from the wafer. There are a lot of sensors on (the scanner), such as for alignment, leveling and more,” said Henk Niesing, director of product management at ASML. “From this data, you can derive a lot of overlay and related data that in the end you can compute. We call that computational overlay. You can predict the overlay rather than trying to measure it.”
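Conceptually, computational overlay means building a model that maps data the scanner already collects (alignment, leveling and other sensor channels) onto the overlay a metrology tool would have measured, then using that model to predict overlay on wafers that were never measured. The sketch below stands in a plain least-squares fit for that model; ASML's actual approach is far more elaborate and is not public.

```python
# Conceptual "computational overlay" sketch: fit a linear model from scanner
# sensor channels (alignment, leveling, etc.) to measured overlay on a training
# set of wafers, then predict overlay for wafers that were not measured.
# A plain least-squares fit stands in for ASML's actual (non-public) models.
import numpy as np

def fit_overlay_model(sensor_data: np.ndarray, measured_overlay: np.ndarray) -> np.ndarray:
    """sensor_data: (n_wafers, n_channels); measured_overlay: (n_wafers,)."""
    X = np.column_stack([np.ones(len(sensor_data)), sensor_data])  # add intercept
    coeffs, *_ = np.linalg.lstsq(X, measured_overlay, rcond=None)
    return coeffs

def predict_overlay(coeffs: np.ndarray, sensor_data: np.ndarray) -> np.ndarray:
    X = np.column_stack([np.ones(len(sensor_data)), sensor_data])
    return X @ coeffs
```

In a scheme like this, direct measurements on a subset of wafers would still be needed to train and verify the model.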
Meanwhile, ASML’s YieldStar 375F is an integrated metrology unit using DBO. Integrated metrology is one of two types of configurations in the arena. In integrated metrology, the metrology unit is incorporated into another piece of equipment.
It’s different from a standalone configuration, in which the metrology system sits beside the process equipment as a separate tool. With integrated metrology, the drawback is that it may slow down the scanner. And if an integrated metrology unit goes down, the data must be re-routed to a standalone system.
On the other hand, integrated metrology saves floor space in the fab. “The biggest advantage to in-situ metrology is more sampling,” Coventor’s Fried said. “If you can measure inside the tool as the lot of wafers is being processed, you can test more wafers, sample more sites, and gather a significantly larger database of process data.”
As with other DBO-based systems, ASML’s tool uses targets to conduct measurements. What’s new is the addition of a continuous wavelength capability.
For today’s complex layers, the industry requires DBO. “You have to measure a combination of overlay in order to compile the final overlay of your device. That complexity has increased with the different mask levels and litho-etch-litho-etch scenarios. This has made overlay more complicated,” ASML’s Niesing said.
It’s also critical to develop good target designs. Otherwise, the tool will detect asymmetries or process effects that impact the measurements. “So you really want to measure overlay and not measure the asymmetries, which are process effects on those targets. They may be detected as overlay, but they are not,” Niesing said.
The next big thing is tunable lasers. ASML’s previous model supports 11 wavelengths. “Now, you can measure any wavelength that you have in the range provided by the source,” he said. “There is a lot of new interest in metrology to start combining wavelengths, where you drive down the inaccuracy further and improve the robustness.”
The system addresses other challenges. “We have basically allowed the customer to be able to measure overlay on multiple layers at the same time. Now, we can measure up to six layers all in one acquisition in the metrology tool,” he added.
After the overlay metrology step, the wafer moves to the etch module for etching. Then, at times, CD-SEMs are used to characterize the device overlay after etch. The idea is to compare the post-litho and post-etch measurements.
The problem? The CD-SEM is a destructive and sometimes time-consuming measurement technique.
So, ASML has added a feature in its overlay metrology tool, which can measure the device overlay in the post-etch process. The non-destructive technique is faster and more accurate than CD-SEMs, according to ASML. “We have introduced the capability on the YieldStar to also measure post-etch directly on the device overlay. This technique is more accurate than the SEM, is >10x faster and non-destructive, and therefore, suitable for high-volume manufacturing,” he added.
Meanwhile, in a recent paper, Applied Materials and GlobalFoundries described another technique that validates the measurements and confirms true overlay accuracy. This involves an in-line e-beam metrology technique using a CD-SEM in a non-destructive manner. The CD-SEM is used as reference metrology. The technique, called randomly located SEM overlay (RLSO), samples select regions across the field, according to the companies.
“The workhorse for high-volume manufacturing remains optical tools,” said Adam Ge, an applications engineer at Applied. “For the critical layers, where overlay budgets are tight, eBeam overlay is essential to guarantee the sensitivity and accuracy.”
Related Stories
Accuracy In Optical Overlay Metrology
The physics by which process variations determine accuracy and robustness of overlay metrology.
Device Overlay Method For High-Volume Manufacturing
Results of develop inspection/final inspection bias characterization, including implementation in high-volume manufacturing, and a look at future directions.
Application Of Overlay Modeling And Control With Zernike Polynomials In An HVM Environment
How to reduce co-linearity and improve overlay stability.
Why EUV Is So Difficult
One of the most complex technologies ever developed is getting closer to rollout. Here’s why it took so long, and why it still isn’t a sure thing.