Digital Twins Target IC Tool And Fab Efficiency

Virtual representations will improve performance and productivity across the entire design through manufacturing flow, but deployments will vary in effectiveness and timelines.

Digital twins have emerged as the hot “new” semiconductor manufacturing technology, enabling fabs to create a virtual representation of a physical system on which they can experiment and optimize what goes on inside the real fab.

While digital twin technology has been in use for some time in other industries, its use has been limited in semiconductor manufacturing. What’s changing is the breadth, depth, and accuracy of these virtual sandboxes, especially for leading-edge processes. Inside the fab, a digital twin can be used to improve scheduling and resource management, respond to production anomalies, and experiment with different options for optimizing performance and efficiency.

Many industry experts believe innovative digital technologies will be required for the global chip industry to reach its $1 trillion target in the 2030 time frame. Digital twins are an important element of that strategy, which explains the steep investments in that technology.

“Digital twins are basically virtual replicas of physical equipment, and they serve as a bridge between the physical and the digital realms,” said Patrick Pannese, vice president of Strategy and Business Development at PDF Solutions. “This enables manufacturers to simulate, predict and optimize equipment performance before it’s even built or deployed. So there’s a huge benefit to that.”

As with all leading-edge technologies, there are enormous challenges to overcome. Creating virtual digital models of complex semiconductor designs, process tools, and flows requires the ability to accurately map everything that needs to be included in a digital twin, and to be able to make adjustments at a higher level of abstraction. Given the complexity in a chip or fab process, this is a non-trivial effort.

Still, the upside of digital twins is significant enough to justify those investments. For a fab, benefits include better predictive maintenance, improved wafer and packaged device yield, and better scheduling and throughput. Digital twins also can help expedite the ramp-up for new fabs.

“Digital twins have to happen because of the limits of what can be done in terms of gaining efficiencies going forward,” said Sameer Kher, senior director of product development for systems and digital twins at Ansys.

“New device architectures and materials are coming faster and faster, and semiconductor manufacturers must ramp to mature yield even more quickly to make the economics work,” said Joseph Ervin, senior director of Semiverse Solutions at Lam Research, who presented at a recent digital twins workshop organized by SEMI. [1] “With virtual materials and equipment, experimentation can be drastically faster, less expensive, more accessible, and richer in data, while reducing the cost and waste of materials. Among our customers, digital twin infrastructure is typically applied first to problems that generate the best return on investment. To get the most benefit, digital twins should first be committed to problems that will incur the highest fabrication and testing costs in the search for a solution. It is important to assess the cost of silicon-based testing against the purchase of a digital twin.”

Fig. 1: Digital twins are purpose-driven replicas synchronized to real tools, like APC. Source: Lam Research

Why digital twins now?
Though the concept of digital twins has been around for roughly two decades, deploying them cost-effectively has only become feasible with the availability of inexpensive and nearly unlimited compute resources. Ironically, those compute elements are based on some of the same technology being optimized by digital twins. “It’s really only recently, in the last five years or so, that the key building blocks for digital twins, IoT, machine learning, and compute have allowed digital twins to become a reality,” Kher said.

So it may seem strange that, for all its technological sophistication, the semiconductor industry lags other industries — aerospace, automotive, 5G networks, and general manufacturing of goods — in adopting digital twins. There are multiple reasons for this, including technology node changeovers every few years, but the main culprit appears to be the sheer complexity of device manufacturing. Unlike typical assembly-line manufacturing, fab work-in-process (WIP) features re-entrant process flows, monitor wafers, split lots, and other dynamics that make accurate scheduling and forecasting an ongoing challenge, as the sketch below illustrates.
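
A toy simulation makes the point. The sketch below (hypothetical numbers throughout) models a re-entrant flow in which every lot must revisit a single litho tool once per layer, so late-layer WIP competes with early-layer WIP for the same capacity. Even this stripped-down case shows how far a naive “steps times step-time” forecast can be from realized cycle time.

```python
import heapq

N_LOTS, N_LAYERS, LITHO_TIME = 20, 4, 1.0   # lots, litho passes per lot, hours per pass

def mean_cycle_time_reentrant():
    """All layers of all lots share one litho tool, first-ready-first-served."""
    tool_free = 0.0
    completions = []
    # Each queue entry is (ready_time, lot, layer); lots re-enter after each layer.
    queue = [(0.0, lot, 0) for lot in range(N_LOTS)]
    heapq.heapify(queue)
    while queue:
        ready, lot, layer = heapq.heappop(queue)
        start = max(ready, tool_free)
        tool_free = start + LITHO_TIME
        if layer + 1 < N_LAYERS:
            heapq.heappush(queue, (tool_free, lot, layer + 1))  # re-entrant visit
        else:
            completions.append(tool_free)
    return sum(completions) / len(completions)

naive = N_LAYERS * LITHO_TIME  # forecast that ignores queueing and re-entrancy
print(f"naive forecast: {naive:.1f} h per lot; simulated mean: {mean_cycle_time_reentrant():.1f} h")
```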

“I expect digital twins to enhance manufacturing’s ability to generate even more accurate forecasts,” said Mike McIntyre, director of product management, software at Onto Innovation. “A properly representative operational digital twin will better respond to the intermittent and cascading impacts of WIP disruption in the manufacturing environment.”

Digital twins play into larger goals of improving on-time delivery of electronic products. “You have a digital twin of the fab or factory, but if you go one level higher on top of that you have ERP, and that system requests data in a certain way to create a digital twin. There is also the whole digital twin aspect of running the enterprise, running the factory, and staffing the factory, too,” said Ranjan Chatterjee, vice president of Smart Factory Solutions at PDF Solutions. “Here, you actually get real-time data that reflects how the enterprise is working, where your product is, and when it will ultimately be provided to your customer. For example, we are working with SAP on a digital twin of the enterprise connecting the shop floor to the top floor. For that to happen, a lot of this data that exists in silos has to be aggregated, transformed, and then presented in a uniform way.”

In addition to helping chipmakers and tool suppliers use a virtual twin to eke out greater operational and supply chain efficiency, digital twin architectures could help with a key industry challenge — data sharing. “If I go into someone’s manufacturing facility, and I look at a machine and feed it a wafer, I see all the settings and I see the recipe,” said Mark da Silva, senior director of SEMI’s Smart Manufacturing Initiative. “That is the data the manufacturers do not want to let out of that company. But if they had a digital twin of it, they could sufficiently encrypt that information so that it is not shared. Digital twin technology makes that possible, and it goes back to the architecture and framework.”

The need for a standardized framework was just one of the key takeaways from the digital twin workshop. A common framework will enhance interoperability of tools from different suppliers, which will help reduce digital twin development and deployment costs and help ensure security. Participants also highlighted the need to agree on a common digital twin definition and taxonomy. The CHIPS for America Act has allocated $200 million over five years to create a manufacturing institute that will develop digital twins for semiconductor manufacturing, packaging, and assembly, and validate them in a physical prototyping facility. The institute also will apply digital twins to workforce training.

Starting points
One of the questions that arises in these discussions is whether the industry is already using digital twins. This is probably because the goals of digital twins — reduced scrap, less rework, fewer process excursions, better predictive maintenance, higher yields — already are being addressed by existing forms of process simulation, visualization, and software under the umbrella of design for manufacturing, smart manufacturing, and Industry 4.0 solutions.

Supika Mashiro, director at TEL and co-chair of the IRDS Factory Integration, put this in perspective at the SEMI workshop. She explained that virtual metrology uses digital twins of metrology, run-to-run control uses digital twins of processes, and predictive maintenance uses digital twins of component degradation.
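
Run-to-run control is the most concrete of these examples. A widely used formulation is an EWMA (exponentially weighted moving average) controller, which keeps a small model of the process and blends each run’s metrology result into that model. The sketch below is a minimal version, assuming a linear process model y = a + b·u; the gain, target, and drift values are illustrative, not taken from any real tool.

```python
LAMBDA = 0.3     # EWMA weight on new evidence
B_EST = 2.0      # assumed process gain (output units per knob unit)
TARGET = 100.0   # desired process output (e.g., film thickness in nm)

a_est = 5.0      # running intercept estimate -- the controller's model state

def next_recipe() -> float:
    """Pick the knob setting the current model says will hit target."""
    return (TARGET - a_est) / B_EST

def update(u: float, y_measured: float) -> None:
    """Blend the latest run's evidence into the intercept estimate."""
    global a_est
    a_est = LAMBDA * (y_measured - B_EST * u) + (1 - LAMBDA) * a_est

# Example: a chamber whose true intercept drifts upward each run.
true_a = 5.0
for run in range(5):
    u = next_recipe()
    true_a += 1.0                 # unmodeled drift in the physical chamber
    y = true_a + 2.0 * u          # "measured" result from metrology
    update(u, y)
    print(f"run {run}: recipe={u:6.2f}  measured={y:6.1f}  intercept est={a_est:5.2f}")
```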

These first examples of digital twins help improve the performance and efficiency of tools. They draw on massive amounts of tool- and wafer-level data from wafer processing, sub-fab components, and assembly tools.

There also needs to be a strong dose of reality mixed in with digital twin development. “The starting point is what’s practical, and that begins with what’s the use case,” said Ravi Subramanian, general manager for the Systems Design Group at Synopsys. “Being able to do everything and create a digital twin for a system of systems is a holy grail. You can do one for a system of systems, but what’s your use case? The modeling fidelity needs to be such that, in a practical amount of time, you can answer the question about the use case.”

A digital twin is a dynamic virtual model of a physical system, typically in software, which designers, engineers and systems providers use to test what-if scenarios far less expensively than when performed directly on the system or tool. To be effective, it must be synchronized with the actual system in a timely fashion.
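
In code terms, those requirements are compact: model state, a synchronization path from live telemetry, and a guarantee that what-if experiments run on the model rather than on the tool. The sketch below illustrates the pattern; the class name, freshness window, and state fields are hypothetical, not any vendor’s API.

```python
import time

MAX_SKEW_S = 60.0  # assumed freshness budget: twin must sync within a minute

class ChamberTwin:
    """Bare-bones dynamic model kept in step with a physical chamber."""

    def __init__(self):
        self.state = {}        # e.g., temperature, RF power, pressure
        self.last_sync = 0.0

    def sync(self, sensor_snapshot: dict) -> None:
        """Reconcile twin state with a snapshot of live tool telemetry."""
        self.state.update(sensor_snapshot)
        self.last_sync = time.time()

    def is_fresh(self) -> bool:
        return (time.time() - self.last_sync) < MAX_SKEW_S

    def what_if(self, overrides: dict) -> dict:
        """Run a scenario on a copy of the state, never on the real tool."""
        assert self.is_fresh(), "twin has drifted out of sync with the tool"
        return {**self.state, **overrides}  # a real twin would run a model here
```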

AI and machine learning tools that already use sensors to predict tool changes will become even more powerful in a digital twin setting. “You can model how a process and equipment evolve over time. For example, in a PECVD chamber, the physics and chemistry of the process changes, so an AI model can react to what it’s seeing while also predicting what the chamber is going to look like for the next run, and tweak the input parameters,” said Jon Herlocker, CEO of Tignis.
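
As a hedged sketch of that idea (the signal, trend model, and target below are illustrative assumptions), a twin can extrapolate a chamber-health trend across runs and pre-compensate the next recipe:

```python
import numpy as np

# Hypothetical deposition rates (nm/s) observed over the last five runs.
rates = np.array([2.00, 1.98, 1.97, 1.95, 1.94])

# Fit a linear trend and predict the next run's rate before it happens.
runs = np.arange(len(rates))
slope, intercept = np.polyfit(runs, rates, 1)
predicted_rate = slope * len(rates) + intercept

# Stretch deposition time so thickness stays on target despite the drift.
TARGET_THICKNESS = 200.0  # nm, assumed target
dep_time = TARGET_THICKNESS / predicted_rate
print(f"predicted rate {predicted_rate:.3f} nm/s -> run recipe for {dep_time:.1f} s")
```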

The argument for digital twins is straightforward. “A good analogy is an aircraft flight simulator,” said SEMI’s da Silva. “Why do companies use flight simulators to train pilots when they can just put them in the cockpit of a plane? The same applies here. You want a very accurate process flow simulator before you run it in the fab, and it comes down to efficiency and the cost of building this process simulator. These are questions the industry has to answer collectively.”

Because all manufacturing scenarios cannot reasonably be addressed, prioritization is essential. “There are so many parts of the ecosystem, including leading-edge nodes and mature nodes, fabs and packaging, flex, and you can’t build process flows for all of them,” da Silva said. “You have to pick and choose. There’s also the national interest, with the U.S. wanting to bring manufacturing back. So which areas should we focus on? Those are all valid questions that I don’t think anyone has particularly good answers for yet.”

Connecting digital twins
In terms of overall structure, there is no single fab digital twin. Instead, there is a hierarchy of digital twins that ideally are connected and interact as needed. This is more easily said than done. For instance, a run-to-run controller on one process chamber might want to communicate with a similar controller on another chamber to achieve chamber-to-chamber matching. Or an engineer might wish to connect run-to-run control data to the MES-level scheduler (above the equipment level) to better match yield goals with dispatching. Or dispatching data might be communicated up to the ERP factory planner. Even higher levels of abstraction occur from fab to fab, or from fab to assembly operations.
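
A minimal sketch of that hierarchy follows: chamber-level twins publish a health summary, and a fab-level twin uses those summaries for dispatching. The interfaces are invented for illustration; real deployments sit on MES/ERP integration layers.

```python
class ChamberTwinSummary:
    """What a chamber-level run-to-run twin exposes upward."""

    def __init__(self, name: str, mean_cd: float, cd_sigma: float):
        self.name = name
        self.mean_cd = mean_cd      # nm, current critical-dimension centering
        self.cd_sigma = cd_sigma    # nm, run-to-run variability of this chamber

class FabSchedulerTwin:
    """Fab-level twin that routes tight-spec lots to the best-matched chamber."""

    def __init__(self, chambers):
        self.chambers = chambers

    def dispatch(self, lot_spec_sigma: float):
        eligible = [c for c in self.chambers if c.cd_sigma <= lot_spec_sigma]
        return min(eligible, key=lambda c: c.cd_sigma) if eligible else None

fab = FabSchedulerTwin([
    ChamberTwinSummary("etch-A", 25.0, 0.4),
    ChamberTwinSummary("etch-B", 25.1, 0.9),
])
best = fab.dispatch(lot_spec_sigma=0.5)
print(best.name if best else "hold lot for requalified chamber")  # -> etch-A
```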

Digital twins also can be predictive. For instance, they can make suggestions about when alternative control mechanisms should be considered. “I was talking to a customer and they were describing a runaway situation on APC systems where they keep doing incremental tweaks, but these tweaks keep going in the same direction. And at some point you can’t tweak it anymore,” said Michael Munsey, vice president of semiconductor industry at Siemens EDA. “A good set of digital twins and simulation can actually spot these situations that are heading toward runaway, recognize that you won’t be able to continue tweaking too much in one direction, and suggest other optimizations that need to happen to bring the process back within spec.”
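
The runaway pattern Munsey describes becomes detectable with a simple rule once the tweak history lives in a twin: if recent adjustments all move in one direction and the knob is near its limit, stop tweaking and escalate. The window, margin, and history below are illustrative assumptions.

```python
def runaway_alert(knob_history, knob_limit, window=5, margin=0.9):
    """Flag when R2R tweaks trend one way and approach the actuator limit."""
    recent = knob_history[-window:]
    if len(recent) < window:
        return False
    deltas = [b - a for a, b in zip(recent, recent[1:])]
    one_direction = all(d > 0 for d in deltas) or all(d < 0 for d in deltas)
    near_limit = abs(recent[-1]) > margin * knob_limit
    return one_direction and near_limit

# Bias corrections that keep creeping toward a hard limit of 3.0.
history = [1.0, 1.4, 1.9, 2.3, 2.8]
print(runaway_alert(history, knob_limit=3.0))  # -> True: suggest another fix
```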

Digital twins have the potential to make better use of AI/ML models. Unfortunately, there is a lag of days or weeks to validate predictions for the most powerful models, so there is a need to better manage this gap between prediction and confirmation (see Fig. 2). Additionally, planned tool downtime for preventive maintenance, chemical or filter changes, as well as unplanned downtime, may be better managed with a targeted digital twin.

Fig. 2: Validation of models is a critical step, but it must be delivered in a timely manner. Source: Onto Innovation
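
Managing that gap is largely bookkeeping: log every prediction, score it when the confirming metrology finally arrives, and retrain when the rolling error widens. The sketch below shows the shape of such a tracker; the threshold and data fields are illustrative.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Prediction:
    lot_id: str
    predicted: float

pending = {}                  # lot_id -> Prediction awaiting metrology
errors = deque(maxlen=50)     # rolling window of confirmed prediction errors

def record_prediction(lot_id: str, value: float) -> None:
    pending[lot_id] = Prediction(lot_id, value)

def confirm(lot_id: str, measured: float, retrain_threshold: float = 1.5) -> str:
    """Called days or weeks later, when metrology lands for the lot."""
    pred = pending.pop(lot_id)
    errors.append(abs(pred.predicted - measured))
    drifting = len(errors) >= 10 and sum(errors) / len(errors) > retrain_threshold
    return "retrain model" if drifting else "model OK"
```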

Tool-level developments
With digital twins, it’s important for the physical tool to be synchronized with its virtual counterpart. A digital twin of a fab can be built only when digital twins of all its process tools — scanners, etchers, deposition tools (PVD, CVD, epi), CMP, ion implantation, wafer cleaning, stripping, inspection, etc. — exist and can be synchronized with the fab-level digital twin.

“A semiconductor manufacturer like Intel has a flow already,” said da Silva. “I could mimic that flow without the dependency on the equipment level. But that’s very tough to do. If you’re doing a deposition step, the equipment used to make that deposition step needs to be modeled at a certain level, to be able to have faith in the output of the digital twin of that step. There’s a need for a full process flow digital twin. Then they have to make an equipment-level digital twin to support it.”

It appears that leading-edge process tools in fabs will be the first to implement digital twins. “The better manufacturing facilities will link operational forecasting with yield and performance forecasts of materials being produced, enabling a ‘true’ unit completion/unit delivery output for the factory,” said Onto’s McIntyre.

But not all equipment manufacturers have the breadth to develop, for instance, digital twins of their systems. “A lot of tier two and tier three OEMs don’t have the resources to actually bring in data scientists. Getting the benefits of digital twins is a matter of combining industrial hardened AI/ML algorithms, having the data scientists, and having people with 30 years of experience on the semiconductor equipment side to identify, for instance, which PID [proportional integral derivative] control loop I should be looking at for a specific process outcome,” said PDF’s Pannese.

Such levels of control extend to EDA tools that interface with numerous fab processes. “[A digital twin] uses real-world data, with simulation combined with data analysis, to enhance understanding and validation of product behavior over its lifecycle,” said Sassine Ghazi, president and CEO of Synopsys, during a presentation at the Synopsys User Group (SNUG). “Electronics digital twins already play an important role in modeling our systems, accelerating development cycles, and continue to deliver value after the products are deployed in the field.”

Jensen Huang, president and CEO of NVIDIA, echoed that perspective at SNUG. “Where the system starts and ends is just completely amorphous now,” said Huang. “We need to build the entire chip, which is this entire system, in silicon, in digital twins. And when we say, ‘Hit enter,’ we need to know that it’s perfect. It’s already lived inside the simulator, and it’s been living in that world for a couple years. And so when I finally say launch, I know every single bill of material, how everything’s going to get put together, and all the software has already been brought up.”

Digital twin data
Making all of that work as expected, from design through manufacturing, requires a mix of flows. Digital twins need to account for all of them, from design tools to manufacturing equipment, as well as for the data that connects them.

Some of this is already in the works. “Applied Materials and Lam are among the few big companies that are already doing that,” said PDF’s Pannese. “The equipment manufacturer develops the digital twin as a value add to the existing equipment that they sell, because in order to connect, for instance, a deposition tool to an etch tool to a litho tool, you have to have someone handle the interoperability of data from one piece of equipment to the next.”

Data selection, cleaning, and processing come into play with digital twins, as well. “There are likely a dozen basic needs for deploying a digital twin on a process tool,” said Onto’s McIntyre. “Creating a digital twin system requires a complete and accurate data set to start. Creating this starter data set will likely require cleansing or otherwise massaging actual data so that it will be usable in constructing a digital twin process or operational model. Manually creating an initial data set is then followed up with automation routines that can assemble the requisite data for a digital twin deployment. Part of the automation routines will have to include various methods to overcome dirty or missing data, which is expected in real-world systems. Lastly, prior to any deployment, the user must devise a procedure to automatically verify the digital twin is maintaining alignment and accuracy representing the actual process over time.”
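
A hedged sketch of that hygiene work follows, using pandas. The column names, physical limits, and tolerance are hypothetical; the point is the shape of the routine McIntyre describes: screen the starter set, patch gaps, and keep checking that the twin still tracks the real process.

```python
import pandas as pd

def cleanse(df: pd.DataFrame) -> pd.DataFrame:
    """Prepare a starter data set for twin construction."""
    df = df.dropna(subset=["chamber_temp_c"])                 # required signal
    df = df[df["chamber_temp_c"].between(20, 700)]            # physically plausible range
    df["gas_flow_sccm"] = df["gas_flow_sccm"].interpolate()   # patch small gaps
    return df

def alignment_ok(twin_pred: pd.Series, actual: pd.Series, tol: float = 0.05) -> bool:
    """Ongoing verification that the twin still matches the physical process."""
    rel_err = ((twin_pred - actual).abs() / actual.abs()).mean()
    return rel_err < tol
```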

There are also data compatibility issues whenever companies share data across enterprises to create digital twins. “There are lots of issues with data formats and IP. How much of the data is available? Who owns the data? How do you manage your data with other people’s data, whether it’s your supply chain, ecosystem partner, or your customer?” said PDF’s Chatterjee. “So all those things have to get resolved. And it’s not just a question of standards. Standards work well when the problem is well defined.” The industry apparently is not at that level yet.

It’s also important to keep in mind the dynamic nature of digital twins. “In the design world, you’re never done with functional verification. You are always doing functional verification,” said Siemens EDA’s Munsey. “I could see a similar situation with digital twins, where you’re always running your digital twin and feeding real-life fab data back into the digital model, so you can constantly optimize your operations based on that real-time data. You keep building a better and better model based on that feedback.”

Fixing what isn’t broken?
One downside to implementing digital twins is that excellent solutions already exist for many applications, where a replacement may be neither required nor optimal. “It will be some time, maybe even years, before digital twin systems are fully integrated into APC [advanced process control] solutions. APC solutions today do a very good job at controlling primary and even secondary factors needed to ensure process centering and run-to-run control,” said Onto’s McIntyre.

But he added there is potential for improvement. “Digital twins have a greater ability to deal with and visualize third- and fourth-order effects on processing. The challenge is in verifying and quantifying any improvement with digital twin process control systems that will sufficiently justify the need to change from what is already a successful and proven control methodology.”

Conclusion
Digital twin solutions appear to be inevitable, but implementation in semiconductor manufacturing may be limited by the complexity of solutions, resistance to change, and the fact that digital twin technologies are still evolving. Nevertheless, companies are open to collaborating and funding digital twin developments, so time will tell how speedy the actual implementation will be.

—Ed Sperling contributed to this report.

Reference
[1] M. da Silva, “Integrating Digital Twins In Semiconductor Operations,” Semiconductor Engineering, Feb. 22, 2024. https://semiengineering.com/integrating-digital-twins-in-semiconductor-operations/

Related Reading
Fabs Begin Ramping Up Machine Learning
New models can debug processes and boost yield, but there are lots of caveats.
AI/ML Challenges In Test And Metrology
New tools are changing the game, but it will take time and collaboration for them to achieve their full potential.
Shifting Left Using Model-Based Engineering
MBSE becomes useful for identifying potential problems earlier in the design flow, but it’s not perfect.


