Big Data In The Fab

The future of controlling factory variation depends on managing data and scheduling.


A modern fab is a very complicated place, requiring a huge amount of information to process wafers correctly. Even more data is created to characterize the equipment and the wafers themselves. The idea is that with complete knowledge of the fab, everything should be predictable, including yield, and it becomes possible to run the fab in an optimal fashion.

The challenges with big data all revolve around connectivity, standards, and the inevitable issues with security. An interview with Snowden might be interesting here, but he seems to be rather inaccessible. It is worth noting that big data also brings big complexity, big costs, and big opportunities.

Snowden jokes aside, secure exchange of data is critical if equipment suppliers and users are going to reach a new level of understanding of variation. Trey Roper, director of sales and marketing for ILS Technology, described in an interview how ILS' "secureWISE" product enables this exchange. secureWISE provides a secure, private global network to which both an equipment supplier's VPN and a fab's VPN connect. Once pre-authorized users are authenticated onto the network, only pre-approved data can be accessed between pre-approved locations within the VPNs. This means that a supplier can see only specific data from its own equipment in a fab and can execute only pre-approved operations.
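The essence of that model is a deny-by-default whitelist: a request succeeds only if the authenticated user, both endpoints, and the operation were all approved in advance. The sketch below illustrates that kind of policy check in generic terms; the names and data structures are hypothetical, not secureWISE's actual interfaces.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AccessGrant:
    """One pre-approved combination of user, endpoints, and operation."""
    user: str
    supplier_site: str   # endpoint on the equipment supplier's VPN
    fab_tool: str        # specific tool inside the fab's VPN
    operation: str       # e.g. "read_trace_data", "run_diagnostic"

# Whitelist negotiated in advance between supplier and fab (hypothetical entries).
GRANTS = {
    AccessGrant("supplier_engineer_01", "supplier.hq", "fab1.etch.chamber3", "read_trace_data"),
    AccessGrant("supplier_engineer_01", "supplier.hq", "fab1.etch.chamber3", "run_diagnostic"),
}

def is_allowed(user: str, supplier_site: str, fab_tool: str, operation: str,
               authenticated_users: set[str]) -> bool:
    """Deny by default: the request passes only if the user is authenticated
    onto the network AND the exact combination was pre-approved."""
    if user not in authenticated_users:
        return False
    return AccessGrant(user, supplier_site, fab_tool, operation) in GRANTS

# The supplier can read trace data from its own chamber...
assert is_allowed("supplier_engineer_01", "supplier.hq", "fab1.etch.chamber3",
                  "read_trace_data", {"supplier_engineer_01"})
# ...but cannot touch a tool, or run an operation, that was never approved.
assert not is_allowed("supplier_engineer_01", "supplier.hq", "fab1.litho.track1",
                      "read_trace_data", {"supplier_engineer_01"})
```

Deny-by-default is what lets both sides get comfortable: adding a new command or data stream means adding a grant, never loosening a general rule.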

This product has evolved from a system originally conceived at IBM’s Advanced Manufacturing Systems group, and now has 95% of all 300mm fabs as customers. Roper sees “continuing demand to add more pre-approved commands and data as all parties see the benefits and get more comfortable that their secrets and IP are protected.”

As understanding improves, the task becomes predictive scheduling. Scheduling is getting more and more challenging as lots get smaller and fabs come to look more like foundries than single-product memory fabs. Equipment availability and unscheduled maintenance are very disruptive to schedules, hence the emphasis on predictive maintenance. To make matters more challenging, new processes and devices are being introduced that have to be learned and managed. Alan Levine, director of marketing at Wright Williams & Kelly, sees these demands reflected in his company's factory simulation software, which he describes as quantifying "factory physics." Discrete event simulation is then used to measure how quickly wafers will flow through the factory. "Customers use the software to identify bottlenecks, and test different dispatch rules for a specific goal," he said. "For instance, the goal might be overall cycle time, or the shortest possible 'hot lot' cycle time to meet a specific customer commitment, or the narrowest possible distribution of cycle time to give all customers predictable delivery."
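To make the idea concrete, the sketch below is a deliberately tiny discrete event simulation (a toy model, not WW&K's software) of a single tool with randomly arriving lots, comparing a plain first-in-first-out rule against one that lets "hot lots" jump the queue. All parameters are invented for illustration.

```python
import heapq
import random

def simulate(dispatch_key, n_lots=200, arrival_gap=1.0, process_time=0.9, seed=1):
    """Toy single-tool fab: lots arrive at random intervals, one tool processes
    them one at a time, and dispatch_key decides which waiting lot runs next.
    Returns per-lot cycle times (finish time minus arrival time)."""
    rng = random.Random(seed)
    arrivals, t = [], 0.0
    for i in range(n_lots):
        t += rng.expovariate(1.0 / arrival_gap)
        arrivals.append((t, i, rng.random() < 0.1))   # ~10% of lots are "hot"
    queue, cycle_times, clock, next_arrival = [], {}, 0.0, 0
    while len(cycle_times) < n_lots:
        # Admit every lot that has arrived by the current clock.
        while next_arrival < n_lots and arrivals[next_arrival][0] <= clock:
            arr, lot, hot = arrivals[next_arrival]
            heapq.heappush(queue, (dispatch_key(arr, hot), lot, arr, hot))
            next_arrival += 1
        if not queue:                                  # tool idle: jump ahead
            clock = arrivals[next_arrival][0]
            continue
        _, lot, arr, hot = heapq.heappop(queue)        # dispatch decision
        clock += rng.expovariate(1.0 / process_time)   # run the lot
        cycle_times[lot] = (clock - arr, hot)
    return cycle_times

fifo    = simulate(lambda arr, hot: arr)               # first-in, first-out
hot_1st = simulate(lambda arr, hot: (not hot, arr))    # hot lots jump the queue

for name, result in [("FIFO", fifo), ("Hot-lot priority", hot_1st)]:
    alls = [ct for ct, hot in result.values()]
    hots = [ct for ct, hot in result.values() if hot]
    print(f"{name:18s} mean cycle time {sum(alls)/len(alls):5.2f}, "
          f"hot-lot mean {sum(hots)/len(hots):5.2f}")
```

Real factory models track hundreds of tools, reentrant flows, setups, and downtime, but the comparison at the end shows the trade-off behind the quoted goals: prioritizing hot lots shortens their cycle time at some cost to everyone else's.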

WW&K develops as complete a picture of factory physics as possible, and then runs enough "wafers" to cover the entire life cycle of lots. Customers use the simulator to test different dispatching rules for running the fab. His assessment is that "well-designed rules can do a very good job of running a fab." Customers say they want to know how to recover from complicated machine-down scenarios, how to deal with a high product mix, and, even more challenging, how to run development lots, with their more manual interactions, alongside production. Levine sees the immediate challenge as helping the VP of manufacturing, who is greeted by a new problem every day, answer the question, "Now what?" It is not yet real-time predictive scheduling, but it is getting close.

In the future, WW&K sees the entire supply chain as the target. For instance, clients want to know when a lot must start in the wafer fab based on the expected delivery of a consumer product to stores. "Being predictive across multiple factories/multiple companies is an area ripe for opportunity."
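At its simplest, that is a back-scheduling calculation: subtract each stage's expected cycle time, plus a safety buffer, from the required in-store date to get the latest fab start. A minimal sketch, with all stage durations invented for illustration:

```python
from datetime import date, timedelta

# Illustrative cycle times, in days, for each stage of the supply chain.
STAGE_DAYS = {
    "wafer fab":        55,
    "sort/test":         5,
    "assembly/package": 10,
    "final test":        4,
    "distribution":      7,
}

def latest_fab_start(delivery_date: date, buffer_days: int = 5) -> date:
    """Work backward from the required in-store date to the latest date a lot
    can start in the wafer fab, including a safety buffer."""
    total = sum(STAGE_DAYS.values()) + buffer_days
    return delivery_date - timedelta(days=total)

print(latest_fab_start(date(2015, 11, 27)))   # e.g. a holiday-season launch
```

Being predictive across companies is largely a matter of keeping those stage estimates honest, current, and shared.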

Applied Materials is trying a different strategy by linking its dispatch and simulation software, as discussed in a recent article in the company magazine "Nanochip Fab Solutions." Madhav Kidambi, global product manager, described how the company needed to respond to the problem that its simulations did not run with the same dispatch rules as the actual dispatch software. The answer was a link so that rules could be transferred between its Real Time Dispatch (RTD) software and its APF fusion simulation.

The additional advantage is that after the rules are optimized in the simulation, they can be updated through the change control process in the fab. He emphasized that "the RTD software needs to run before every dispatch decision because the state of the fab is continuously changing." The article gave several examples of the impact of improved dispatch rules on fab output.
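The architectural point is that one rule definition feeds both the what-if simulation and the live dispatcher, so the two can never drift apart. The sketch below illustrates that idea in generic terms; it is not Applied's RTD or APF API, and the "critical ratio" scoring rule is just a common textbook example.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Lot:
    lot_id: str
    priority: int            # lower number = more urgent (e.g. hot lots)
    queue_time_hours: float  # how long the lot has already waited
    due_in_hours: float      # time remaining until its commitment

# A dispatch rule is just a scoring function over waiting lots; the lot with
# the lowest score runs next. Defining it once means the simulator and the
# live dispatcher use identical logic, which is the point of linking them.
DispatchRule = Callable[[Lot], float]

def critical_ratio(lot: Lot) -> float:
    """Critical-ratio-style rule: urgency grows as the due date approaches
    relative to time already spent waiting, weighted by lot priority."""
    return lot.due_in_hours / max(lot.queue_time_hours, 0.1) + lot.priority * 10

def pick_next(queue: List[Lot], rule: DispatchRule) -> Lot:
    return min(queue, key=rule)

queue = [
    Lot("A101", priority=1, queue_time_hours=6.0, due_in_hours=12.0),  # hot lot
    Lot("B202", priority=3, queue_time_hours=9.0, due_in_hours=48.0),
]

# The same critical_ratio function would be handed to the discrete event
# simulator for what-if runs and to the shop-floor dispatcher for real
# decisions, re-evaluated before every dispatch as the fab state changes.
print(pick_next(queue, critical_ratio).lot_id)   # -> A101
```

Because the rule is just a function of the current queue, it is cheap to re-evaluate every time a tool frees up, which is exactly why it can keep pace with a continuously changing fab.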

Data solutions in a modern fab are being integrated across several different interfaces: between supplier and fab, between fab and packaging facilities, and between real-time operation and offline analysis. It appears that the industry is indeed moving toward a big data environment, working out incremental solutions along the way.


