Finding Frameworks For End-To-End Analytics

The chip industry will share data once it settles on formats, labeling, and who owns it.


End-to-end analytics can improve yield and ROI on tool purchases, but reaping those benefits will require common data formats, die traceability, an appropriate level of data granularity — and a determination of who owns what data.

New standards, guidelines, and consortium efforts are being developed to remove these barriers to data sharing for analytics purposes. But the amount of work required to make this happen is significant, and it will take time to establish the necessary level of trust across groups that historically have had minimal or no interactions.

“Data producers are often concerned that data can somehow be leveraged against them by the downstream users or by their end customers — or that critical IP will somehow leak to a competitor,” said Jay Rathert, senior director of strategic collaborations at KLA. “To overcome these concerns requires both a level of security and appropriate data granularity that respects everyone’s IP and business concerns. Furthermore, when data flows between domains, the benefits must flow in both directions in a balanced way, so that each player is serving both their own interests and the broader goal of systemic continuous improvement.”

This is a non-trivial effort, but the benefits can be significant. “We believe that there are substantial possible advances in efficiency, yield learning rate, and device reliability enabled by bridging data across traditional domains and applying modern machine learning and advanced analytics,” Rathert said.

One of the challenges is that data generated at any step in IC design and manufacturing is intended for use by the engineering team associated with that step. But with the increased complexity of semiconductor processes, and the growing co-dependence between design and manufacturing, there is a mounting need to combine data from different process steps to make better decisions.

Eli Roth, smart manufacturing product manager at Teradyne, noted that connecting fab and test data can improve yield, lower test cost, speed time to market, and extend the ROI of test equipment. “Thinking about how the wafer fab engineers are using data, there is considerable interest in how to feed-forward results of process steps to improve subsequent steps,” he said. “This can build on the strict repeatability controls in place to improve yield and throughput.”

This is easier said than done, however. Data sharing often is limited by access issues, as well as incompatibilities in data formats from different areas and different standards in reporting. To make matters worse, there is a patchwork quilt of EDA tools, manufacturing equipment, formats that may not support new data, and disaggregation of manufacturing processes across multiple suppliers.

The good news is that recent efforts to develop and adopt new standards, or extend existing ones, are beginning to address these challenges. Many companies are discussing the impending need to share data as heterogeneous integration of chiplets from different suppliers becomes mainstream.

“There are many companies integrating silicon designed and manufactured elsewhere,” said Dave Armstrong, principal test strategist at Advantest America. “I wouldn’t call these ‘partnerships.’ They’re more of a supplier/purchaser relationship. The industry is learning how to better work together in this fashion. The one piece to this puzzle that still needs work is around data sharing. This remains a challenge.”

Along with the need for improved or new standards, there is a recognition that standards alone cannot solve all the issues. Interoperability between data sources requires common naming conventions.

Necessities in data sharing
The ability to share data generated at one point in the value or supply chain with another point is essential for end-to-end analytics. If the effort required to share data is too high, it either isn’t done or it becomes an onerous and costly task. Effective data sharing requires several key elements:

  • Communication from equipment and design tools to the outside world;
  • Connected data, which links an IC device’s design, manufacturing, and system performance data;
  • Clear data ownership, including who generated the data, who has access to it, and who actually owns it.

Fig. 1: Key elements to sharing data. Source: A. Meixner/Semiconductor Engineering

A clear understanding of the different scenarios engineers face when combining data helps point toward areas where standards, interoperability, or business relationship guidelines can lower the barriers that impede progress toward increasing yield, improving quality, and lowering cost and risk.

Communication
To support data sharing, a communication format needs to provide access to the data of interest, and it must support interoperability with other tools and networks in the supply chain. Multiple industry experts say existing standards cannot keep up with newly available data, or with the needs of smart manufacturing and silicon lifecycle management. This is a byproduct of established standards having been designed to serve specific engineering requirements. With more data to manage, and the need to use that data for more advanced and complex decisions, current standards actually limit progress.

“At SEMI we drive traceability and automation standards such as equipment data communication to the factory,” said Boyd Finlay, process engineer at GlobalFoundries. “In some cases, we have not yet enhanced existing standards enough regarding equipment trace data automation and ancillary sub-fab trace data comms to main tools. If we had these items, we could engineer out some of our lagging problems. For example, today we lack the appropriate trace data from many critical tool components.”

For decades, test program engineers have relied on the STDF file format, which is inadequate for today’s use cases. STDF files cannot dynamically capture adaptive test limits, and they cannot assist in real-time decisions at the ATE based on current data and analytically derived models. In fact, most data analytics companies run a software agent on the ATE to extract data for decisions and model building. With ATE software updates, the agent often breaks, requiring the ATE vendor to fix each custom agent on every test platform.
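To see concretely what a static format misses, consider adaptive test limits, where pass/fail bounds are recomputed from the most recent parts rather than fixed at program release. The Python sketch below illustrates the idea; the window size and six-sigma guard bands are illustrative assumptions, not values from STDF or any other standard.

```python
from collections import deque

class AdaptiveLimit:
    """Dynamic test limit recomputed from a rolling window of recent results.

    STDF records one static limit per test, so limits like these, which
    shift lot to lot, cannot be captured in the file itself. The window
    size and +/-6-sigma bands here are illustrative assumptions.
    """
    def __init__(self, window=500, k_sigma=6.0):
        self.readings = deque(maxlen=window)
        self.k = k_sigma

    def update(self, value):
        self.readings.append(value)

    def limits(self):
        n = len(self.readings)
        mean = sum(self.readings) / n
        sigma = (sum((x - mean) ** 2 for x in self.readings) / n) ** 0.5
        return mean - self.k * sigma, mean + self.k * sigma

# Usage: screen each new part against limits derived from its predecessors.
lim = AdaptiveLimit()
for reading in [1.02, 0.98, 1.01, 0.99, 1.00, 1.45]:
    if len(lim.readings) >= 5:
        lo, hi = lim.limits()
        print(reading, "PASS" if lo <= reading <= hi else "OUTLIER")
    lim.update(reading)
```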

Two emerging standards, TEMS and RITdb, address these limitations and enable new use cases.

However, standards are not required to address all communication needs. In some situations, providing an API suffices, because it enables communication between source and destination. This holds true for EDA tools as well, which is important because chipmakers use EDA tools from more than one vendor, along with their own in-house tools.

And with the huge amount of data available in manufacturing settings, an API may be the best approach for moving sensitive data from its point of origin to a centralized repository, whether on-premises or in the cloud.
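To make that concrete, the sketch below shows what such an interface might look like: a small Python client that pushes measurement records from a tool to a central repository over HTTPS with token authentication. The endpoint, payload fields, and token scheme are all hypothetical, standing in for whatever contract a tool owner and data consumer negotiate.

```python
import json
import urllib.request

# Hypothetical endpoint and token -- placeholders for a negotiated
# contract, not a real service or any published standard.
REPO_URL = "https://data-hub.example.com/api/v1/measurements"
API_TOKEN = "replace-with-issued-token"

def push_measurement(tool_id, lot_id, wafer_id, readings):
    """Send one batch of tool readings to the central repository.

    The payload schema (tool_id/lot_id/wafer_id/readings) is
    illustrative; a real agreement would also pin down units,
    timestamps, and schema versioning.
    """
    payload = {
        "tool_id": tool_id,
        "lot_id": lot_id,
        "wafer_id": wafer_id,
        "readings": readings,
    }
    req = urllib.request.Request(
        REPO_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_TOKEN}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status  # 2xx means the repository accepted the batch

# Example call (requires a live endpoint to succeed):
# push_measurement("ETCH-07", "LOT1234", "W05", {"chamber_temp_c": 64.2})
```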

Connecting data sources
For any data analytics solution or application, the lack of product traceability, consistent data naming, and agreed-upon data granularity are common barriers to connecting disparate data.

“End-to-end die traceability remains a challenge,” said KLA’s Rathert. “Despite advances such as ECID (electronic chip ID) and barcodes for higher-end chips, some older or smaller die identities can become uncertain after singulation. This makes it difficult to pair specific historical manufacturing data to the die’s performance at test.”

Multiple ID technologies are available to provide product traceability for smaller die and for semiconductor technologies that cannot support in-circuit identifiers. With in-field applications that enable new functionality and extract performance data comes a heightened interest in securing an IC’s traceable identity. Regardless of the technology, the industry needs standards and best practices that spell out traceability requirements.
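An ECID is typically a set of values, such as lot, wafer, and die coordinates, fused into the chip and packed into a fixed bit layout. The layout below is invented purely for illustration; real ECID formats are vendor-specific, which is exactly why shared conventions matter.

```python
# Hypothetical 64-bit ECID layout (invented for illustration):
#   bits 63-40: lot number   (24 bits)
#   bits 39-32: wafer number ( 8 bits)
#   bits 31-16: die X coord  (16 bits)
#   bits 15-0 : die Y coord  (16 bits)
def decode_ecid(ecid: int) -> dict:
    """Unpack a die identity from a packed ECID word.

    Without an agreed layout, a downstream consumer cannot even do
    this much -- each vendor's bit fields need their own decoder.
    """
    return {
        "lot": (ecid >> 40) & 0xFFFFFF,
        "wafer": (ecid >> 32) & 0xFF,
        "die_x": (ecid >> 16) & 0xFFFF,
        "die_y": ecid & 0xFFFF,
    }

# Example: lot 0x01A2B3, wafer 17, die at (42, 7)
ecid = (0x01A2B3 << 40) | (17 << 32) | (42 << 16) | 7
print(decode_ecid(ecid))  # {'lot': 107187, 'wafer': 17, 'die_x': 42, 'die_y': 7}
```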

Data granularity comes down to the hierarchical nature of data. With complex wafer processing equipment, for example, it is no longer sufficient to track only the tool in a wafer or die’s equipment genealogy; knowing which chamber a lot was processed in is equally important. In addition, parties must agree on a consistent granularity for defining an object. A ‘product’ name, for instance, can refer to a mask set, a test program, or a process step, among other things. That ambiguity can be especially troublesome when moving a product from the wafer foundry to an OSAT.
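A minimal way to picture that hierarchy is a genealogy record that nests chamber-level detail inside each process step, as in the sketch below. The field names are assumptions chosen for illustration, not drawn from any SEMI standard.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessStep:
    step_name: str   # e.g., "metal1_etch"
    tool_id: str     # which piece of equipment
    chamber_id: str  # chamber-level granularity, not just the tool
    recipe: str

@dataclass
class DieGenealogy:
    lot_id: str
    wafer_id: str
    die_x: int
    die_y: int
    steps: list = field(default_factory=list)  # ordered equipment history

# Tracking only tool_id would hide the fact that two die from the same
# lot saw different chambers -- often the variable that explains a
# yield difference.
die = DieGenealogy("LOT1234", "W05", 42, 7)
die.steps.append(ProcessStep("metal1_etch", "ETCH-07", "CH-B", "M1_ETCH_R2"))
```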

“The biggest barrier is data metatags,” said Melvin Lee, applications engineering manager at Onto Innovation. “For example, metatags in the front-end may differ from metatags in the back-end. Front-end fabs use lot/wafer context, and die context is really dependent on how stringent the fab wants to be to ensure die granularity and coordination between systems. Back-end fabs (or OSATs), due to being a low-cost manufacturing process, do not strictly implement such data context standardization.”
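One pragmatic response, until naming is standardized, is a translation layer that maps each site’s metatags onto a shared vocabulary before data is joined. The sketch below shows the idea; the tag names on both sides are invented for illustration, since real tags vary site to site.

```python
# Hypothetical metatag vocabularies for a front-end fab and an OSAT.
OSAT_TO_COMMON = {
    "LOTNO": "lot_id",
    "WFR": "wafer_id",
    "PKG_ID": "unit_id",
}
FAB_TO_COMMON = {
    "LotId": "lot_id",
    "WaferId": "wafer_id",
    "DieCoord": "die_xy",
}

def normalize(record: dict, mapping: dict) -> dict:
    """Rename a record's metatags into the shared vocabulary.

    Tags without a mapping are kept under a 'site_' prefix so no
    information is silently dropped during the join.
    """
    out = {}
    for key, value in record.items():
        out[mapping.get(key, f"site_{key}")] = value
    return out

fab_row = normalize({"LotId": "LOT1234", "WaferId": "W05"}, FAB_TO_COMMON)
osat_row = normalize({"LOTNO": "LOT1234", "WFR": "W05"}, OSAT_TO_COMMON)
assert fab_row["lot_id"] == osat_row["lot_id"]  # now joinable on lot_id
```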

Data ownership
Data ownership issues are complicated by security needs and the nature of business relationships.

“Who owns data is a very fluid and situational-dependent question,” said Mike McIntyre, director of software product management at Onto Innovation. “It can have multiple answers, depending on perspective. For example, a foundry may own defectivity data that falls on a wafer, but defect images may be owned by the device designer, and the tool’s FDC signals could be part of the tool’s OEM proprietary data set. At this point there is not a single owner or guidelines for data ownership in place across our industry.”

Data ownership issues take an interesting slant in EDA.

“You obviously own your design data,” said Michel Munsey, senior director of semiconductor solutions at Siemens EDA. “Now, when you start analyzing the design, you use certain tools to analyze the data. The EDA vendor states the data we’re presenting to you is yours and it is proprietary, and therefore can’t be shared with other tools. In other words, they purposely keep other companies’ tools from being able to analyze that data by saying the report files, the log files, the databases themselves are proprietary. Then you have a similar ownership issue when you hand off to different parts of the supply chain. Who owns the process information? Who owns the yield information?”

Reverberating throughout the supply chain is a recognition that data is valuable. It’s even more valuable when combined with other data, but there is a continued reluctance to share.

“The data ownership is indeed an important issue for fabless companies, but it needs to be looked at in the context of whether we are talking about equipment data or device data,” said Sonny Banwari, vice president of ACS business development and operations at Advantest America. “For some of the major OSATs, they have a combination of testers owned by OSATs, while there are other testers consigned by end customer. The availability of real-time device data to fabless customers may expose an OSAT’s equipment health and operational practices to supply-chain scrutiny, which at first glance feels uncomfortable. However, the reality we are seeing is the OSATs that are embracing transparency are the ones leading in terms of enabling end-to-end analytics to customers. In short, the leaders have discovered the benefits far outweigh the alternatives.”

Standards, APIs, guidelines, and consortiums
The issue of data ownership needs to be addressed. As Tom Katsioulas, chair of the GSA Trusted IoT Ecosystem for Security, noted, “At the end of the day, we’re living in a disaggregated value chain that needs to be reaggregated in a digital data thread that will enable analytics.”

Lowering the technical barriers requires extending existing standards and developing new ones. But standards can take two to seven years to develop, which slows progress. And with legacy systems in place, waiting for everyone to upgrade before a new format can be used holds back the whole value/supply chain.

For some situations, an API can suffice, either temporarily or for the long term. Still, the real need is interoperability between data systems to support end-to-end analytics. That requires striking a balance between solutions and standards that can support a multi-vendor environment, whether in manufacturing, design, or silicon lifecycle management.

“It’s a bit like a sandwich,” said Michael Ford, senior director of emerging industry strategy at Aegis Software. “You’ve got solution and standards. Solution includes the things that can be measured, can be reported, can be documented, but each one of those requires cost. So that’s your solution at one end. You then have standards, which make all of those different solution providers talk to each other to make that data available for the upper-level solutions, and then the next upper-level solutions. You can’t just define a standard and expect that to be the solution, because everybody who’s making an analytics solution may have different ideas on the algorithms. They may see things as having different levels of importance. In fact, the whole point of their solution is to find those levels of importance. You will have different players doing different things with different sets of data.”

Manufacturing equipment standards need to be extended, or new ones created, to effectively leverage equipment data for the real-time and predictive maintenance applications required for smart manufacturing. In addition to equipment automation standards, Finlay noted, “At the SEMI Semiconductor Components, Instruments and Subsystems Technology Community, we are actively working to drive lines-of-defense measurement improvements in the supply chain quality of critical materials and to improve critical parts cleanliness.”

Semiconductor test data plays a key role in enabling end-to-end analytics. Two emerging standards from SEMI, E183 (RITdb) and A4 (TEMS), aim to replace STDF and assist with data collection, management, and real-time decision making.

“Exchanging test data is like speaking in different languages,” said Mark Roos, CEO of Roos Instruments. “Part of this is due to limits of STDF in expressing results. Another is that test methods have a big effect on the meaning of data with no way to communicate them. This leads to the data exchange involving an interpreter, usually a test engineer. So, to automate this we need to automate the work a test engineer does in preparing data to share. Part of this is having formats that don’t limit the data which can be included. STDF and CSV files are not adequate. RITdb brings a standard to exchanging data, leaving us with how to exchange meaning. The easy approach is to assist the test engineers in finding, ingesting and evaluating data. Of course, this is what we are trying for with RITdb.”

TEMS offers other improvements. “For ATE, TEMS specifies tester event messaging for semiconductors,” said Teradyne’s Roth. “This will help with data structure and data cleaning across the industry. It also provides support for stream analytics capability.”
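A rough picture of what stream analytics on tester events enables: as each part-complete event arrives, a listener can update running statistics and flag drift immediately, rather than waiting for an end-of-lot file. The event fields below are invented for illustration and are not the TEMS (SEMI A4) message schema.

```python
# Minimal sketch of stream analytics on tester events. The dicts below
# stand in for messages; the field names are illustrative assumptions.
def watch_yield(events, window=100, floor=0.90):
    """Track rolling yield over the last `window` parts and flag drops.

    With a file-based flow (e.g., STDF), this check could only run
    after the lot finished; a message stream allows it per device.
    """
    recent = []
    for event in events:
        if event["type"] != "part_complete":
            continue
        recent.append(event["passed"])
        if len(recent) > window:
            recent.pop(0)
        if len(recent) == window:
            yield_rate = sum(recent) / window
            if yield_rate < floor:
                print(f"ALERT: rolling yield {yield_rate:.1%} on "
                      f"{event['tester_id']} below {floor:.0%}")

# Example feed: 100 passing parts, then a run of failures.
feed = [{"type": "part_complete", "tester_id": "ATE-3", "passed": True}] * 100
feed += [{"type": "part_complete", "tester_id": "ATE-3", "passed": False}] * 15
watch_yield(feed)
```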

Standards are crucial for traceability. “This traceability issue is a really challenging problem, and you can’t do that as one company,” said John Kibarian, CEO of PDF Solutions. “That’s why the standards bodies are so important. Put the intellectual property in the public domain, then work with the consumers of the data, the analytics, the factories that generate the data, and the equipment providers that ultimately really support or don’t support the standard.”

Two standards, SEMI T23 and SEMI E142, provide traceability that supports connecting data for end-to-end analytics. Various data analytics vendors support these standards, in particular for OSAT environments. As requests to support SEMI’s E142 standard grow, it gains even more traction.

“Cimetrix provides the software development kit to the equipment vendors to implement many SEMI standards. E142 is a SEMI standard for assembly traceability,” said Kibarian. “We’ve now seen over the last few months several fabless companies asking their OSAT vendors to use the E142 traceability on the assembly files. Those OSATs were putting out RFPs for the equipment vendors with the need to support E142. They, in turn, asked Cimetrix to support this SEMI standard. We are building that into the product. The next generation software development kit will support E142.”
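E142 defines an XML representation for substrate maps, letting assembly equipment and analytics tools agree on which die is which. The snippet below parses a simplified map with Python’s standard library; the element and attribute names are simplified stand-ins inspired by the concept, not the literal E142 schema.

```python
import xml.etree.ElementTree as ET

# Simplified substrate-map XML, loosely inspired by the idea behind
# SEMI E142. Element/attribute names are stand-ins, not the real schema.
MAP_XML = """
<SubstrateMap substrateId="LOT1234-W05">
  <Die x="42" y="7" bin="1"/>
  <Die x="43" y="7" bin="1"/>
  <Die x="44" y="7" bin="8"/>
</SubstrateMap>
"""

def load_map(xml_text):
    """Return {(x, y): bin} for every die on the substrate.

    A shared map format means the OSAT's pick-and-place, the fab's
    inkless map, and the analytics platform all read the same truth.
    """
    root = ET.fromstring(xml_text)
    return {
        (int(d.get("x")), int(d.get("y"))): int(d.get("bin"))
        for d in root.iter("Die")
    }

die_bins = load_map(MAP_XML)
print(die_bins[(44, 7)])  # bin 8 -> e.g., a failing or skip bin
```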

While standards represent a key pillar supporting end-to-end analytics, consortium efforts are sorting out best practices and defining a common framework that supports interoperability. To move from suggestion to adoption, industry players need to define a template for end-to-end data and analytics that supports the various engineering and business needs. In effect, it requires a marketplace.

Fig. 2: Supports for sharing data sources. Source: A. Meixner/Semiconductor Engineering

“There are standards for many of these data hand-off points in our business. But there is no governing body to regulate communications,” said McIntyre. “Consequently, these standards are suggestions, and adoption is entirely between the interested parties.”

But there is plenty of interest. “If that data is made available, everybody can make money,” said Katsioulas. “The question is which party makes the money. If we’re able to create a connected ecosystem via a digital thread, then consumers can request data from their producers to create their own data, which subsequent consumers can utilize. This becomes the incentive for the supply chain to work together. That’s how you build a digital ecosystem for data analytics where everybody can benefit.”

Conclusion
The semiconductor ecosystem relies on business-to-business relationships, partnerships, and collaborations. When it comes to the business of sharing data, standards, APIs, consortiums, and guidelines for best practices all play a role in navigating the complexities.

The eventual solution remains a bit cloudy, and most likely it will not be one-size-fits-all. What is clear is that industry players recognize that without a framework to share data, everyone will be adversely impacted.

“Many problems in the semiconductor industry are not generated in just one place. Fundamentally, there are links between those different domains,” said Paul Simon, group director of silicon lifecycle analytics at Synopsys. “That’s why it’s becoming more and more important that those things get linked. Otherwise, it becomes impossible. That’s also why the world is changing, because there’s a necessity. It’s not a ‘nice to have’ anymore.”

References
The Semiconductor Components, Instruments, and Subsystems (SCIS) Technology Community, SEMI.

Sun, Michelle. “Standard for Substrate Mapping Recently Updated,” SEMI.

Skvortsova, Inna. “Smart and Efficient Supply Chain Integration with SEMI Standards,” SEMI.



