Removing Barriers For End-To-End Analytics

What’s needed to effectively share data across the supply chain.

Parties are coming together, generating guidelines for sharing data from IC design and manufacturing through end of life, setting the stage for true end-to-end analytics.

While the promise of big data analytics is well understood, data sharing through the semiconductor supply chain has been stymied by an inability to link together data sources throughout the lifecycle of a chip, package, or system. The reasons range from disaggregation of the supply chain to operational differences and a variety of business agreements. As a result, engineering teams face hurdles combining useful data from design, manufacturing, and in-field use.

The semiconductor industry is aware of these challenges, as evidenced by a panel discussion on AI and big data sharing for improved factory efficiency at the 2022 SEMI Advanced Semiconductor Manufacturing Conference.

But companies are working together, as evidenced by the Global Semiconductor Alliance's Trusted IoT Ecosystem for Security (TIES) group, which is gathering companies to settle on best practices for enabling a trusted digital data thread from chip design to IoT applications.

If all goes as planned, the impact will be felt across the supply chain.

Determining a data-sharing framework
Barriers exist today for both IDMs and foundry/fabless ecosystem. Within an IDM, barriers can include connecting the design, fab, assembly, and test knowledge, which are separated by ‘right to know,’ lack of traceability and interoperability, and the narrowness of an internal organization’s focus.

“The vast majority of data (generated in manufacturing, test, or assembly) was historically intended to be used to evaluate a DUT, or to monitor the operational efficiency, accuracy, and precision of a particular machine,” said Jon Holt, VMS worldwide fab applications solutions manager at PDF Solutions. “The data generated was generally not intended to be used at downstream or upstream steps in the product lifecycle. Those other steps used different machines and generated different data. What resulted were silos of data.”

Those silos are sources of potential efficiency gains. “Legacy fabs and factories have all these silos of data — for example, lithography, etch, inspection, facilities, yield,” said Bill Pierson, vice president of semiconductor manufacturing at KX Systems. “It can be really hard to break down the silos because engineers say, ‘Why should I share data with this other group?’ Well, it really requires upper management and the respective engineering groups to recognize there’s value to sharing data internally and working together. The semiconductor fabs are sometimes slow in making these business changes. Part of that is due to when you are in a factory and it’s running well, why change it? One of the biggest steps upper management can take is to facilitate collecting and sharing data by breaking down the data silos for engineers to use across the organization.”
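In practice, the first step in breaking down those silos can be as simple as joining per-silo extracts on a shared key. Below is a minimal sketch in Python, with hypothetical tables and column names (lot_id, cd_nm, defect_count, yield_pct), of the kind of cross-silo correlation no single group could compute alone:

```python
import pandas as pd

# Hypothetical per-silo extracts, each keyed by lot ID. All table and
# column names here are illustrative, not any fab's actual schema.
litho = pd.DataFrame({"lot_id": ["A1", "A2", "A3"], "cd_nm": [45.1, 45.6, 44.9]})
inspection = pd.DataFrame({"lot_id": ["A1", "A2", "A3"], "defect_count": [3, 17, 5]})
yield_data = pd.DataFrame({"lot_id": ["A1", "A2", "A3"], "yield_pct": [98.2, 91.4, 97.8]})

# Joining on a shared lot identifier lets each group keep its own store
# while a common key allows correlation across process steps.
merged = litho.merge(inspection, on="lot_id").merge(yield_data, on="lot_id")

# A simple cross-silo correlation between litho CD, defects, and yield.
print(merged[["cd_nm", "defect_count", "yield_pct"]].corr())
```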

In the foundry and OSAT ecosystem, data sharing is more complicated. “With the foundry/fabless model we have a disaggregation of the IDM model. The supply chain has been disrupted. It gives us a lot of specialization and a lot of economies of scale,” said Tom Katsioulas, chair of GSA’s Trusted IoT Ecosystem for Security (TIES) group. “But we lost that continuity of what I would call the data thread associated with supply chain. It became much more complex. And, of course, security issues started creeping in. In which case, the question is really, ‘Can you trust the data?’”

Successfully connecting data from different points of the value chain requires a framework, and that is what makes end-to-end analytics realistic. Not all supply chain data has to be analyzed together, because engineers do not need all the data at once. But they may need access to additional data once they have analyzed an initial data set. Connected data helps drive downstream decisions, and it helps trace the cause of failures upstream to the point of origin.

“The data exchanges I have seen are around resolving failures. End-to-end analytics have not been a driving force, even within IDMs,” said Mark Roos, CEO of Roos Instruments. “When this occurs, the data sharing seems to be a back-and-forth request and demand flow. We need a way to emulate this.”

Rethinking business relationships
At the core of data sharing are the business relationships between entities. Within a company, this begins with acknowledging that one team’s data can be of value to another team. External to a company, parties must recognize that to achieve high yield and quality at minimal cost, they need to readily share data without fear of retribution when things go wrong. In both scenarios, companies need to invest in the infrastructure to facilitate access to and analysis of data.

“For IDMs, they control and own all the data, so there are no business barriers for them. For fabless semiconductor companies there are typically two major sources of data — foundry data and OSAT data,” said Greg Prewitt, director of Exensio solutions at PDF Solutions. “Access to product and machine data from these two sources usually is based on the contractual agreements in place between those parties, and these can vary widely from company to company.”

Fundamentally, business relationships need to shift from vendor-buyer competition to one of increasing collaboration.

“The business-to-business relationships and partnerships and collaborations are the most important part of building this semiconductor ecosystem,” said Brian Archer, software engineering director in Cadence’s Digital & Signoff Group. “You see this in how Cadence plans to interoperate with any given cloud provider for these analytics platforms. It could be a direct competitor or a partner to us. We recognize that as part of our enablement we have to play nicely in the ecosystem.”

Others agree. “I don’t see any individual company that actually plays in the entire space, covering all of the manufacturing data, the wafer data, the supply chain data, the in-life data,” said Aileen Ryan, senior director of portfolio strategy for Tessent silicon lifecycle solutions at Siemens EDA. “Everyone has very useful pieces. Some pieces will be more important to some customers than others. And that’s where customer choices are going to be made.”

In a data-driven economy, consumers need trusted data, and providers need financial compensation for their data. Both require secure data.

Consider a fabless design company that wants feedback from a foundry to improve its designs. As part of the transaction, the foundry should be able to provide information that assists the designers. Yet the foundry typically views this information as proprietary. Data ownership comes into question even during the design process, when data is shared between different tool vendors.

“Data ownership is the number one issue,” said Katsioulas. “My data-1 is used downstream to create data-2, and further down to create data-3. If data-3 (which contains data-2 and data-1) is monetized, how do you ensure that each data producer gets their fair share of the monetization? Security and trust among partners won’t work. What is needed is to secure and certify the source of that data and make it traceable downstream.”
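One way to make data sources certifiable and traceable downstream, in line with the point Katsioulas makes, is a hash-chained provenance record, where each derived dataset carries fingerprints of its parents. The sketch below is illustrative only; all names are hypothetical, and a real deployment would add digital signatures and a registry:

```python
import hashlib
import json

def fingerprint(payload: bytes, parents: list[str], producer: str) -> dict:
    """Return a provenance record binding a dataset to its producer
    and to the fingerprints of the datasets it was derived from."""
    record = {
        "digest": hashlib.sha256(payload).hexdigest(),
        "producer": producer,
        "parents": parents,
    }
    # Hash the record itself so downstream consumers can detect tampering
    # with either the data or its lineage.
    record["record_id"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return record

# data-1 -> data-2 -> data-3, as in the quote above.
r1 = fingerprint(b"raw fab measurements", parents=[], producer="foundry")
r2 = fingerprint(b"derived test models", parents=[r1["record_id"]], producer="osat")
r3 = fingerprint(b"fleet analytics", parents=[r2["record_id"]], producer="system-oem")
print(r3["parents"])  # traces back through r2, and from there to r1
```

With lineage recorded this way, any monetization of data-3 can be apportioned back to the producers of data-2 and data-1, which is the fair-share problem the quote raises.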

Concerns about who owns the data and how to keep it secure remain ongoing challenges.

“Security is a big deal,” said John Behnke, general manager of FPS product line at Inficon. “Nobody wants to share data for IP reasons. Nobody wants to share data at all because they’re just afraid of the security aspects.”

Others agree. “Everyone seems to think sharing data is important,” said Mark Roos. “But as soon as we make it easy (e.g. RITdb), security becomes a big issue and folks become scared or greedy — scared that they may leak some critical data, greedy because they may be missing a chance to charge money for the data.”

Addressing operational barriers
The business of sharing data, and the operational requirements to support that sharing, need to incorporate security, traceability, and interoperability. With the tremendous amount of data generated in the context of different use cases, understanding those use cases up-front can help guide data engineering choices and reduce the burden on data infrastructure design.

“In other industries, the focus is on making use of data engineers to manage and prepare the data so others can use it properly. The data engineers’ role is important as data volumes explode, and it becomes ever more critical as the value of data becomes perishable,” said Pierson. He said the most value is generated when users can act on data closer to real time. “At the end of the day, the goal is to provide tools for the operators and engineers to improve production and realize a greater ROI.”

While wrangling data is a key step in sharing it across the supply chain, security remains everyone’s concern.

Testing is the hub
The manufacturing test process illustrates concerns and solutions. During testing of wafers and units, the ATE becomes the nexus for interactions between upstream and downstream data that drive decisions and build models. Semiconductor suppliers possess a high sensitivity to having any data about their product being exposed. Consequently, ATE vendors have been creating computing/hardware solutions that support a zero-trust environment.

“We’re productizing a parallel compute resource for use in the test cell that will have a zero-trust environment,” said Eli Roth, smart manufacturing product manager at Teradyne. “This enables exporting data in real time from the tester, running a secure model that’s IP protected, and returning instructions back to the tester. Our customers would be able to generate their own models in a zero-trust environment, or we could use our own models. The location of the compute resource matters because you want that to be really, really fast. From a compute strategy perspective, there’s a balance between data security and performance.”
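The pattern Roth describes, in which the tester streams results, a colocated IP-protected model decides, and instructions flow back, can be sketched abstractly. The code below is a hypothetical illustration, not Teradyne's API; the SealedModel class stands in for a model running in a protected enclave, and the threshold logic is invented:

```python
from dataclasses import dataclass

@dataclass
class TestResult:
    die_id: str
    measurements: dict  # parametric results streamed from the tester

class SealedModel:
    """Stands in for an IP-protected model running in a zero-trust
    compute resource. The tester sees only the instruction it returns,
    never the model's internals or training data."""
    def decide(self, result: TestResult) -> str:
        # Placeholder logic: flag marginal parts for extended testing.
        return "retest_extended" if result.measurements["idd_ma"] > 95.0 else "pass"

def test_cell_loop(results, model: SealedModel):
    # Keeping this loop colocated with the tester is the latency point
    # Roth raises: the round trip has to be fast enough for inline decisions.
    for result in results:
        yield result.die_id, model.decide(result)

stream = [TestResult("d1", {"idd_ma": 80.0}), TestResult("d2", {"idd_ma": 99.5})]
for die, action in test_cell_loop(stream, SealedModel()):
    print(die, action)
```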

Advantest offers a data exchange capability that provides a zero-trust environment. In a recent press release the company emphasized that its centrally provided interface collects data in a secure, reliable, and standard way with fine-grained permission control by data owners.

With respect to test data format standards, Keith Schaub, vice president of technology and strategy at Advantest America, noted, “While this continues to be a challenge, much progress is being made, both with integrating potential new standards such as TEMS (Tester Event Messaging for Semiconductors) and RITdb (Rich Interactive Test Database), and with integrating data exchange networks that can plug into customers’ databases.”

Multiple industry experts highlighted the challenge of traceability. Without traceability there can be no connection between data sets, and this becomes particularly evident when names change as a design or product changes hands. That can lead to issues in defining the data granularity associated with a name.

“How do you define a product? Some people say a test program version is a product, because that’s what defines the product in the specifications of the product. Some people say no, a product is a reticle set, because that’s what defines the product on the wafer,” noted Paul Simon, group director of silicon lifecycle analytics at Synopsys. “All these things are semantics. It’s what you need to define. For example, what is equipment? Is it an etching machine or is it actually one of the chambers in that machine? If you want to do a defect yield correlation, and you don’t have traceability back to the chamber level, it’s quite useless.”
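A data model that records lineage at the chamber level, rather than stopping at the tool, is what makes the correlation Simon describes possible. The sketch below uses illustrative record types only; production systems typically follow standards such as SEMI E142 for substrate mapping and naming:

```python
from dataclasses import dataclass

# Illustrative genealogy records only, not a real fab schema.
@dataclass
class ProcessStep:
    lot_id: str
    wafer_id: str
    equipment_id: str   # the tool, e.g. "ETCH-07"
    chamber_id: str     # the granularity Simon argues is essential, e.g. "ETCH-07/CH-B"

def chambers_for_wafer(history: list[ProcessStep], wafer_id: str) -> list[str]:
    """Chamber-level lineage is what a defect-to-yield correlation needs;
    tool-level lineage alone hides chamber-to-chamber variation."""
    return [step.chamber_id for step in history if step.wafer_id == wafer_id]

history = [
    ProcessStep("LOT42", "W03", "ETCH-07", "ETCH-07/CH-B"),
    ProcessStep("LOT42", "W03", "LITHO-02", "LITHO-02/CH-A"),
]
print(chambers_for_wafer(history, "W03"))  # ['ETCH-07/CH-B', 'LITHO-02/CH-A']
```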

This is where standards and best practices fit in. “In operational product lifecycle management, it goes from when you start the design all the way to the end system,” said Michel Munsey, senior director of semiconductor solutions at Siemens EDA. “Connecting data from the different parts of the design process, there is design traceability. Names can change, so you need a way to consistently name design entities. For instance, when a design engineer synthesizes a block, they use this set of libraries. But another engineer can use a different set of libraries. They may yield differently. You want to learn from that data, but if there is inconsistent naming, the comparison between the two instances is nearly impossible.”

Standards and frameworks
Setting up a data exchange that guarantees security, describes the business relationship, and delivers payment will facilitate the sharing of knowledge. Industry standards can assist in breaking down operational barriers by facilitating standard formats, and consortiums can help in determining guidelines for doing business.

The need for interoperability will drive these efforts because such a focus can result in efficiencies and maintain flexibility.

“The ability to correlate across different elements of the lifecycle, manufacturing data and life data, that’s a whole other can of worms. I hope there will be a suite of standards put in place to enable it,” said Siemens’ Ryan. “Without that happening, you’ll be relying on manufacturers and vendors to enable it on a pair-by-pair basis. With this approach, the industry is going to end up effectively paying an integration tax on a pair-by-pair basis, depending on the toolset the customer wants to use. Ultimately, that burden will become very costly.”

Others agree. “Industry standards help drive certain structures and capabilities in the stored data. Without these industry standards there would be a lot less integration across different data types,” said Mike McIntyre, director of software product management at Onto Innovation. “The ability or the need to drive a standard for the data structures themselves may impinge too much on what makes providers in this space competitive.”

In terms of data sharing within a marketplace, the industry needs guidelines for interoperability and security. Those could provide the basis for a global query mechanism, so that companies can ask for specific information to support their specific business and engineering needs.
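What such a query mechanism might look like, reduced to its essentials: data owners register datasets with an access policy, and consumers ask narrow questions instead of pulling raw data wholesale. Everything below is hypothetical, and a real exchange would add authentication, signed requests, and audit logging:

```python
from typing import Callable

class QueryGateway:
    """Hypothetical permissioned query gateway; names are illustrative."""
    def __init__(self):
        self._datasets: dict[str, tuple[dict, set[str]]] = {}

    def register(self, name: str, data: dict, allowed_parties: set[str]):
        # A data owner publishes a dataset with a coarse allow-list.
        self._datasets[name] = (data, allowed_parties)

    def query(self, party: str, name: str, question: Callable[[dict], object]):
        data, allowed = self._datasets[name]
        if party not in allowed:
            raise PermissionError(f"{party} may not query {name}")
        # Only the answer leaves the gateway, not the dataset itself.
        return question(data)

gw = QueryGateway()
gw.register("osat_yield", {"LOT42": 91.4, "LOT43": 98.1}, {"fabless_co"})
print(gw.query("fabless_co", "osat_yield", lambda d: min(d.values())))
```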

“End-to-end analytics requires a common data language that can be collected at every stage, including in the field,” said Marc Hutner, senior director of product marketing at proteanTecs. “Only by correlating to usage data across stages can the industry achieve device learning for higher quality products and assure traceability. This will come with new approaches that enable collaboration between teams and companies. It also will require additional data at the device and production levels to ensure learning across the supply chain.”

Standards are important for data interoperability, but the new business relationships need a framework. “But when you talk about collaboration amongst multiple parties, it’s not so much the standards as it is the business framework,” said Siemens’ Munsey. “It comes down to how you interpret the contracts. That’s where a lot of these barriers start coming into place. Instead of having one-off contracts between everybody, negotiating every single detail of a contract, you need standardized contracts. We benefit from a framework that states, ‘This is how we’re going to do business as an industry, this is how we’re going to hand stuff off, this is the type of data that we need to share.'”

“End-to-end data and analytics already exist today,” said Prewitt. “One of the goals of GSA TIES is to provide best practices that the industry can adopt to create their own end-to-end solution, using commercial products where it makes sense for each organization. Standards take a very long time to develop. That is why the GSA TIES initiative is not creating new ones, but leveraging existing ones along with best practices to provide a template for end-to-end data and analytics for the semiconductor and electronics industries.”

Fig 1: Overview of the Trusted IoT Ecosystem for Security (TIES) group. Source: GSA TIES

Making this work is non-trivial. “First, you have to have a business incentive to establish your core supply chain monitoring and traceability capabilities,” said Katsioulas. “Note, this is not going to be solved through integration or pure standardization. It will be solved with unified, uniform interoperability guidelines and cybersecurity guidelines similar to the NIST Cybersecurity Framework, which is uniformly applied by all enterprises. Now, those are only guidelines. They don’t reinforce certain digitalization and security processes in a way that everybody does it in a consistent way. The second thing that has to happen is what I call the operating methodology — general guidelines so that every enterprise applies a consistent operating methodology with respect to digitalization, anti-tampering, security, and certification. And finally, there needs to be a consistent mechanism to essentially propagate data from the local storage of the enterprise to the global storages that can be shared in creating the next-generation marketplaces. The result is enabling data sharing and monetizing it.”

Conclusion
Understanding the underlying economic value of sharing data, and the impediments currently in place, can set the stage for the semiconductor industry and the rest of the supply/value chain to define solutions for data sharing across the supply chain.

“We’re at this inflection point that is going to completely change the way we think about the industry, because there is no one provider that’s going to be able to save the day,” said Inficon’s Behnke. “But what it means is we’re all going to have to work collectively in a kind of coopetition. If you have a digital twin, that’s great. But I still need yield data from another source. This is really a complete transformation of the semiconductor industry.”

Related Stories
Where And When End-To-End Analytics Works
Improving yield, reliability, and cost by leveraging more data.

Enablers And Barriers For Connecting Diverse Data Integrating multiple types of data is possible in some cases, but it’s still not easy.

Sharing Secure Chip Data For Analytics
Security practices are evolving to meet sharing data across siloed engineering teams, but they still have a long way to go.

Data Issues Mount In Chip Manufacturing
Master data practices enable product engineers and factory IT engineers to deal with variety of data types and quality.

Infrastructure Impacts Data Analytics
Gathering manufacturing data is only part of the problem. Effectively managing that data is essential for analysis and the application of machine learning.

New And Innovative Supply Chain Threats Emerging
But so are better approaches to deal with thorny counterfeiting issues.

Uniquely Identifying PCBs, Subassemblies, And Packaging
New approaches to preventing counterfeiting across the supply chain.

References
NIST Cybersecurity Framework
https://cybersecurity.att.com/resource-center/solution-briefs/nist-compliance-usm-anywhere

SEMI E142 standard
https://www.semi.org/en/standards-watch-2020Sept/revision-to-semi-e142

Introduction to GSA TIES
https://www.gsaglobal.org/iot/ties/ties-introduction-presentation/


