Sharing Secure Chip Data For Analytics

Security practices are evolving to support data sharing across siloed engineering teams, but they still have a long way to go.


New approaches and standards are being developed to securely share manufacturing and test data across the supply chain, moves that have long been considered critical to the reliability of end devices and faster time to yield and profitability.

It will take time before these methods become widespread in the IC supply chain. But there is increasing agreement these kinds of measures are essential to deal with rising complexities in manufacturing processes and the increasingly interrelated steps, which have ballooned with the slowdown in device scaling and a reliance on increasingly heterogeneous architectures and advanced packaging.

The challenges for sharing “just enough data” to “just those who need to know” are not new. But practices that have been in place for the past two decades have not really changed, and that’s causing problems. Many engineering managers still view IC manufacturing data as proprietary and highly confidential, and fear of competitors getting their hands on this data remains a persistent concern.

Nevertheless, this data can provide insights to customers and partners about product yield and quality, as well as factory operation metrics such as uptime and throughput. It’s particularly important because the supply chain is so complex that no one company, engineering team, or engineer has all the data in their hands. Even within an IDM, data silos exist due to need-to-know requirements for access to sensitive data.

When a crisis occurs, data can be shared, but it often is done in a painful and ad-hoc manner. With multiple players in the supply chain, plus the supporting equipment vendors, how data is shared directly affects engineers’ ability to maximize factory performance and manage a product’s yield/quality/cost triangle. Nearly everyone agrees this is a challenge the industry needs to overcome, and that changes are underway. But those changes are proving difficult to implement.

“This is a big challenge facing the industry. Data is IP,” said Keith Schaub, vice president of technology and strategy at Advantest America. “This is getting better as new proven, highly secure solutions and protocols are being adopted and integrated into the ecosystem.”

At least part of this is being driven by the fact that data is needed pre-manufacturing to develop design-for-test and design-for-yield strategies, while post-manufacturing field data is needed to monitor devices and must be looped back into the fab or OSAT.

“Data collection, storage, and management is increasingly important to our customers in yield and operations — especially when running volume scan diagnosis to help improve yield,” said Matt Knowles, director, operations product management at Siemens EDA. “If you look across the industry leaders, there is no one-size-fits-all solution for how this data is managed. The most successful companies enable secure data sharing across their suppliers to make a best-of-breed solution work best for them. In the past few years, we have seen an increase in collaboration and cooperation throughout the extended supply chain. This includes ATE providers, EDA companies, YMS (yield management systems) companies, and cloud services. We expect to see more of this collaboration in the future.”

Still, the proprietary nature of everyone’s data creates barriers that impede sharing with others. For collaboration to keep developing, companies need to manage not only how shared data is secured, but also who is granted access to it.

“As I learned about the semiconductor industry and its data management practices, I was shocked by the haphazard data accountability measures,” said Brian Christian, CTO of Flying Cloud Technologies. “When it’s created, who created it? Who is consuming it? And where is the data traveling? Just because the data pipe is secure doesn’t mean the two endpoints that the pipe is going to are secure.”

So focusing on encryption alone doesn’t stop data leaks. And without knowing exactly who has access to data, and when, a company’s information security management system is incomplete. Data security technologies and processes are required to monitor and audit access to electronic files. Such methods address privacy concerns in the semiconductor industry, and automating them supports the engineering goals of optimizing test flows, maximizing equipment uptime, and speeding up yield learning.

Value in data sharing
Sharing data between business partners within the semiconductor industry can have enormous upside, such as reducing unscheduled equipment maintenance and enabling predictive modeling based on aggregate test data and physical design descriptions.

The challenge is to overcome existing contractual agreements and the perception that all data is proprietary. In scenarios where data sharing has been shown to work, companies figure out the minimum amount of data that needs to be shared to get the job done.

Consider yield learning from ATPG test failures. Within a large SoC, EDA diagnosis tools can isolate a failure to one or several candidate nets. To make use of that data, failure analysis engineers require the physical design files — Library Exchange Format (LEF) or Design Exchange Format (DEF).

For an IDM, there are no barriers to access that data. For fabless companies, though, that data needs to be shared.

“If the foundry goes ahead and wants to do the whole analysis themselves running our tool and looking at the test data, what they’re missing is the design — to know what metal layers the failures could be on they need LEF/DEF files,” explained Guy Cortez, staff product marketing manager for silicon lifecycle management at Synopsys. “The fabless design company has a valid concern in sharing that design information. It’s IP. So the design data needs to be obfuscated — i.e., net names and circuit blocks given arbitrary numbers. The tool also only provides specific design elements. It doesn’t give the whole thing.”
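In practice, that kind of obfuscation can be mechanically simple. Here is a minimal sketch, assuming hypothetical net names and a keyed hash chosen purely for illustration; it does not represent how any particular EDA tool implements the mapping:

```python
# Minimal sketch of net-name obfuscation before sharing diagnosis data.
# Hypothetical approach for illustration -- not any specific tool's mechanism.
import hmac
import hashlib

SECRET = b"fabless-company-private-key"  # held only by the design owner

def obfuscate_net(net_name: str) -> str:
    """Map a real net name to a stable, arbitrary identifier."""
    digest = hmac.new(SECRET, net_name.encode(), hashlib.sha256).hexdigest()
    return f"net_{digest[:12]}"

def export_candidates(candidate_nets: list[str]) -> dict[str, str]:
    """Share only the suspect nets, under obfuscated names.
    The owner keeps this mapping to translate results back later."""
    return {net: obfuscate_net(net) for net in candidate_nets}

mapping = export_candidates(["cpu_core/alu/add32/carry_3", "mem_ctrl/rd_en"])
print(mapping)  # the foundry sees only the right-hand identifiers
```

Because the mapping is keyed, the foundry can report findings against the arbitrary identifiers, while only the design owner can translate them back to real nets.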

The same kinds of data-sharing issues crop up between the foundry/OSAT and the equipment companies, which sell to multiple foundries and OSATs. Sal Dilorio, principal consultant at Semi-Tech group, said this is a long-running problem, and recounted a casual discussion that led to a much deeper collaboration.

“Semi companies do a lot of their own servicing,” Dilorio said. “I asked, ‘Could you use more information? I could give you a printout of all of the problems discovered from the unscheduled maintenance.’ He looked at me with eyes like saucers. So we set up a non-disclosure agreement stating information about [only his company’s] etchers would be shared. The director eventually signed up about five other companies. At the end of each month, we’d get a report that says, ‘Here’s your distribution of failures. Here’s the overall distribution of failure.’ You’d know where you were versus the whole population.”

This data sharing eventually led to an equipment upgrade that resolved most of the downtime problems. Dilorio said that helped all six companies sharing maintenance data with the equipment maker.
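A report like the one Dilorio describes boils down to comparing each subscriber’s failure distribution against the pooled fleet. A toy sketch, with invented failure categories and counts:

```python
# Sketch of the anonymized fleet benchmark described above. Each subscriber
# sees its own failure mix next to the pooled population. Failure categories
# and counts are invented for illustration.
from collections import Counter

def distribution(counts: Counter) -> dict[str, float]:
    total = sum(counts.values())
    return {k: counts[k] / total for k in counts}

# Per-company unscheduled-maintenance events for the month (hypothetical).
fleet = {
    "company_A": Counter({"rf_match": 9, "chuck": 2, "gas_panel": 1}),
    "company_B": Counter({"rf_match": 3, "chuck": 7, "gas_panel": 2}),
}

population = Counter()
for counts in fleet.values():
    population.update(counts)

# Each company receives only its own row plus the pooled distribution.
report_for_A = {
    "yours": distribution(fleet["company_A"]),
    "population": distribution(population),
}
print(report_for_A)
```

Each company receives only its own numbers plus the anonymous pooled distribution, mirroring the NDA terms described above.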

A hesitant industry
Part of the reluctance for such collaboration stems from the nature of the business relationship. Everyone views their data as precious, and they want assurances it will be treated as such by the other party. Companies that offer yield management systems and test measurement analytics hear this all the time.

“The manufacturing data, as well as quality and reliability information gathered during these processes, have direct impact on the company’s business, such as direct cost, warranties, and liabilities, and is one of their key secrets,” said Nir Sever, senior director of product marketing at proteanTecs. “It’s not just security. Security is about making sure that the data is encrypted when it is transferred to the software. Privacy is about who is allowed to look at the data once it’s already loaded into the cloud on which the software will run. Platform vendors need to understand and respect this, but at the same time build in the data security and access controls to alleviate these risks.”

Securing the whole pipeline
The two previous scenarios provide examples of the value-add, as well as the problems encountered in the current state of data sharing — narrow data sets and ad-hoc exchange. To widen data sharing and facilitate engineering goals, the data pipeline between business entities needs technology upgrades.

Companies that provide yield management systems and test data analytic test platforms all acknowledge their customers require secure systems. Multiple companies often contribute data across the supply chain, each with its own security concerns, and every business relationship establishes the amount of data that can be shared and with whom.

“Hybrid systems will likely be necessary for security and performance reasons for the foreseeable future, where security technologies and certifications (ISO 27001) will be key,” said Paul Simon, group director for silicon lifecycle analytics at Synopsys. “The onus really falls on solution providers to provide the necessary agility, robustness, and ease of use that is required by semiconductor companies. This includes agility, by way of providing a cloud and on-prem offering. But it also includes a third option, where companies want not only on-prem, but require that no data communication flows outside their firewalls and they control all aspects of the data.”

Like other ISO certifications, 27001 specifies the requirements an organization needs to establish, implement, maintain, and continually improve its information security management system. There are established metrics for the effectiveness of encryption technology. For accountability of semiconductor data, that means monitoring the who, what, when, and where of data access.
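At its simplest, that accountability takes the form of an access record capturing those four dimensions. A minimal sketch; the field names are illustrative and not any vendor’s schema:

```python
# Minimal sketch of a who/what/when/where access record, the kind of
# accountability ISO 27001-style controls call for. Field names are
# illustrative, not any vendor's schema.
import json
from datetime import datetime, timezone

def log_access(user: str, resource: str, action: str, origin: str) -> str:
    record = {
        "who": user,                                       # authenticated identity
        "what": {"resource": resource, "action": action},  # read/modify/share
        "when": datetime.now(timezone.utc).isoformat(),
        "where": origin,                                   # host, site, or network
    }
    return json.dumps(record)  # append to a tamper-evident store

print(log_access("fa_engineer_17", "lot_42/wafer_07/stdf", "read", "osat-site-b"))
```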

Fig. 1: Data pipeline at the simplest level. Source: Semiconductor Engineering/Anne Meixner

The sharing of data between manufacturing sites occurs over networks that differ in implementation for IDMs and fabless/foundry ecosystems. This might include, for instance, sending an assembly map from wafer probe performed at one factory to the assembly and test factory.

“If data is transferred between two manufacturing sites within the same company, then communication of data typically happens behind a company’s own internal firewall on secured networks,” said Mike McIntyre, director of software product management at Onto Innovation.

Data also must be shared between a fabless design house and its manufacturing partners. “Any data request from the fabless design house is messaged or published to a secured inbox on the manufacturing side. Typically, these access points need two-factor authentication to open and download the file,” said Melvin Lee Wei Heng, field application and customer support manager for software at Onto Innovation. “Then the file itself may have its own encryption key that is communicated through e-mail and must be applied before opening.”
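The per-file protection Lee Wei Heng describes can be sketched with symmetric encryption. The example below uses Fernet from the Python cryptography package purely as an illustration; the actual ciphers and key-exchange channels vary by company:

```python
# Sketch of a file carrying its own encryption key, as described above.
# Fernet (from the `cryptography` package) is used only as an illustration;
# in practice the key travels out-of-band (e.g., by e-mail) to the recipient.
from cryptography.fernet import Fernet

file_key = Fernet.generate_key()      # per-file key, shared out-of-band
cipher = Fernet(file_key)

plaintext = b"wafer map: lot 42, wafer 07 ..."
token = cipher.encrypt(plaintext)     # what lands in the secured inbox

# The recipient applies the separately communicated key before opening.
assert Fernet(file_key).decrypt(token) == plaintext
```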

Sharing data across the disparate players in the supply chain has resulted in most analytics companies hosting some sort of secure data transport at a manufacturing site. This is evident with test data.

“Data security is very, very important to our customers, and with all this data you might be passing data across some public networks,” said Jeff David, vice president of AI Solutions at PDF Solutions. “If you’re collecting data and you have several different test partners involved in testing a chip product, to share data between sites you have to pass it over public networks. So that data has to be encrypted. You also have to transport data between the DEX (Data Exchange) node and Exensio, which is our platform, and that can reside on the cloud. Yet all these test nodes are encrypted, and the data is private. The data is transmitted over an encrypted tunnel. It is important to note that only the intended destination server can decrypt transport payloads.”
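The property David describes, where only the intended destination server can decrypt transport payloads, is what public-key encryption provides. The sketch below uses PyNaCl’s SealedBox as an illustration; it is not PDF Solutions’ actual transport, which is not detailed here:

```python
# Sketch of "only the intended destination can decrypt," using PyNaCl's
# SealedBox purely as an illustration of the underlying idea.
from nacl.public import PrivateKey, SealedBox

server_key = PrivateKey.generate()    # lives only on the destination server

# Any test site can encrypt to the server's public key...
payload = SealedBox(server_key.public_key).encrypt(b"STDF batch, lot 42")

# ...but only the holder of the private key can read it.
assert SealedBox(server_key).decrypt(payload) == b"STDF batch, lot 42"
```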

Encrypting data makes it secure — at least while it’s encrypted. But determining who sees what data requires other technologies and processes, and this data privacy quandary presents a challenge for the adoption of smart manufacturing within the semiconductor industry. As Dilorio wrote, “Two of the primary issues that keep coming up in the SEMI Smart Manufacturing meetings, workshops and discussions are, ‘We need to share data,’ and ‘We don’t trust you with our data.’”

New approaches are being taken. One involves treating fab equipment like a patient in a medical setting. Applying FDA/HIPAA-compliant systems to fab equipment seemed natural to Dilorio, who had worked in that industry, where specific regulations detail data security for electronic medical records. “Records have to be non-editable unless you have permission,” he said. “There has to be a full audit trail. You can’t change anything without telling somebody why. Even if you delete things, it’s still in the archive. You have to know when anybody looks at anything.”

That same monitoring and auditing mindset has been brought into the semiconductor factory through a collaboration between Flying Cloud Technologies and KLA. In a 2019 SEMI webinar, Ravi Bhagat, senior director of IT infrastructure and security services at KLA, and Flying Cloud’s Christian described the data accountability problems they solved with the processes and technology they developed. KLA’s goals included:

• Identify, monitor and protect IP, specifically product and chip design documents from KLA and its customers.
• Show good data governance and chain of custody.
• Maintain an audit trail of who, where, when, and how content was accessed, modified, or distributed.

The companies piloted two scenarios, one of which focused on product lifecycle documentation in Dassault’s Enovia PLM system in the context of crossing international borders. This is important because of the export and data-sharing regulations between countries, particularly those involving leading-edge technology.

Together, the process and technologies provided auditing and tracking capability at both ends of the data pipe, from creation to consumption. Blockchain technology supplied the decentralized ledgers that record the chain-of-custody details.
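The chain-of-custody property a blockchain ledger provides can be shown with a toy hash chain: each entry commits to its predecessor, so any after-the-fact edit is detectable. A simplified, single-party sketch; the real system distributes the ledger across parties:

```python
# Toy hash-chained ledger illustrating tamper-evident chain of custody.
# Simplified for illustration; a real deployment is distributed.
import hashlib
import json

def add_entry(ledger: list[dict], event: dict) -> None:
    prev = ledger[-1]["hash"] if ledger else "genesis"
    body = json.dumps({"prev": prev, "event": event}, sort_keys=True)
    ledger.append({"prev": prev, "event": event,
                   "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify(ledger: list[dict]) -> bool:
    prev = "genesis"
    for entry in ledger:
        body = json.dumps({"prev": prev, "event": entry["event"]}, sort_keys=True)
        if entry["prev"] != prev or entry["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev = entry["hash"]
    return True

ledger: list[dict] = []
add_entry(ledger, {"doc": "plm_spec_001", "action": "created", "by": "eng_a"})
add_entry(ledger, {"doc": "plm_spec_001", "action": "accessed", "by": "site_b"})
print(verify(ledger))               # True
ledger[0]["event"]["by"] = "other"  # tamper with history...
print(verify(ledger))               # False -- the edit is detectable
```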

“We know when it’s created, when it was checked in, where did it go, who accessed it, where did it flow to, how was it used,” said Flying Cloud’s Christian. “Once we started having all of this data, we built analytics around it. You can begin to identify the key creators and consumers inside the organization. In addition to identifying possible data leaks, the analytics enable you to understand how your data is being used within an organization.”

Not seeing the data, but still being able to use it, provides the “zero-trust” environment that can support smart manufacturing applications. But using that unseen data on ATE is a very recent development. From an ATE vendor’s point of view, how do you enable customers to use outside data and predictive models to optimize test processes and meet yield and quality objectives in a cost-efficient manner? The recent partnership between Advantest and PDF Solutions aims to provide such an environment.

“The customer owns the test program,” said Advantest’s Schaub. “We don’t own it. But they want to do some miraculous things with that data on our tester. So we had to build a shield around that data to provide that capability for our customers. In this zero-trust environment you don’t actually know where that tester is, and you don’t know who’s running it. You own the test program. You wrote it, but you gave it to someone, and now they’re running it, and your data is coming off of that tester. In addition, you want to provide external data to drive decisions in your test program. And you want to keep all of that data exchange highly secure. So that’s what ACS Edge endeavors to provide — it’s a zero-trust, secure production server environment that customers can utilize for live, in-situ predictions.”

Conclusion
For the most part, data sharing between semiconductor business partners remains either minimal in the amount of data shared, or it occurs on an ad-hoc, crisis-driven basis. This impedes semiconductor engineers from delivering cost-effective and efficient manufacturing solutions.

Developments over the past five years demonstrate that new data management processes and technologies can facilitate data sharing while addressing both security and privacy requirements. With continuing discussions about the value of sharing data, the semiconductor industry needs to become more proactive in facilitating such sharing, rather than leaving it ad hoc. But this can only be done if companies design data sharing to address the security and privacy concerns of data owners.

— Susan Rambo contributed to this story.

Related Stories

Predicting And Avoiding Failures In Automotive Chips

Infrastructure Impacts Data Analytics

Cloud Vs. On-Premise Analytics

Data Issues Mount In Chip Manufacturing


