Sharing data from design to the field can improve reliability, but it raises other questions for which there are no clear answers today.
Experts at the Table: Semiconductor Engineering sat down to discuss changes in test that address tracing device quality throughout a product's lifetime, and overarching concerns about data ownership and privacy, with Tom Katsioulas, CEO at Archon Design Solutions and U.S. Department of Commerce IoT advisory board member; Ming Zhang, vice president of R&D Acceleration at PDF Solutions; and Uzi Baruch, chief strategy officer at proteanTecs. What follows are excerpts of that conversation, which was held in front of a live audience at SEMICON West's Test Vision Symposium. A link to part one of this discussion appears at the end of this article.
[L-R] Ming Zhang, PDF Solutions; Uzi Baruch, proteanTecs; Tom Katsioulas, Archon Design Solutions. Source: Semiconductor Engineering/Susan Rambo
SE: How can the industry ensure system-level reliability in autonomous cars, for instance, where a $1 chip could be switching power devices that make the car drive? What’s needed for the industry to bring the necessary level of self-test from the chip to the system?
Baruch: We look at the parametric trends that are happening from a functional standpoint. The monitor actually makes the chip a sensor for the system. It senses the power distribution. In the car, I look at the functionality aspects, like how the software is impacting a chip from an operational stress standpoint. All of it is parametric and over time. But remember, we're talking about monitoring, not sampling.
Katsioulas: The big issue here is what happens with the $1 chips. A question came up with DFT many years ago — did people find it useful? Everybody said, 'Oh, it adds real estate.' Today, it's a must. So let's get to the data. There certainly will be chips in automotive, industrial, and aerospace/defense that may not necessarily be high volume, but they have enormous value. Those will lead the way in creating applications from the data coming out of the $20 and $30 chips, not the $1 chips. Once the value is established, I guarantee people will be willing to put this silicon area into both.
SE: What barriers exist to implementing such changes?
Katsioulas: The industry is disaggregated, and we are very focused on domain expertise. That's why you see this today. The industry hasn't leveled up enough. But there's no reason you shouldn't be able to do it, and right now we're seeing companies move toward system-level architectures. None of that will happen, though, unless you start bringing total test solutions that cut across domain expertise. Today we have specialization that prevents that from happening.
SE: Chipmakers are running into problems where that methodology isn’t good enough to find some of the defects. For this ecosystem to work going forward, can it tie in-field monitoring back into the rest of the test flow?
Baruch: One of the things those monitors create is a common language. So when something happens in the field, and you want to root cause it back to either system-level test or ATE, you have a snapshot of data that enables you to do an apples-to-apples comparison across ATE, functional test, and the field. Without it, at best you're relying on heuristics to do the correlations and understand what happened. That's why most things end up as 'no trouble found.' What's interesting is that you may need to look at much wider datasets, sometimes to root cause back to a process issue. Maybe it's something during the assembly stages, or whatnot. That's where collaboration with PDF Solutions makes a lot of sense, because they enable collecting a lot of datasets across the manufacturing stages, from test, assembly, inspection, and process to design. Augmenting that data with something that moves along with the device into the field gives you a much more robust ecosystem for troubleshooting and analyzing data.
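To make that comparison concrete, here is a minimal sketch in Python of the kind of cross-insertion correlation Baruch describes. The record fields, parameter name, and drift threshold are illustrative assumptions, not proteanTecs' or PDF Solutions' actual data model: per-device parametric readings from ATE, system-level test (SLT), and the field are keyed by device ID so the same parameter can be compared apples-to-apples.

```python
# A minimal sketch of cross-insertion correlation. All field names and
# thresholds are hypothetical; real deployments would also carry timestamps,
# test conditions, and lot/wafer genealogy.

# Per-device parametric records from three insertions: (device_id, parameter, value).
ate_data   = [("D001", "ring_osc_freq_mhz", 512.0), ("D002", "ring_osc_freq_mhz", 498.3)]
slt_data   = [("D001", "ring_osc_freq_mhz", 508.7), ("D002", "ring_osc_freq_mhz", 455.1)]
field_data = [("D001", "ring_osc_freq_mhz", 505.2), ("D002", "ring_osc_freq_mhz", 421.9)]

def index_by_device(records):
    """Key readings by (device_id, parameter) for apples-to-apples joins."""
    return {(device_id, parameter): value for device_id, parameter, value in records}

ate, slt, field = (index_by_device(r) for r in (ate_data, slt_data, field_data))

# Flag devices whose in-field reading drifted well beyond the ATE baseline.
DRIFT_LIMIT = 0.05  # 5% -- an illustrative number, not an industry standard
for (device_id, parameter), baseline in ate.items():
    in_field = field.get((device_id, parameter))
    if in_field is None:
        continue  # device never reported from the field
    drift = abs(in_field - baseline) / baseline
    if drift > DRIFT_LIMIT:
        print(f"{device_id}: {parameter} drifted {drift:.1%} from ATE baseline "
              f"(ATE={baseline}, SLT={slt.get((device_id, parameter))}, field={in_field})")
```

The same join, extended with assembly and inspection genealogy, is what would let a 'no trouble found' return be traced back to a process or assembly step instead of a dead end.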
Zhang: Data is the driver for improvement. You brought up the point that the current test methodology is broken. I would say the test methodology is out of date. The reason it's out of date is that three new things are happening — domain-specific architectures, vertical integration, and, at the system level, chiplets and heterogeneous integration. If we recognize that the future will be mastered by the companies that own not only the systems, but also the chips and components, then this view from design to manufacturing to deployment to lifecycle is very important. If we can agree on that, we can update the test methodology.
Katsioulas: And there's a trend of virtual re-integration with companies like Google, AWS, and Microsoft. We still have a long supply chain, and the way it will come together is through some virtual IDM replication with an infrastructure that connects them. That infrastructure will come out of collaboration among the chip supplier, the foundry, the OSAT, the EMS vendor, and so on, because at the end of the day it's data in and work flows out. If you digitalize those, you can start building the infrastructure to correlate the data from chip testing all the way to PCB testing, which is what these verticals care about.
SE: Earlier you asked how many test engineers would like to get paid for their test data. How might that become a reality?
Katsioulas: The IoT revolution will make it a reality, because IoT is based on applications, on big data that will be monetized, and on solutions that come from devices. So by the time you start using this data, the first question asked is about data security. Is it trustable? Then you start working backwards through that chain. If you think about it, the only way you can actually monetize the data at the end application is by being able to create a digital thread all the way from design to manufacturing. The answer is not here today, but it certainly will be in the future. Everybody recognizes that the future is in the data. The chip industry is changing right now. They say, 'We sold the razor, now we sell razor blades.' And the razor blades are services. I can sell a lot of services on the chip, and those are based on the data the machine is producing. That's possible today. You see this at Intel with 'pay as you go.' It's fundamentally the ability to provision the chip to offer new services.
Zhang: If the stuff you're providing does more things, maybe I'm willing to pay for it. The other aspect is, how much value are we really bringing at the same nominal cost? That's a nominal cost of 3% of the die area, or 15 seconds of test time. We need to get more stuff out, get more data out early, and turn more of that data into insights with more precision and accuracy.
SE: I bought a new car not too long ago and my car asked me, “Do you want to share your data?” I’m curious how much will you pay me for my data?
Baruch: There's no privacy aspect to the actual operation of what you're doing. We're looking at physical measurements that are happening on the device itself. I don't know if that computer is doing one plus one. The actual usage of that data is to protect you. From a warranty standpoint, for example, the car company would not ask you for that data for the sake of monetizing something that belongs to your car. It will run applications in your car that take advantage of the data to help, for example, with functional safety. It wouldn't allow you to monetize it, because it's not measuring your behavior or your preferences or anything personalized to you — at least not with that kind of technology.
Katsioulas: The question is who owns the data. If I build an IP, my IP produces data, and I own the data for that IP. Then AMD takes that IP, puts it in a chip, and creates new data. AMD owns the derivative data. With a digital thread in the value chain and nested licensing of data, we cannot go to you and say how much we're paying for your data, because your data comes from an IP, from a chip, from a PCB, from a device, from an application. So the question becomes, how much are you willing to pay for the portion of the data that applies to you?
SE: So let's examine the privacy statement. There is no privacy issue with parametric data. But if somebody can capture that data, it could contain competitively sensitive information. For example, if you look at Amazon versus Google in their data centers, data showing how their SoCs age over time could lead to a conclusion that one is better than the other.
Katsioulas: The number one question to the IoT advisory board for the Department of Commerce is, "How do you protect data confidentiality?" This is not a privacy issue. Privacy is with the consumer. Confidentiality is with the enterprise. I'm an enterprise, and I produce the data. I keep it inside. What we're proposing is that people don't share data. People selling information share metadata indices about the data in order to create a marketplace for data producers and consumers. The transaction of selling and sharing the data is a private transaction between two parties. So you need a digital thread of data in a nested license key scheme, where everybody who touches a piece of data can propagate it downstream. If Cisco develops a device from chips and starts generating service agreements with data, where each piece of data came from is relevant. Data from a monitor Cisco designed is owned by Cisco. Data generated by the chip is owned by whoever designed that chip.
Baruch: In the end, the monitoring is owned by the silicon company that implemented it. The reason I say that is because, to extract the data out of that device, you need the framework code to enable it. This is something the silicon company owns 100%. Actually, it's even more than that. The moment this chip goes into a PCB, it's enabled by a different company, one that bought the chip and assembled it into the whole system. That data would not be available on the system unless the chip company allows it. You have the chip company, the system company, and an OEM. Enabling that data across the thread requires the consent of all three. But the core of it comes from the chip.
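To illustrate the consent chain Baruch outlines, here is a minimal Python sketch. The party names and the policy structure are assumptions made for illustration, not a description of any deployed licensing scheme: a piece of field data is releasable only when the chip company, the system company, and the OEM have all consented.

```python
from dataclasses import dataclass, field

@dataclass
class DataRelease:
    """A request to share in-field monitor data along the value chain.

    The three parties mirror the discussion above: the chip company that
    implemented the monitors, the system company that assembled the PCB,
    and the OEM operating the end product. All three must consent.
    """
    dataset: str
    consents: dict = field(default_factory=dict)  # party -> bool

    REQUIRED_PARTIES = ("chip_company", "system_company", "oem")

    def grant(self, party: str) -> None:
        """Record one party's consent to release this dataset."""
        if party not in self.REQUIRED_PARTIES:
            raise ValueError(f"unknown party: {party}")
        self.consents[party] = True

    def is_releasable(self) -> bool:
        """Data flows only when every party in the thread has consented."""
        return all(self.consents.get(p, False) for p in self.REQUIRED_PARTIES)

# Example: the chip company and OEM consent, but the system company has not.
release = DataRelease("device_aging_trend")
release.grant("chip_company")
release.grant("oem")
print(release.is_releasable())  # False -- the system company's consent is missing
release.grant("system_company")
print(release.is_releasable())  # True
```

A nested license key scheme of the kind Katsioulas describes would extend this so each party's grant also carries the terms it inherited from the party upstream.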
Katsioulas: I will leave you with this thought — the world will become data producers, data consumers, data enablers, and data analyzers. If you start thinking that way, we can also understand how data ownership will proliferate.
View part one of this discussion: Improving Reliability In Chips.
Related Reading
Leveraging Chip Data To Improve Productivity
Collecting, analyzing and utilizing data can pay big benefits for design productivity, reliability, and yield.