Domain Expertise Becoming Essential For Analytics

More data doesn’t mean much unless you know what to do with it.


Sensors are being added to everything, from end devices to the equipment used to make those sensors, but the data being generated has limited or no value unless it's accompanied by domain expertise.

There are two main problems. One is how and where to process the vast amount of data being generated. Chip and system architectures are being revamped to pre-process more of that data closer to the edge because it’s too resource-intensive to send that data back and forth. That leads to the second issue, which is that some data is critical and needs to be acted upon immediately, while other data is either useless, or useful only in the context of data gathered over time or in specific situations.
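
A minimal sketch of that triage, in Python, assuming a hypothetical sensor feed and an arbitrary critical limit (neither comes from the article): readings that cross the limit are acted on at the edge immediately, while routine readings are buffered locally and only a summary is forwarded upstream.

```python
# Hypothetical edge-side triage. The sensor name, limit, and summary format
# are illustrative assumptions, not any particular vendor's implementation.
from dataclasses import dataclass
from statistics import mean
from typing import List

@dataclass
class Reading:
    sensor_id: str
    value: float

CRITICAL_LIMIT = 95.0          # assumed limit for immediate action
batch: List[Reading] = []      # routine readings buffered at the edge

def act_immediately(reading: Reading) -> None:
    # Placeholder for a real response, e.g. stopping a tool or raising an alarm.
    print(f"ALERT {reading.sensor_id}: {reading.value} exceeds {CRITICAL_LIMIT}")

def handle(reading: Reading) -> None:
    """Triage at the edge: act now, or buffer for later summarization."""
    if reading.value >= CRITICAL_LIMIT:
        act_immediately(reading)
    else:
        batch.append(reading)

def flush_summary() -> dict:
    """Forward only an aggregate upstream instead of every raw sample."""
    if not batch:
        return {}
    summary = {"count": len(batch), "mean": mean(r.value for r in batch)}
    batch.clear()
    return summary

# One critical reading triggers a local alert; the rest travel as a summary.
for v in (72.0, 96.5, 70.1, 69.8):
    handle(Reading("temp-01", v))
print(flush_summary())   # e.g. {'count': 3, 'mean': 70.63...}
```

The design choice is simply to spend bandwidth and compute only on what the domain says matters, which is exactly where the expertise comes in: someone has to decide what counts as critical.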

But figuring out what gets processed where depends upon knowledge of the market segment, the context of that data, and the use models. All of those factors can vary from one application to the next, and even from one company to the next in the same market.

Consider what happens in chip design, for example. “Domain knowledge is important, but with a twist—not in the traditional sense,” said Mike Gianfagna, vice president of marketing at eSilicon. “Where domain expertise comes in is when you bring up a chip. A working chip is not the end goal. You need to do more than that. You need to deliver the performance of a chip in a system. The issues are with the protocol, the package and the board interaction. There’s also IP interaction. So now you’re using the domain expertise of the end customer to bring up a chip and adding in substrate vendors, package vendors, IP vendors and firmware vendors. Those are all relevant to success.”

It’s no less complicated on the manufacturing side, where the number of different steps and the order of steps can affect the value of that data at any particular instant in time.

“It’s not just about making data analytics faster,” said David Park, vice president of marketing at PDF Solutions. “It’s also recognizing the need for additional sets of data that you might not have. That requires putting instrumentation into the design being manufactured, and it has to be done in a way that is not damaging to the design. For example, you could look at instrumentation with e-beam and find things at Metal 0.”

Understanding where and when to add that data typically requires a deep understanding of the application, the market, and how that data can and should be used. But in many cases, there is no precedent because the markets or technologies are so new.

“The way Lam is addressing this is to focus on the critical problems we should be solving for our customers—what are the key differentiators that will enable our customers to lower their cost of ownership, to boost their productivity, and to maximize their yield—and prioritizing those kinds of applications,” said Rick Gottscho, CTO of Lam Research. “That will drive your data strategy, including what kinds of data you’re going to generate, what form, how you’re going to massage it and transform it. It’s all about the value-added application—what you do with the data—that should drive that strategy.”

Behind the scenes, chipmakers, tools and equipment vendors, and systems companies have been adding expertise wherever they can find it, either by training their own teams or through acquisitions. There is a widespread recognition that domain expertise is essential for growth, and it has changed how companies are viewing AI these days. In the past, AI was considered a way of replacing highly trained and expensive professionals in a wide range of markets, from radiology to financial investing. That view has shifted dramatically over the past 12 to 18 months.

“The goal is to present things you can do with very high confidence, so this isn’t just a black box,” said Jeff Welser, vice president and lab director at IBM Research Almaden. “The goal is to augment what an expert can do. There are things AI does very well. It might be really accurate, but it still makes mistakes. You need humans there to understand the context, and in some cases there are ethics involved.”

Building domain expertise
Adding domain expertise isn’t always as simple as hiring people who have worked in a particular field for a period of years. With new technology—self-driving cars, medical sensors, industrial IoT, or anything involving AI—that expertise usually requires multiple players and a different way of collecting and analyzing data.

“We have a group of automotive experts,” said Uzi Baruch, vice president and general manager of the automotive business unit at Optimal Plus. “When we have an engagement, these guys will go on a factory tour and learn what’s going on, and then apply what they’ve learned in past engagements. We already recognize what’s going on at the initial visit. From there we need to ask the right questions, and then we can come back and say, ‘This is the data we need to get.’ We recently worked with a Tier 1 company on a project involving welding operations. We know there are certain characteristics involving welding, but at the end of the welding operation normally you have a visual inspection. So we looked at the images as a starting point. This is one touch point. But it doesn’t stop there. Then you go back and look at functional test before that. And then you go back to the welder. So you start from the imaging and inspection part, and then you shift left. You go back in the process.”

Once all of that is in place, the data can be analyzed. “Machine learning, which is already on the welder, can predict the characteristics that would end up in a bad image,” Baruch said. “From there you can figure out what’s wrong and alert the MES (manufacturing execution system) not to continue with that part.”
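
A minimal sketch of that kind of predictive gate, assuming hypothetical weld parameters, invented training data, and a generic scikit-learn classifier (none of which are Optimal Plus specifics): a model trained on past welds estimates the risk that a part will fail the downstream visual inspection, and parts above a risk threshold are flagged so the MES can hold them.

```python
# Hypothetical predictive gate on the welder. Feature names, training data,
# the risk threshold, and the MES hook are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Assumed historical welds: [current_amps, voltage, weld_time_s]; 1 = bad weld
X_train = np.array([
    [210, 24.1, 0.30], [215, 24.3, 0.31], [190, 22.0, 0.25],
    [230, 25.5, 0.40], [205, 23.9, 0.29], [185, 21.5, 0.22],
])
y_train = np.array([0, 0, 1, 0, 0, 1])

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X_train, y_train)

def alert_mes(part_id: str, risk: float) -> None:
    # Placeholder for the real manufacturing execution system interface.
    print(f"MES alert: hold part {part_id}, predicted defect risk {risk:.2f}")

def check_weld(params, part_id: str, threshold: float = 0.5) -> bool:
    """Flag the part to the MES if the predicted defect risk is too high."""
    risk = model.predict_proba(np.array([params]))[0][1]
    if risk >= threshold:
        alert_mes(part_id, risk)
        return False
    return True

# A weld resembling the bad training examples is likely to be held.
check_weld([188, 21.8, 0.24], part_id="W-1042")
```

In practice the model would be trained on far more data and validated against the inspection images themselves; the sketch only shows where the prediction sits in the flow Baruch describes.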

This kind of analysis is particularly important in areas such as security, which are just beginning to utilize data analytics and AI. But because systems and use models vary so widely, the domain expertise required might be as narrow as an individual company's data flow, or even the behavior of a single chip or device.

“Our data comes with a lot of meta-data and semantics,” said Rupert Baines, CEO of UltraSoC. “If it is a bus monitor, then we know what the fields mean. So it is never just raw numbers, and our tools and analytics support those natively. And since we have a lot of tools that can identify deadlocks or contention, for example, they can derive causality and infer what is causing the problem. For those applications our tools are very generic—issues like contention, cache behavior, memory controllers. This is applicable for any SoC, and very powerful in problem solving. But we also see more focused domain expertise. Things like cybersecurity or functional safety are very specific to a particular application or a particular system. Often those are the more ‘in-life’ applications, where a lot depends not just on our customers’ needs, which is the SoC, but their customers as well, which means the system it is used in.”
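
To make "raw numbers plus semantics" concrete, here is a minimal sketch in Python. The packed record layout, field positions, and contention threshold are invented for illustration and are not UltraSoC's format; the point is only that once a monitor's fields are decoded, a simple rule can flag sustained contention instead of leaving an engineer staring at raw counters.

```python
# Hypothetical bus-monitor decode and contention rule (invented layout).
from dataclasses import dataclass

@dataclass
class BusMonitorSample:
    cycle: int           # timestamp in bus clock cycles
    master_id: int       # which bus master issued the transaction
    wait_cycles: int     # cycles the request stalled waiting for the bus

def decode(word: int) -> BusMonitorSample:
    """Decode a packed 32-bit monitor word (assumed field layout)."""
    return BusMonitorSample(
        cycle=(word >> 12) & 0xFFFFF,
        master_id=(word >> 8) & 0xF,
        wait_cycles=word & 0xFF,
    )

def flag_contention(samples, limit: int = 32, window: int = 4) -> set:
    """Report masters that stall longer than `limit` cycles `window` times in a row."""
    streaks, flagged = {}, set()
    for s in samples:
        streaks[s.master_id] = streaks.get(s.master_id, 0) + 1 if s.wait_cycles > limit else 0
        if streaks[s.master_id] >= window:
            flagged.add(s.master_id)
    return flagged

# Four consecutive stalled transactions from master 1 trip the rule.
raw = [0x0001_0140, 0x0002_0145, 0x0003_0150, 0x0004_0148]
print(flag_contention(map(decode, raw)))   # {1}
```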

That view is echoed in the security world, where the challenge is to develop tools and processes that can be scaled, but which also provide enough programmability to make them effective for a particular company or operation within a company.

“This is where the domain expertise comes in,” said Tejinder Singh, general manager for security solutions at Marvell. “These are really different products that cut across all use cases, from storage to wireless to AI to automotive. So within our company we have use cases we touch, and we talk to each other about those. As a company we know a lot about IoT, for example, and we can build a team of experts around that. But even within a vertical, such as banking, there are different requirements.”

New applications, new technology
Not all of this is obvious, though. Domain expertise doesn't exist yet in certain parts of 5G, for example, where most of the chips have been manufactured in limited batches and tested in ways that are not ready for volume manufacturing.

“The whole industry is in learning mode on 5G, especially on millimeter wave,” said Adrian Kwan, business development manager at Advantest. “In China and Japan, they have already started on sub-6GHz 5G infrastructure for line-of-sight communication. This requires ASICs to be a little more intelligent. They have to identify users in a cell and figure out how to manage with adaptive phase array antennas. There also are different regulations in each region and different service providers.”

This all takes time. Domain expertise is segmented across the supply chain, and sifting through data requires a process to be put in place. With new technologies, processes are still immature and data is changing almost perpetually.

“There is a lot of innovation around millimeter wave,” said David Hall, principal marketing manager at National Instruments. “The first components were tested with unsustainable manufacturing practices. The ATE equipment was not ready, so a lot of this was done with network analyzers for production test. There is a huge rush to have more sustainable, lower-cost manufacturing practices. The marketing need outpaced the technology here. We still need over-the-air testing for antennas at volume and scale. The other area where we’re seeing a problem is that there is a huge desire to do more modulation quality versus power/frequency test. If you want to improve certified quality, that will require new test techniques.”

Automotive is another area where processes and technologies are in flux. While companies prepare for increasingly automated driving, no one knows exactly how those technologies will evolve or which ones will win. There is a flood of data coming from automotive sensors, including streaming video, and there will be more as additional sensors and cameras are added into vehicles. But domain expertise in this area is largely derived from either the automotive mechanical side, or the semiconductor side.

“You’re certainly seeing players in the game you wouldn’t have seen five years ago,” said Rob Cappel, senior director of marketing at KLA. “There are people designing their own chips—Google, Apple, Amazon—that may not just be for cars. They’re looking at artificial intelligence. The ecosystem as we knew it five years ago is changing. With automotive, the ecosystem across the board, from the big players all the way down to the semiconductor fabs, all agree that quality and reliability are key.”

The challenge now is connecting those different worlds and leveraging the expertise of all of them.

“This still looks like a huge opportunity to bridge the knowledge of the semiconductor industry with the automotive industry,” said Burkhard Huhnke, vice president of automotive strategy at Synopsys. “Right now there is a gap of knowledge. It doesn’t exist because this is still a mechanically oriented industry. The key differentiation here will happen in software and simulation involving processors and wiring harnesses. You need to simulate whatever is in the car. And as everything goes electric, that will require high-voltage components, which are not well known in this market. It will be inverters and IGBTs. You will need to simulate transient characteristics of everything. This is the beginning of creating understanding. It might be a core competence of the future for car companies.”

Putting a value on expertise
While most industries ultimately consolidate around expertise, what’s new in the chip industry is the number of markets for technology that never existed before. Either the technology is brand new, which is the case for flexible or ingestible sensors for medical applications, or there are existing markets that have never used advanced technology, which is the case in automotive and industrial IoT applications.

And what’s changing from a business standpoint is that companies are having more success by focusing on specific markets than by trying to go horizontal, according to Wally Rhines, CEO Emeritus at Mentor, a Siemens Business. He noted that when Texas Instruments specialized in analog, its earnings rose sharply. Similarly, NXP more than doubled its automotive portfolio between 2015 and 2017, sending its earnings soaring.


Fig. 1: Texas Instruments benefited from specializing in analog. Source: Mentor/CapIQ


Fig. 2: Impact on earnings of NXP’s specialization on automotive. Source: Mentor/CapIQ

These trends are not anomalies, but they aren't always easy to replicate because expertise isn't always available. What is clear is that data, and even the analytical capabilities to slice that data into trends, have limited value on their own. Combined with domain expertise, however, they can have a big impact on technology, companies and overall markets.

Adding domain expertise to good data and data analytics is a winning formula, and that is being proven in multiple markets and technologies that have no apparent connection to each other.

Related Stories
Using Sensor Data To Improve Yield And Uptime
Deeper understanding of equipment behavior and market needs will have broad impact across the semiconductor supply chain.
Dirty Data: Is The Sensor Malfunctioning?
Why sensor data needs to be cleaned, and why that has broad implications for every aspect of system design.
Pushing AI Into The Mainstream
Why data scrubbing and social issues could limit the speed of adoption and the usefulness of this technology.
IoT Merging Into Data-Driven Design
Emphasis on processing at the edge adds confusion to the IoT model as the amount of data explodes.


