Before the industry begins shifting to a data-driven approach, ground rules need to be established.
The push toward data-driven design, debug, manufacturing and reliability holds huge promise, but the big risk is that none of this will happen in an organized fashion and everyone will be frustrated.
One of the clear messages coming out of DVCon this week is that standards need to be established for data. Even within large chipmakers and systems companies, the data extracted from tools is not consistent. Moreover, given past experience with competing standards in chip design (think UPF versus CPF, or VMM versus OVM), these companies are concerned that data from the tools will not be presented in a consistent format.
No one disputes that the future of system design will rely heavily on data. There are so many processing elements, memories, and internal and external connections that the only way to deal with all of these pieces is to fully characterize them according to their electrical and physical properties, the software that will run on them, and their expected use cases and stresses. There is far too much data involved to “wing it” based on past experience, and no spreadsheet is long enough or wide enough to map out all of the potential interactions.
It gets worse as AI, security and reliability requirements are layered across these designs. Each adds its own voluminous amount of data. AI is being used inside chips to prioritize the flow of data, but chips also are being developed to speed up various AI operations. Those chips may be part of a multi-die package, where the data generated by each die falls into its own distribution. But one distribution cannot be meaningfully compared with another unless the underlying data is standardized.
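To make that concrete, consider a minimal sketch of the problem, with field names and units invented purely for illustration: two hypothetical tools report the same leakage metric under different keys and scales, and their distributions only become comparable once both are mapped onto a common schema.

```python
from statistics import mean

# Hypothetical raw records from two different tools. The field names and
# units are invented for illustration -- tool A reports microamps, while
# tool B reports nanoamps under a different key.
tool_a = [{"leakage_uA": 1.2}, {"leakage_uA": 1.5}, {"leakage_uA": 1.1}]
tool_b = [{"iddq_nA": 1300.0}, {"iddq_nA": 1450.0}, {"iddq_nA": 1250.0}]

def normalize(record: dict) -> dict:
    """Map tool-specific fields onto one agreed schema (amps)."""
    if "leakage_uA" in record:
        return {"leakage_A": record["leakage_uA"] * 1e-6}
    if "iddq_nA" in record:
        return {"leakage_A": record["iddq_nA"] * 1e-9}
    raise ValueError(f"unrecognized record: {record}")

# Only after normalization do the two distributions share a scale
# and become directly comparable.
dist_a = [normalize(r)["leakage_A"] for r in tool_a]
dist_b = [normalize(r)["leakage_A"] for r in tool_b]
print(mean(dist_a), mean(dist_b))
```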
The vision of data-driven design and manufacturing is that data can move up and down the design-through-manufacturing flow, so that a problem in one area can be traced back to its source. This would have a big impact on system design costs, because reliability issues (including security vulnerabilities) could be pinpointed and eliminated rather than patched. In the automotive, medical and industrial markets this would be a huge benefit, because it would sharply reduce expensive recalls and/or lawsuits, and it would help vendors maintain brand loyalty.
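What that traceability could look like at the data level is still an open question. The sketch below is purely hypothetical, not an existing standard: it assumes each record carries a provenance link to the upstream record that produced it, so a field failure can be walked back, hop by hop, to the design stage.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TraceRecord:
    """Hypothetical provenance record linking a measurement to its source.

    'stage' might be design, verification, manufacturing, or in-field;
    'upstream_id' points at the record one step earlier in the flow.
    """
    record_id: str
    stage: str
    metric: str
    value: float
    upstream_id: Optional[str] = None

# A small, invented flow: a field failure with links back to its origin.
records = {
    "d1": TraceRecord("d1", "design", "timing_margin_ns", 0.05),
    "m1": TraceRecord("m1", "manufacturing", "vth_shift_mV", 12.0, upstream_id="d1"),
    "f1": TraceRecord("f1", "in_field", "failure_rate_ppm", 40.0, upstream_id="m1"),
}

def trace_to_source(record_id: str) -> list[str]:
    """Walk upstream links from a failure back to its origin."""
    chain = []
    current: Optional[str] = record_id
    while current is not None:
        chain.append(current)
        current = records[current].upstream_id
    return chain

print(trace_to_source("f1"))  # ['f1', 'm1', 'd1']
```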
Taken together, all of this creates a huge opportunity for chipmakers and tools vendors. Bug-free designs can command a premium. But it also presents a challenge, because a free flow of data means that new tools from startups should be able to plug into existing flows. In reality, this is unlikely, given that large suppliers will have much more tightly integrated tools, allowing users to zoom in and out on complex designs without losing their place.
Regardless, no one benefits if the data is wrong, incomplete or inconsistent. And this is where the design industry really needs to come together and develop a foundation that applies across all tools in all markets. Standardized data formats are a first step. After that, let the competitive wars continue.
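What a standardized format would minimally require is for the industry to decide, but a sketch helps make the point. Every field name below is an assumption for illustration, not a proposal from any standards body: the idea is simply that every tool, from every vendor, emits the same agreed-upon fields.

```python
import json

# Purely illustrative: the minimum a tool-agnostic record might require.
# None of these field names come from an actual standard.
REQUIRED_FIELDS = {"tool", "tool_version", "design_id", "metric",
                   "value", "unit", "timestamp"}

def validate(record: dict) -> bool:
    """Reject records missing any agreed-upon field."""
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        print(f"rejected, missing fields: {sorted(missing)}")
        return False
    return True

sample = json.loads("""{
    "tool": "simulator_x", "tool_version": "3.1",
    "design_id": "soc_rev_b", "metric": "power",
    "value": 412.7, "unit": "mW", "timestamp": "2020-03-05T10:00:00Z"
}""")
print(validate(sample))  # True
```

Even a foundation this thin would let data from competing tools land in one database and be compared directly, which is the point of standardizing before competing.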