Many Chiplet Challenges Ahead

Assembling systems from physical IP is gaining mindshare, but there are technical, business and logistical issues that need to be resolved before this will work.


Over the past couple of months, Semiconductor Engineering has looked into several aspects of 2.5D and 3D system design, the emerging standards and steps that the industry is taking to make this more broadly adopted. This final article focuses on the potential problems and what remains to be addressed before the technology becomes sustainable to the mass market.

Advanced packaging is seen as the most likely path to keep Moore’s Law going, but there are several challenges that have yet to be overcome — and several lurking pitfalls for the unaware. Some of these are technical, some are business-related, and others exist simply because the necessary skills are distributed across the industry today, meaning that knowledge gaps can be created by silos.

Much of the emphasis so far has been on functionality, including how designs should be partitioned and how to logically stitch them back together. The bulk of this work has shown up in products from several vertically integrated companies, which often have applied their traditional tool chain when conducting the necessary analysis. But that only works when you have complete access to all pieces of the design, which is why IDMs have been the first out of the gate with chiplets.

“It goes way beyond simply the functional,” says Marc Swinnen, director of product marketing at Ansys. “There has to be analysis across the entire system for electromagnetic, thermal, signal integrity, warpage and mechanical stress. They all have to be analyzed in concert, and this will drive a significant change in the EDA design flow.”

There is a flurry of development activity in the packaging area. “Chiplets are a way of logically partitioning your system, versus packaging, as a way of physically putting things together,” says John Park, product management group director for IC packaging and cross-platform solutions at Cadence. “They are different in my mind. As soon as you go to bumpless stacking, today’s chiplets won’t work because they use microbumps. You can disassociate the world of chiplets versus the future of 3D stacking. They are loosely coupled, but they’re not the same thing.”


Fig. 1: Chiplet-based system using an interposer. Source: Cadence

Still, chiplets and 3D do share many common issues. “There are a lot of chip assembly formats, and none of them, to my knowledge, is supporting power or thermal modeling,” says Kenneth Larsen, director of product marketing for Synopsys. “This is essentially what you want when you begin to stack things on top of each other. Thermal is a tremendous problem, and so is power distribution — providing power to the chips.”

The industry is in full agreement on this point. “The killer problem for 3D-IC is power dissipation,” says Ansys’ Swinnen. “Temperature analysis often has been an afterthought in the chip industry, even though it has been more prevalent in the system- and board-level design. It is now becoming front and center for 3D designers. You need much more sophisticated thermal analysis. Boundary conditions for temperature, electrical power set points for the power distribution network, which is very complicated in these big structures — all of these have to be simulated together to converge on a solution.”
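
To see why these analyses have to converge together, consider a minimal, illustrative sketch of the feedback loop involved (written here in Python, with made-up coefficients rather than silicon data): leakage power rises with temperature, and temperature rises with power, so the analysis must iterate to a self-consistent operating point.

```python
# Minimal sketch of a coupled power/thermal fixed-point iteration for one die
# in a stack. All coefficients are illustrative, not silicon data.

def leakage_power(dynamic_w, temp_c, leak_ref_w=2.0, ref_temp_c=25.0):
    """Total power with a leakage term that grows with temperature (toy model)."""
    return dynamic_w + leak_ref_w * 2 ** ((temp_c - ref_temp_c) / 20.0)

def die_temperature(total_power_w, ambient_c=45.0, theta_ja_c_per_w=0.3):
    """Lumped thermal model: junction temperature = ambient + power * thermal resistance."""
    return ambient_c + total_power_w * theta_ja_c_per_w

dynamic_w, temp_c = 60.0, 45.0
for _ in range(100):                      # iterate power and temperature together
    power_w = leakage_power(dynamic_w, temp_c)
    new_temp_c = die_temperature(power_w)
    if abs(new_temp_c - temp_c) < 0.01:   # converged on a self-consistent point
        break
    temp_c = new_temp_c

print(f"Self-consistent operating point: {power_w:.1f} W at {temp_c:.1f} °C")
```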

Thermal analysis is just the beginning, too. “Once you have done the power and thermal analysis, you can assess the differential expansion on these elements,” adds Swinnen. “That raises questions about warpage, thermal expansion, and the mechanical integrity of these designs.”
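
A back-of-envelope calculation shows why differential expansion is hard to ignore. The numbers below use typical published CTE values for silicon and organic laminate, with an assumed geometry and temperature rise, purely for illustration:

```python
# Back-of-envelope differential thermal expansion between a silicon die and an
# organic substrate. CTE values are typical published figures; the geometry and
# temperature rise are assumptions for illustration.

cte_si = 2.6e-6          # silicon CTE, 1/°C
cte_laminate = 17e-6     # organic laminate CTE, 1/°C
span_mm = 20.0           # lateral span of the assembly, mm
delta_t_c = 60.0         # temperature rise during assembly/operation, °C

mismatch_um = (cte_laminate - cte_si) * delta_t_c * span_mm * 1000
print(f"Differential expansion over {span_mm} mm: {mismatch_um:.1f} µm")
# Roughly 17 µm of shear across the joints, which is what drives warpage and bump stress.
```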

For existing chip development teams, there are unknown unknowns. “These are what cause some degree of trepidation for back-end engineers, physical design engineers, or process engineers,” says Rob Mains, executive director for the CHIPS Alliance. “Architects will come up with ideas that may sound good on paper, but which may not be practical. When you start bringing in your back-end people, including the physical design team, the electrical analysis team, or the process engineering and packaging teams — that’s when these concerns start to come out.”

There are several issues related to putting chiplets together. One of them is just about physical size. “The size of the bits and pieces is an issue,” says Michael Frank, fellow and system architect at Arteris IP. “It is perhaps less of an issue with chiplets or 2.5D, where things are mounted on a substrate, but it adds additional challenges for 3D. We are no longer dealing with gravel. It is grains of sand, or even dust specks. It’s more robust to build boards.”

Handling is more than just dimensional. “The Optical Internetworking Forum (OIF) is looking at some of these issues,” says Manmeet Walia, senior product manager for high-speed SerDes at Synopsys. “They are defining the electrostatic discharge (ESD) standards, as an example. It is a lot lighter, with a lot fewer protections than exist for the chip-to-chip environment.”

The sensitivity of the chiplets requires special handling. “In transport they have to be carried in dry nitrogen to keep them protected,” says Arteris’ Frank. “Everything is at a different scale, but someone who builds millions of a system might be able to deal with it. Look at the existing mechanisms to protect chips. If you have a chip in 5nm, you cannot come anywhere near it with even 3V because the oxide can’t sustain that. Plus, the ESD protection in these chips is only functional if the thing is powered up. When it is not powered up, it is completely susceptible to ESD.”

Another much-discussed, but as-yet-unresolved, issue is test. How do you ensure the dies are good before assembly, and how do you test that the complete assembly is also good? “Built-in self-test (BiST) has always been an important requirement for SoCs, and the capability of self-testing is crucial for chiplets to be usable in the system,” says Ashraf Takla, president and CEO for Mixel. “Testing the chiplet at the wafer level, and after assembly, is mandatory. These are similar to the test requirements for other applications, such as automotive and medical.”

Chiplets require a different test methodology. “IP has to come with heavy tests and diagnostic features,” says Walia. “They have to, because nothing is coming outside the die. All the testing needs to be accomplished within the die. Even when we look at our own test chips, our own packaged test chips, nothing is coming out on the connector. Everything is done inside the die. They come with a lot of test features, and not just your standard BiST and loopbacks. There are a lot of different ways we need to stress test these IPs, like sweeping the references and voltages and non-destructive eyes. On top of the test and the known-good-die issues, we also have to build in redundancy, because once you build a full package with an interposer or organic substrate, each one is $100. And even when you’re putting these packages together, there are wires that can get broken in that process.”
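
The kind of margin sweep Walia describes can be pictured as a small harness that steps supply and reference voltages and reads an eye measurement back from an on-die monitor. The sketch below is hypothetical (`measure_eye_width_ui` stands in for a vendor-specific BiST readout, not a real API), but it shows the general shape of such a test.

```python
# Hypothetical margin-sweep harness: step supply and reference voltages and read
# an eye width back from an on-die monitor. measure_eye_width_ui is a stand-in
# for a vendor-specific BiST readout, not a real API.

import itertools

def measure_eye_width_ui(vdd_v: float, vref_v: float) -> float:
    """Placeholder for a non-destructive, on-die eye measurement (toy model)."""
    # The eye shrinks as the supply droops or the receiver reference drifts.
    return max(0.0, 0.7 - 2.0 * abs(vref_v - 0.40) - 1.5 * max(0.0, 0.75 - vdd_v))

supplies_v = [0.72, 0.75, 0.78]      # VDD corners to sweep
references_v = [0.35, 0.40, 0.45]    # receiver reference voltages to sweep

for vdd, vref in itertools.product(supplies_v, references_v):
    eye_ui = measure_eye_width_ui(vdd, vref)
    status = "PASS" if eye_ui >= 0.3 else "FAIL"   # 0.3 UI chosen as an example limit
    print(f"VDD={vdd:.2f} V  VREF={vref:.2f} V  eye={eye_ui:.2f} UI  {status}")
```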

In addition to industry solutions, DARPA programs attempt to address some of these issues. “The State-of-the-Art Heterogeneous Integration Prototype (SHIP) program includes being able to create all of the technologies that are necessary for packaging and testing,” says Jose Alvarez, senior director in the CTO Office for the Programmable Solutions Group at Intel. “Then there is Rapid Assured Microelectronics Prototypes (RAMP), which is looking to advance microelectronics physical back-end design methods. This sends a clear message that the U.S. government is very much encouraging the domestic semiconductor industry to get to the level that can be sustainable long term.”

Changes to IP
Chiplets certainly create new challenges, but they also provide new opportunities for the IP industry. “Since 2.5D chiplets do not fundamentally change the nature of most component IP, like CPUs, GPUs or NPUs, there’s no change to design or verification methodologies for this IP,” says Peter Greenhalgh, fellow and vice president of technology at Arm. “For coherent interconnect design and verification, some additional steps are needed to ensure scalability to a chiplet environment, but it’s not significant. As the industry moves to 3D integration, there are further opportunities to partition IP across dies.”

Additional models are required to make this work. “You need to be able to send a chiplet to some other company and have it sufficiently described so they can integrate it intimately into their design,” says Swinnen. “At the same time, you need to preserve your IP. This is not so different from IP that is being sold today, where you also have NDAs and proprietary issues that can be resolved. There are technical issues and some legal sides. And there have to be technical standards that people can create a marketplace around.”

Those marketplaces do not exist yet. “What type of information should be in a chiplet catalog?” asks Chris Ortiz, principal application engineer at Ansys. “Is there a standard? What information needs to be there for you to do thermal analysis, or power analysis? This is necessary information that can help people make decisions about what type of packaging they can use. If it is fairly simple, they may just need to look at thermal and conclude they can use some fairly inexpensive package. Or, will you need to go with a more expensive CoWoS (TSMC’s Chip-on-Wafer-on-Substrate) type of design, or silicon interposer type of design?”
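
No such catalog schema exists yet, which is Ortiz’s point, but a hypothetical entry might look something like the sketch below. Every field name and value here is an assumption, chosen only to show the kind of data an integrator would need for an early thermal and power feasibility check.

```python
# Sketch of what a machine-readable chiplet catalog entry might carry so an
# integrator can run early thermal and power feasibility checks. All field names
# and values are hypothetical; no such standard schema exists today.

from dataclasses import dataclass, field

@dataclass
class ChipletCatalogEntry:
    name: str
    process_node_nm: int
    footprint_mm: tuple           # (width, height)
    max_power_w: float            # worst-case total power
    power_map_w: list             # coarse per-region power grid for thermal analysis
    theta_jc_c_per_w: float       # junction-to-case thermal resistance
    max_junction_c: float         # maximum allowed junction temperature
    d2d_interface: str            # die-to-die interface, e.g. "BoW"
    supply_rails_v: list = field(default_factory=list)

entry = ChipletCatalogEntry(
    name="example-io-chiplet",
    process_node_nm=7,
    footprint_mm=(4.0, 6.0),
    max_power_w=3.5,
    power_map_w=[[0.4, 0.6], [0.9, 1.6]],
    theta_jc_c_per_w=1.2,
    max_junction_c=105.0,
    d2d_interface="BoW",
    supply_rails_v=[0.75, 1.8],
)

# A first-pass check: does an inexpensive package keep the die within its limit?
ambient_c, theta_ja_c_per_w = 45.0, 12.0   # illustrative package-plus-board resistance
junction_c = ambient_c + entry.max_power_w * theta_ja_c_per_w
print(f"Estimated junction: {junction_c:.0f} °C "
      f"({'OK' if junction_c <= entry.max_junction_c else 'needs a better package'})")
```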

When fabrication and handling issues are coupled with IP, it is not even clear who the IP companies will be. “It could be design companies who are already selling chips,” says Wendy Wu, director of product marketing for the Cadence IP Group. “These are our customers today for IP, and now I’m starting to see them looking into the chiplet segment. They are asking for some of our interface IP, and then to combine that with their core IP. They could buy a design from an IP company, then manufacture it, test it, and maintain inventory. That might be a pretty workable model.”

Not only are standards required, but standardization of parts also may be necessary. “The OCP ODSA group is an industry-wide collaboration working on developing standards to drive interoperability of chiplets from independent vendors,” says Tony Mastroianni, advanced packaging solutions director for Siemens EDA. “They have established a Chiplet Design Exchange (CDX) working group to focus on standardizing chiplet models, implementation work flows and test methodology. The CDX working group is actively working on these standards, but it will take time to solidify the standards and provide design and test flows, and then adoption by the chiplet providers.”

Business models
The business model for chiplets is highly dependent on market size. “You need an ecosystem and the infrastructure to provide the individual chips,” says Frank. “Will that market be big enough? Think about doing substrates, building chips – the cost is prohibitive, especially in the advanced technologies.”

That is made more difficult by having multiple interface standards emerging. “If you design something based on, for example, Bunch-of-Wires (BoW), then it’s not usable in any other context,” says Swinnen. “You can’t take that same chiplet, package it in a regular package, and sell it in the normal market, because it only works in that chiplet context. You really have to design your chip for a specific market, which raises the chicken-and-egg problem. Who’s going to design their chip that way unless there’s a market? And who’s going to build a marketplace unless the chips are available?”

Large companies want to make this happen. “We need to build a larger ecosystem for this to really take over the industry, and that’s what we’re interested in,” says Intel’s Alvarez. “That’s why we are interested in open source. That’s why we’re interested in working with the CHIPS Alliance.”

Tools and flows
The tools and methodologies being built today are principally cobbled out of existing capabilities. “In addition to standardized models and test methods and an established chiplet ecosystem, EDA vendors will need to provide more comprehensive, integrated design flow solutions to enable the broader design community,” says Siemens’ Mastroianni. “This will include the integration of system-level design and verification, advanced package design and analysis, IC design and analysis, and DFT and test tools, methodology and infrastructure. It is unlikely that a single EDA vendor can provide best-in-class solutions for all of these technologies, so an open, configurable approach will likely prevail. This will be a daunting challenge, and the facilitation of a broad-based 3D solution will be even more challenging.”

Swinnen agrees. “I don’t think the industry is going to get there by taking existing tools and bolting on some additions and expecting it’s going to handle 3D-IC. It’s the next inflection point in the electronic design automation market. We’ve had inflection points in the past, such as the inclusion of IP. We’ve had finFET as a technological inflection point. 3D-IC design is the next inflection point, and it’s coming to a project near you soon.”

There may be mindset changes required here, as well. “How does the EDA industry address larger concepts in not just 3D, but also this disaggregated environment?” asks Alvarez. “How does it enable designs that are built in a much more agile and flexible manner? How does it enable far better time to market than we do today?”

On top of that, the tools have to be able to deal with a larger context. “One of the problems is capacity,” says Swinnen. “Whereas some of these very large chips that we have today are very compute-intensive to analyze, now you’re going to put three or four of them on top of each other, plus an interposer, and analyze the whole thing together. That raises the capacity problem even further. We’re talking about electromagnetic effects, which are often non-local, like the guard rings around a block or the coupling around the die. They all have these non-local electromagnetic effects. Signal integrity becomes much more complicated. It is not just the chip, but the interposer, TSV, package — it all has to be integrated together.”

As often happens, abstraction may be the way forward both for tools and IP. “The scale of these systems requires some level of abstraction, especially when you look at thermal,” adds Swinnen. “For 3D package analysis, you don’t need to know thermal data for every single gate, but you do need to know how the chip reacts and which regions of the chip are getting warm. That brings up reduced-order models (ROMs). These are definitely part of the complete picture for analyzing these and exchanging data between IP provider and integrator.”
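
A thermal ROM of the sort Swinnen describes can be as coarse as a handful of lumped nodes, each with a thermal capacitance and resistances to its neighbors and to ambient. The sketch below uses two illustrative regions with made-up values; a real ROM would be extracted from detailed die data by the IP provider.

```python
# Minimal reduced-order thermal model (ROM): each die region is a lumped node
# with a thermal capacitance and resistances to its neighbor and to ambient.
# All values are illustrative; a real ROM would be extracted from die data.

import numpy as np

C = np.array([0.020, 0.015])     # thermal capacitance per region, J/°C
R_amb = np.array([8.0, 10.0])    # resistance to ambient, °C/W
R_couple = 4.0                   # resistance between the two regions, °C/W
P = np.array([2.5, 0.8])         # power injected into each region, W
ambient_c = 45.0

T = np.full(2, ambient_c)        # start both regions at ambient
dt = 0.01                        # time step, s
for _ in range(5000):            # explicit time-stepping to steady state
    flow = (T[0] - T[1]) / R_couple
    q = P - (T - ambient_c) / R_amb - np.array([flow, -flow])
    T = T + dt * q / C

print(f"Steady-state region temperatures: {T.round(1)} °C")   # roughly [60.6, 58.5]
```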

Conclusion
While there are many hurdles getting to a full, commercially viable market for chiplets, and system design that can fully utilize them, a significant portion of the industry wants to see it happen. The creation of the soft IP market was not easy, but it completely transformed the industry. The same could be true for physical IP in the form of chiplets. None of the challenges is insurmountable, and some of the barriers already are coming down.

But how long it will be before chiplets are an integral part of the industry is not clear. We can expect to see significant announcements within a few years, although it is not certain they will push chiplets to become the dominant way to create systems. Some influential players believe in the concept, but many more are sitting on the fence today.

Related
Designing 2.5D Systems
Connecting dies using an interposer requires new and modified processes, as well as organizational changes.
Waiting For Chiplet Standards
An ecosystem is required to make chiplets a viable strategy for long-term success, and ecosystems are built around standards. Those standards are beginning to emerge today.
Chiplets For The Masses
Chiplets are technically and commercially viable, but not yet accessible to the majority of the market. How does the ecosystem get established?
