Signal Integrity Plays Increasingly Critical Role In Chiplet Design

Chiplet design engineers face complex new considerations beyond those familiar from PCB design.

Maintaining the quality and reliability of electrical signals as they travel through interconnects is proving to be much more challenging with chiplets and advanced packaging than in monolithic SoCs and PCBs.

Signal integrity is a fundamental requirement for all chips and systems, but it becomes more difficult with chiplets due to reflections, loss, crosstalk, process variation, and various types of noise and physical effects. Electrical signals need to arrive at their destination at the right time, with the right waveform shape, and with consistent voltage levels. This was hard enough in monolithic chips, but it takes on a whole new dimension in advanced packages.

While signal integrity analysis has been around for decades, it can become unwieldy very quickly with chiplets due to a substantial increase in the number of die-to-die connections. Other factors to consider include impedance matching, signal attenuation, and timing constraints to ensure reliable communication between chiplets. These issues become harder to manage as data rates rise, feature sizes continue to shrink, and chiplets developed at different nodes and in different sizes are assembled onto a substrate, often in a custom configuration inside an advanced package.

“Chiplet-based semiconductor design, or system-in-package, presents unique challenges compared to traditional monolithic designs, particularly in signal integrity,” noted Mayank Bhatnagar, product marketing director of SSG at Cadence. “While signals in monolithic designs are shorter and more predictable, chiplet-based designs have inter-die connections across substrates or advanced packaging like interposers and bridges. These longer paths crossing material boundaries introduce impedance mismatches, signal degradation, and crosstalk. Power delivery, which is centralized in monolithic designs, also becomes more complex, with multiple chiplets requiring careful noise mitigation.”
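The impedance mismatches Bhatnagar describes can be quantified with the classic reflection coefficient. The sketch below is a minimal illustration in Python; the 50-ohm and 40-ohm values are hypothetical round numbers chosen only to stand in for a trace crossing a material boundary, not figures from any specific process.

```python
# Illustrative sketch: reflection at an impedance discontinuity, such as a
# signal crossing from an interposer trace to a package trace. Impedance
# values are hypothetical examples, not from any specific technology.

def reflection_coefficient(z_source: float, z_load: float) -> float:
    """Gamma = (Z_load - Z_source) / (Z_load + Z_source)."""
    return (z_load - z_source) / (z_load + z_source)

# A 50-ohm trace meeting a 40-ohm trace at a material boundary.
gamma = reflection_coefficient(50.0, 40.0)
reflected_power_fraction = gamma ** 2  # fraction of incident power reflected

print(f"reflection coefficient: {gamma:.4f}")
print(f"fraction of power reflected: {reflected_power_fraction:.3%}")
```

Even this modest 20% impedance step reflects about 1.2% of the signal power back toward the driver, and every additional boundary along a die-to-die path adds its own reflection.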

While advanced packages filled with chiplets have more area to work with than a monolithic SoC, that doesn’t solve the signal integrity problem. “On one hand, in chiplets, all the traces are much more tightly packed, which results in significantly more crosstalk,” said Andy Heinig, head of Efficient Electronics in Fraunhofer IIS’ Engineering of Adaptive Systems Division. “On the other hand, more space is allocated for the power supply, leaving less room for the signal traces.”

Signal integrity for chiplets differs in two main ways from monolithic chips. “First, the interface itself will drive requirements for the die-to-die (D2D) interface that needs to be tailored for that interface,” said Javier DeLaCruz, fellow and senior director of system integration and development at Arm. “Second, the addition of the interposer and additional package layers for the die-to-die interface will have an impact on the non-die-to-die signals, and they will need to traverse the interposer and additional package layers.”

In fact, ensuring signal integrity is one of the biggest challenges for 3D-ICs. “When you go off-chip, these are extremely high-speed connections,” said Marc Swinnen, director of product marketing at Ansys. “These are the SerDes that connect the chips together, and these channels — even though they’re nominally sending digital data back and forth between each other — are very much analog circuits. And at the extreme speeds that they run, they need full electromagnetic (EM) modeling to calculate them. This means it’s not just RC (resistance/capacitance), but resistance, inductance, and capacitance (RLC). And there’s mutual inductance as well as self-inductance, so you need full EM modeling of the signals on the interposer because of the high speed. RF designers have been familiar with this for years, but for most digital designers, this is a new concept. They have to get into EM, and it’s analog.”

TSVs and bumps need to be modeled for chiplets, as well. “At lower speeds, these bumps and TSVs are just a resistance or capacitance,” said Swinnen. “But at higher speeds, the EM modeling is needed. TSMC had to develop an RLC model for the through-silicon vias in its N3 process, which means even the vertical connections have to be electromagnetically modeled for that. That’s the big difference, along with capacity now, because you’ve got to identify your path — leaving one chip and possibly hitting multiple chips, going through the interposer, and even to the package if it’s an I/O signal. You need to be able to model the entire signal path, beginning-to-end, across multiple chips and an interposer. That is the major issue with electromagnetics and signal integrity. Then, for the rest of the path, you have to do the good old-fashioned RC for the lower-speed or on-chip communication.”
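The shift from RC-only to RLC modeling that Swinnen describes can be illustrated with a toy lumped model. The R, L, and C values below are hypothetical order-of-magnitude numbers for a TSV, chosen only for illustration, not actual foundry data.

```python
import math

# Sketch of why vertical interconnects (TSVs, bumps) need RLC rather than
# RC-only models at high speed. Hypothetical order-of-magnitude values:
R = 0.05      # ohms, series resistance
L = 40e-12    # henries, series inductance (~40 pH)
C = 30e-15    # farads, shunt capacitance (~30 fF)

def series_impedance(freq_hz: float) -> complex:
    """Series branch of a lumped RLC TSV model: Z = R + j*omega*L."""
    omega = 2 * math.pi * freq_hz
    return complex(R, omega * L)

# At low frequency the TSV looks purely resistive; at SerDes frequencies
# the inductive reactance dwarfs the resistance.
for f in (1e8, 1e9, 16e9):
    z = series_impedance(f)
    print(f"{f/1e9:6.1f} GHz: |Z| = {abs(z):.3f} ohm (j{z.imag:.3f})")

# Self-resonant frequency of the lumped model, where L and C exchange energy.
f_res = 1 / (2 * math.pi * math.sqrt(L * C))
print(f"self-resonance near {f_res/1e9:.0f} GHz")
```

With these numbers, the inductive reactance at 16 GHz is roughly two orders of magnitude larger than the DC resistance, which is why an RC-only model of the vertical path breaks down at SerDes speeds.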

EM analysis is typically more complicated than RC extraction, but it’s also more limited in capacity. “EM analysis can do thousands of signals, but not millions or tens of millions, as RC extractors can do,” he said. “This means you typically have to isolate only those signals that are of interest for the high-speed communication, and then stitch in any other lower-speed signals, or just analyze the communication channel on its own. The HBM communication channel is a typical example. When you’re talking from a chiplet to the HBM, there are all the signal lines, and we’re talking 1,024 bits at the same time. That’s quite high bandwidth, and it’s interleaved with power and ground for shielding, so it becomes a complicated physical and electrical channel that you need to simulate the entire electromagnetics for.”
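To get a rough sense of why a 1,024-bit channel like the one Swinnen describes warrants full EM treatment, a back-of-the-envelope bandwidth calculation helps. The 6.4 Gb/s per-pin rate below is an assumed example figure, not tied to any particular HBM generation.

```python
# Back-of-the-envelope aggregate bandwidth of a wide parallel HBM-style
# channel: 1,024 data bits switching together. The per-pin data rate is a
# hypothetical example value for illustration.

bits_per_transfer = 1024
per_pin_gbps = 6.4  # assumed per-pin data rate, Gb/s

total_gbps = bits_per_transfer * per_pin_gbps
total_gbytes_per_s = total_gbps / 8
print(f"aggregate bandwidth: {total_gbps:.1f} Gb/s = {total_gbytes_per_s:.1f} GB/s")
```

With hundreds of gigabytes per second funneled through a few square millimeters of interposer, every trace, shield, and return path in that channel affects the margin, which is why the channel is simulated as one electromagnetic structure rather than wire by wire.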

PCB similarities and differences
Many of the multi-die and chiplet issues confronting chip design teams today are similar to what they saw on a PCB. Even some of the language is the same. But those issues are becoming much more complex and more difficult to solve with multiple chiplets in an advanced package.

“PCB designs have had to integrate memory and CPUs, and all kinds of individual components together, and the tools used for PCB design enable you to do that analysis at the speeds at which they are required,” said Keith Lanier, technical product management director at Synopsys. “The dimensions, of course, are much bigger for PCBs. So it’s really a scale difference, and that’s where, as we go into the new direction of multi-die, as we start to look at the IC style of routing that’s required for interposers, or embedded bridges for wafer-to-wafer bonding — all of those interconnect types are not what you have on PCB designs. There are C4 bumps and things like that, and you still have to deal with those. But you also have to deal with signals that could come out to the real world and connect all the way out to a PCB, and then into a bigger system, as well as all the interconnects that go from die-to-die. The whole chiplet ecosystem that people hope will become a reality in the future must have more standards. But the question is, ‘What tools are needed to look at the interconnects from a signal integrity standpoint, as well as design for power integrity, across these multi-die systems?’ You need to take into account that power drives thermal, and thermal can have lots of different impacts on those multi-die systems and chiplet designs.”

Analyzing all of this is a challenge. “Two things dictate the types of analysis that the design engineer needs to take into account with the physics for these new multi-die systems, compared to PCB,” Lanier said. “First, the design engineer has to look at the design feature sizes, and consider using a foundry process, as opposed to an OSAT or an organic substrate. The dimensions are much smaller, and yet the speeds are much higher, and those two things go together in terms of driving the need to do electromagnetic simulations in the IC domain, be able to use it for digital designs that require that type of analysis, and be able to use it for multi-die designs. These tools could be utilized on PCB design, and they could be utilized for classic packaging products, but multi-die and chiplet designs require even more capacity for the same accuracy. That’s another challenge. The multi-physics analysis is now much more complex in the sense of the scale of the size of design compared to the original PCB designs.”

The chiplet effect
Signal integrity challenges can vary widely by application, too. “In monolithic design, it used to be that signal integrity was done by a separate group of people on the PCB side, and they perfected that art,” said Subramanian Lalgudi, product specialist at Siemens Digital Industries Software. “There was a process as to how they wanted to sign off on compliance. Today for chiplets, there are different protocols — UCIe, MIPI, SATA. If you are a chip designer who is designing a transceiver, or if you are a board person like HP, or somebody else is designing the board, or if you are a repeater company trying to take that, amplify it, and send it to something, that process is clear. The standards have evolved as to the compliance required at the transmitter. But what is the compliance required at the repeater? What is the compliance required at the receiver, both for a serial standard and for a parallel standard? Serial is point-to-point. Parallel is basically the DDR applications, but the energy per bit was all pretty high in the PCB so they could tolerate it. It’s a bigger surface area and so on and so forth.”

When chips were monolithic, there were just proprietary considerations. “There was no standardization,” Lalgudi said. “The moment chiplets came up, they needed to do static timing analysis, which is a clock-to-clock task that makes sure all the bits arrive on time before it can latch on and go and do that. There is a set-up time. There is a hold time. This used to be called static timing analysis, but the moment they introduced chiplets, that changed. The chiplet guys or the producers may be different from the guys who integrate them. Intel and AMD already have shown that. Intel has taken FPGA designs, and they can mix and match stuff. They can go with the processor on one technology node, they can go with chiplets on older technology nodes. This is beneficial because now they can focus on what they’re really good at.”
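The setup/hold check at the heart of the static timing analysis Lalgudi mentions can be sketched in a few lines. All delay values below are hypothetical picosecond figures chosen for illustration.

```python
# Minimal sketch of a setup/hold slack check, the core operation of static
# timing analysis, applied here to a die-to-die data path. All numbers are
# hypothetical picosecond values.

def timing_slacks(arrival_min_ps: float, arrival_max_ps: float,
                  clock_period_ps: float, setup_ps: float,
                  hold_ps: float) -> tuple:
    """Setup slack: latest data arrival vs. (capture edge - setup time).
    Hold slack: earliest data arrival vs. required hold time after the edge.
    Positive slack on both means the path meets timing."""
    setup_slack = (clock_period_ps - setup_ps) - arrival_max_ps
    hold_slack = arrival_min_ps - hold_ps
    return setup_slack, hold_slack

# A 500 ps clock (2 GHz), data arriving between 80 ps and 430 ps,
# with a 50 ps setup and 60 ps hold requirement.
setup_slack, hold_slack = timing_slacks(80, 430, 500, 50, 60)
print(f"setup slack: {setup_slack} ps, hold slack: {hold_slack} ps")
```

The chiplet twist is that the arrival window (`arrival_min_ps` to `arrival_max_ps`) now spans two dies and an interconnect, so substrate-induced skew and cross-die process variation widen it, squeezing both slacks at once.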

The key to solving these challenges is breaking the problem into different levels. “If you’re in the early exploration stage, you may not have a full set of design rules,” said Synopsys’ Lanier. “But you still need to be able to make certain tradeoffs and do feasibility studies, exploration, and be able to understand how those chiplets can be used within the system and configured in a way that makes the die-to-die connections short enough to give you the speed to handle your workloads. You have to make sure the power delivery networks are designed to meet your power goals. In parallel with that, you’ve got to make sure that the thermal interface materials, all the different components — not just the die, but the actual assembly of those die together — can still meet the maximum temperature of the die, and the maximum operating temperature across the full spectrum and corners that you might face through your system. It’s really the complexity of designs and the speed of designs that are becoming more critical to making sure the tools that are available can support them.”

The industry is advancing in several areas in order to address these complexities and improve predictability. “Packaging technologies, such as silicon interposers and fan-out designs, are reducing signal loss and enhancing interconnect performance,” said Cadence’s Bhatnagar. “Standardized interfaces such as UCIe are streamlining inter-die communication, while machine learning is enabling faster SI analysis and predictive modeling. Improved materials and 3D integration with techniques such as hybrid bonding are further enhancing SI performance by reducing interconnect distances and losses. Also, next-generation EDA tools are integrating SI, PI, and thermal analysis into a unified framework, reducing iteration cycles and improving accuracy. And high-speed interfaces now incorporate advanced equalization techniques to mitigate losses in the package or interconnects. Such advancements in packaging, interconnect standards, and simulation technologies are steadily making these challenges more predictable and solvable.”

Still, there are more challenges to address. “Timing closure becomes harder as chiplet systems encounter significantly worse process variations across chiplets, as well as substrate-induced skew, in contrast to the unified timing domains of monolithic architectures,” Bhatnagar said. “Both approaches, however, also share common SI challenges, such as the need for robust simulation tools, material effects analysis, and reliability testing under process, temperature, and voltage variations. Tight integration of SI and PI is essential in both cases to manage noise impacts on signal performance.”

The writing is already on the wall for some of the existing methodologies. “So far, signal traces are often routed based on rules,” said Fraunhofer’s Heinig. “This creates much more confidence, but it uses many resources. This will no longer work, and in the future a continuous, unified planning of power and signal traces will be required throughout all design steps. This includes everything from system partitioning to layout, which will need to be supported by far more simulations.”

Conclusion
Tools and methodologies used for system-level signal integrity analysis in PCB design are now being applied to chiplet designs. The EDA industry has been doing signal integrity analysis for more than 40 years, but these techniques now must be applied at the chiplet level. Some of the tools and techniques used for signal integrity analysis include eye diagrams to visualize signal quality, electromagnetic simulations, static timing analysis for digital signals, and full-wave 3D EM modeling for high-speed interconnects. Here, signal integrity validation for chiplets involves analyzing the behavior of signals traveling between transmitters and receivers through interconnect channels, ensuring compliance with emerging chiplet interface standards.
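The eye-diagram idea mentioned above can be sketched numerically: fold a sampled waveform onto one unit interval (UI) and measure the vertical opening at the sampling point. The waveform below is synthetic (a simple smoothed NRZ signal plus Gaussian noise), standing in for a simulated or measured channel.

```python
import random

# Sketch of eye-diagram measurement: fold a waveform onto one UI and measure
# the vertical eye opening at the center sampling phase. The waveform is
# synthetic and purely illustrative.

random.seed(0)
UI_SAMPLES = 32   # samples per unit interval
BITS = 200

# Build a crude NRZ waveform with a finite edge rate and additive noise.
bits = [random.choice((0, 1)) for _ in range(BITS)]
wave = []
level = 0.0
for b in bits:
    target = 1.0 if b else -1.0
    for _ in range(UI_SAMPLES):
        level += 0.35 * (target - level)            # simple first-order edge
        wave.append(level + random.gauss(0, 0.02))  # channel/receiver noise

# Fold onto one UI: take every sample at the center phase and classify it.
center = UI_SAMPLES // 2
highs = [wave[i] for i in range(center, len(wave), UI_SAMPLES) if wave[i] > 0]
lows = [wave[i] for i in range(center, len(wave), UI_SAMPLES) if wave[i] <= 0]

# Vertical eye opening: worst-case high minus worst-case low at the center.
eye_opening = min(highs) - max(lows)
print(f"vertical eye opening at center: {eye_opening:.3f} (of 2.0 full swing)")
```

Real signal integrity tools do the same folding over millions of bits of a simulated or measured channel, then check the resulting opening against the mask defined by the interface standard.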

As the companies interested in developing a commercial chiplet market continue to progress, these issues will be worked out in stages. The starting point likely will leverage existing tools, while incrementally adding the features and capabilities to address signal, power, and thermal integrity in chiplets.

Related Reading
Chip Architectures Becoming Much More Complex With Chiplets
Options for how to build systems increase, but so do integration issues.
Chiplet Interconnects Add Power And Signal Integrity Issues
More choices, interactions, and tiny dimensions create huge headaches.


