Signal Integrity Through The Years

How what starts out as a little problem gets progressively worse with each new process generation, eventually requiring integrated tools to address it.


Yesterday, in my post How Technologies Get into EDA, I started to talk about how new technologies find their way into EDA tools over time. Let’s look at signal integrity as an example.

We used not to worry about signal integrity at all. The first time anything like that impinged on my consciousness was in the early 1980s, when we realized that we needed to start considering the inductance of package pins and the power supply effects of simultaneously switching outputs. Up until that point, speeds were slow and, with 5V power supplies, there was a lot of noise margin. But it was in the 1990s that we really started to worry about on-chip signal integrity, especially crosstalk between coupled interconnect.

New effects, like crosstalk, go through several phases.

At first, designers and EDA engineers are blissfully unaware that there is a problem, since the effect is too small to matter, as we all were about package pin inductance in the early 1980s. But with each process generation, designs get faster, distances get shorter, and voltages get lower. It no longer suffices to completely ignore the effect. Sometimes it is fairly easy to deal with by adding some margin, such as analyzing the design at a lower power supply voltage than nominal, taking account of the effect without having to do detailed analysis.
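
To make the margin approach concrete, here is a minimal sketch in Python. It is not how any particular tool does it: the 5% derate, the threshold voltage, and the alpha-power delay model are all illustrative assumptions. The idea is simply to verify timing at a pessimistic supply voltage rather than model the noise in detail.

```python
# Margin in place of detailed analysis: check timing at a derated supply.
# All constants here are illustrative assumptions, not real process data.

V_NOMINAL = 5.0   # nominal supply (V), as in the 5V era
DERATE = 0.05     # assumed 5% margin to absorb supply noise
V_TH = 0.7        # assumed device threshold voltage (V)
ALPHA = 1.3       # assumed velocity-saturation exponent

def gate_delay(d_nominal_ns: float, vdd: float) -> float:
    """Alpha-power-law scaling: delay ~ Vdd / (Vdd - Vth)^alpha."""
    scale = (vdd / V_NOMINAL) * ((V_NOMINAL - V_TH) / (vdd - V_TH)) ** ALPHA
    return d_nominal_ns * scale

v_derated = V_NOMINAL * (1.0 - DERATE)
print(f"1.000 ns path at {v_derated:.2f} V: {gate_delay(1.0, v_derated):.3f} ns")
```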

However, that approach doesn’t work forever, and the second-order effect becomes first-order. The first step is to provide a verification tool that identifies all the problem areas so that they can be fixed by hand. That typically only suffices for a process generation or two. When the verification tool identifies a few dozen problems to be fixed by hand, that is tractable. At the next process node, when it identifies a few thousand problems, it is no longer feasible to fix them all by hand. Just having verification tools is not enough. The tools that create the design need to be made aware of the issue so that they don’t create the problems in the first place: placement-aware synthesis, signal-integrity-aware routing.
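
As a sketch of what such a verification pass does (this is the general idea, not CadMOS’s actual algorithm), here is the simplest possible crosstalk check in Python: estimate the glitch each aggressor couples onto a victim using the charge-sharing approximation Vnoise ≈ Vdd·Cc/(Cc+Cg), and report every net that exceeds its noise margin. The net names, capacitances, and margin are all made up for illustration.

```python
# A minimal crosstalk verification pass: flag nets whose estimated
# coupled glitch exceeds the noise margin. Illustrative numbers only.

VDD = 1.8           # supply voltage (V)
NOISE_MARGIN = 0.3  # assumed tolerable glitch (V)

# (victim net, coupling cap Cc in fF, victim ground cap Cg in fF)
coupled_nets = [
    ("clk_div", 12.0, 40.0),
    ("data_bus[3]", 55.0, 60.0),
    ("reset_n", 5.0, 90.0),
]

for net, cc, cg in coupled_nets:
    v_noise = VDD * cc / (cc + cg)  # charge sharing between Cc and Cg
    if v_noise > NOISE_MARGIN:
        print(f"VIOLATION {net}: {v_noise:.2f} V glitch > {NOISE_MARGIN} V margin")
```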

In a subsequent phase, the effect often extends beyond the chip to affect the package, the board, connectors, and more. As speeds increase, everything starts to depend on everything else. New tools are required to handle the analysis at the system level, too.

CadMOS
When clock rates were low and power supply voltages were high, signal integrity effects like crosstalk were not a major worry (except in RF design, which always has to worry about everything). Even when we started to be aware that crosstalk was becoming an issue, it could be handled with simple rules, such as not routing two signals in parallel for too far across the chip. But a process node or two later, it became a requirement to be able to check designs for crosstalk problems.
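
A rule like that is easy to check mechanically. Here is a minimal sketch (the geometry, the track pitch, and the 500 um limit are all invented for illustration): for each pair of segments on adjacent routing tracks, measure how far they run in parallel and flag the pairs that exceed the limit.

```python
# The "don't route in parallel too far" rule as a mechanical check.
# All geometry and limits below are illustrative assumptions.

MAX_PARALLEL_UM = 500.0  # assumed rule limit
TRACK_PITCH_UM = 0.4     # assumed minimum track pitch on this layer

# (net, track y-coordinate, x-start, x-end), all in microns
segments = [
    ("a", 10.0, 0.0, 800.0),
    ("b", 10.4, 100.0, 700.0),
    ("c", 12.0, 0.0, 900.0),
]

for i, (n1, y1, s1, e1) in enumerate(segments):
    for n2, y2, s2, e2 in segments[i + 1:]:
        if abs(y1 - y2) <= TRACK_PITCH_UM:       # adjacent tracks
            overlap = min(e1, e2) - max(s1, s2)  # shared parallel span
            if overlap > MAX_PARALLEL_UM:
                print(f"{n1}/{n2}: {overlap:.0f} um parallel run exceeds rule")
```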

CadMOS was a startup company that created just such a tool. In fact, it had several tools for checking different aspects of signal integrity:

  • PacifIC, a full-chip noise analyzer
  • CeltIC, a cell-level noise analyzer
  • ArctIC, an ERC (electrical rule checker)
  • SeismIC, a substrate analysis tool

CadMOS had considerable success selling these tools to the most advanced design groups, working in the leading-edge process nodes where the problems were most acute. Cadence already had a signal integrity product called Assura SI, but CadMOS was the market leader. We added to our portfolio by acquiring them, and replaced Assura SI with PacifIC and CeltIC.

As EE Times reported at the time (2001):

“We felt clearly that noise and signal integrity is going to be the next big issue facing people trying to get deep-submicron designs out,” said Paul McLellan, vice president for Cadence’s custom IC product line. “We felt the fastest way for us to get that technology out was to acquire CadMOS.”

Hmm, that name is strangely familiar!

One challenge I’ve noticed with EDA acquisitions like this is that you need two engineering teams but only have one. CadMOS was around 25 people when we acquired them. Obviously, part of the reason we acquired them was that they had a strong, growing business selling standalone signal integrity analysis tools. But, as I said above, it eventually becomes necessary to make the other tools “signal-integrity-aware.” So you need a second engineering team to work on the integration roadmap. It is a big challenge to manage the transition so that both things happen: keeping the standalone business running without blowing it up, while getting the integration done.

That guy McLellan was already talking to EE Times about this later in the same article:

Other CadMOS technology will be integrated directly into the Cadence SP&R products, McLellan said. And even though the CadMOS tools are most often used after layout, Cadence will also deploy the technology before layout to help designers avoid signal-integrity problems, McLellan said.

Sigrity
Signal integrity is also an issue at the board and package level. You can read that story in my post Brad Brim and the History of Signal Integrity. As Brad pointed out to me then, digital signals now operate at frequencies that we used to call “microwave.” Cadence acquired Sigrity in 2012. That technology has now been completely integrated into the Allegro product line and was on show at DesignCon recently. For details, see my post Sigrity Aurora: In-Design Analysis.

Signal integrity at the system level requires accurate modeling of the components. For example, a 112G long-reach SerDes signal might start on one chip, go through a package, across a board to a connector, through a cable to a connector on another board, across that board, through another package, and finally into the receiver on the destination chip. This doesn’t just require accurate signal integrity analysis, it requires accurate 3D electromagnetic modeling of the connectors, traces, and cables. Hence Cadence created the Clarity product.
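
One simple way to see why every element of that channel matters is a first-order loss budget. Real flows cascade full S-parameter models of each element, but as a back-of-the-envelope sketch (every number below is an illustrative assumption, not a measured channel), you can sum each segment’s insertion loss at the Nyquist frequency and compare against what the receiver’s equalization can recover:

```python
# First-order channel loss budget for a long-reach SerDes link.
# Loss of each element at the ~28 GHz Nyquist frequency of a 112G PAM4
# link, in dB. All figures are illustrative assumptions.

segments = {
    "tx package": 3.0,
    "board A trace": 8.0,
    "connector A": 1.5,
    "cable": 10.0,
    "connector B": 1.5,
    "board B trace": 8.0,
    "rx package": 3.0,
}

BUDGET_DB = 36.0  # assumed receiver equalization budget
total = sum(segments.values())
print(f"End-to-end loss: {total:.1f} dB "
      f"({'within' if total <= BUDGET_DB else 'exceeds'} {BUDGET_DB} dB budget)")
```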

But it gets worse: signal integrity and timing are affected by temperature, and temperature is affected by power, and so on. Thus we created Celsius to do simultaneous thermal and electrical analysis (including airflow, using computational fluid dynamics).
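
The simplest way to see why these have to be solved together is the feedback loop between leakage power and temperature. Here is a minimal fixed-point iteration in Python; the exponential leakage model and every constant are illustrative assumptions, not how Celsius actually works:

```python
# Coupled electro-thermal analysis in miniature: leakage power rises
# with temperature, temperature rises with power, so iterate to a
# fixed point. All constants are illustrative assumptions.

P_DYNAMIC = 2.0       # switching power (W), roughly temperature-independent
P_LEAK_25C = 0.5      # assumed leakage at 25 C (W)
LEAK_DOUBLING = 25.0  # assumed leakage doubles every 25 C
R_THERMAL = 8.0       # junction-to-ambient thermal resistance (C/W)
T_AMBIENT = 25.0

t = T_AMBIENT
for _ in range(50):
    p_leak = P_LEAK_25C * 2 ** ((t - 25.0) / LEAK_DOUBLING)
    t_new = T_AMBIENT + R_THERMAL * (P_DYNAMIC + p_leak)
    if abs(t_new - t) < 0.01:  # converged
        break
    t = t_new

print(f"Converged: T = {t_new:.1f} C, leakage = {p_leak:.2f} W")
```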

The pattern
So the pattern is an effect going from too negligible to bother with, to requiring verification tools, to requiring awareness in other tools, to system-level effects. Accuracy and efficiency require that the tools be tightly integrated, with a single analysis engine for each domain used throughout the entire portfolio of products.


