Is The Definition Of IC Reliability Changing?

Rising complexity has altered what’s good enough when it comes to verification processes.


“You know, brain surgery’s not difficult if you don’t care whether the person dies, it’s actually quite easy. Flying a plane is quite easy if you don’t mind crashing. That’s what hard means. It’s an expression of how much you care about the result. And if you are proud of it, or you believe it can be good and you want it to be good, then it can be sort of infinitely hard, to the point where it can drive you a bit bonkers.”
—Hugh Laurie, British actor

Smaller transistors, mega-chip designs, faster design cycles, thinner wires and gate oxides, increasing sensitivity to layout shapes and patterns—all of these factors contribute to the perception that integrated circuit (IC) reliability is a new animal at leading-edge nodes. But is it really? In truth, the definition of IC reliability is just what it has always been—the expectation that a chip will perform its intended function. What’s changed is the need to serve new high-volume consumer markets where long-term reliability is mission critical, and the difficulty of doing the analysis needed to guarantee that ICs will not just yield at manufacturing, but continue to be reliable over an expected lifetime.

For quite a long period, reliability in the consumer market was important, but not life threatening. Desktop computers, laptops, mobile phones, and tablets were the dominant products driving the consumption of large, complex SoCs. In these markets, the average lifetime of a product is three or four years, driven by technology obsolescence. Verifying designs for manufacturing yield was the primary focus of the semiconductor industry. Advanced design rule checking (DRC) and design for manufacturing (DFM) techniques, such as critical area analysis, lithography checking, and via doubling, were introduced to address manufacturing issues that led to chip failures on the wafer, but long-term reliability was secondary.

In aerospace and military applications, lifetime reliability is obviously an essential factor. But there, cost is not as much of a limiting factor, and rigorous standards and processes are in place. Reliability has become crucial for data centers as all types of businesses rely on them for day-to-day operation. But it is often more cost effective to gain reliability at the system level, using failover and redundancy mechanisms to protect against failures. Again, there is not as much emphasis on eliminating long-term IC failures.

Today, however, a host of factors is driving an intensified focus on IC reliability. Complex ICs are widely used in both the automotive and medical industries, where long-term reliability is critical and failures can be life-threatening. These applications face different environmental factors and different design constraints than typical consumer applications. Implanted medical devices must function without failure or loss of performance for extended periods. Automotive electronics must operate in extremely high-temperature, turbulent environments, and over an extended lifetime of 10 to 20 years. However, both of these industries are also driven by cost and volume. Automotive electronics cannot drive the cost of a car beyond the market’s ability and willingness to pay [1], while the cost of medical electronics directly influences adoption, now more than ever before [2].

With more complex design requirements, more variability in operating environments, and more emphasis on efficient power consumption, circuit complexity is now a driving factor in IC reliability analysis. Many IC reliability issues are actually the result of design flaws, not manufacturing issues per se, and involve subtle, longer-term effects like oxide degradation that cannot be detected by traditional production tests on the manufacturing line. For example, in a multi-voltage domain design, how does a design team verify that thin-oxide gates have the correct bias, or that high-voltage devices are not driving low-voltage devices past their rated thresholds?
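The kind of voltage-domain check described above can be sketched in a few lines. The following is a simplified illustration only, with made-up device names, oxide ratings, and domain supply voltages; a real flow would extract this data from the netlist and technology files rather than hard-code it.

```python
# Hypothetical sketch of an overstress check: flag devices whose gate-oxide
# voltage rating is below the supply voltage of the domain driving them.
# All names, ratings, and voltages below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Device:
    name: str
    oxide_rating_v: float   # maximum safe gate-oxide voltage
    driven_by_domain: str   # power domain driving this device's gate

# Assumed supply voltages per power domain
DOMAIN_SUPPLY_V = {"VDD_CORE": 0.9, "VDD_IO": 1.8, "VDD_HV": 3.3}

def find_overstressed(devices):
    """Return devices driven from a domain that exceeds their oxide rating."""
    return [d for d in devices
            if DOMAIN_SUPPLY_V[d.driven_by_domain] > d.oxide_rating_v]

devices = [
    Device("M1", oxide_rating_v=1.1, driven_by_domain="VDD_CORE"),  # OK
    Device("M2", oxide_rating_v=1.1, driven_by_domain="VDD_IO"),    # violation
    Device("M3", oxide_rating_v=3.6, driven_by_domain="VDD_HV"),    # OK
]

for d in find_overstressed(devices):
    print(f"{d.name}: rated {d.oxide_rating_v} V but driven from "
          f"{d.driven_by_domain} ({DOMAIN_SUPPLY_V[d.driven_by_domain]} V)")
```

The point of the sketch is that such checks are mechanical once device ratings and domain voltages are available as data, which is exactly why they lend themselves to automation.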

And it is not just the leading-edge nodes that face these issues. New circuit functions and topologies that didn’t exist during the first design starts at an established node (e.g., multiple power domains and power-down modes) are part of these new reliability analysis challenges. Until recently, EDA tools simply did not address many of these issues, and designers were forced to develop their own techniques or perform lengthy, error-prone manual checking. However, driven by the growing needs summarized above, new solutions are emerging to give IC designers a better handle on their reliability problems. We are seeing a new class of integrated reliability checking platforms that link together the ability to perform circuit classification, physical layout measurements, complex calculations, and rule-driven circuit checking. The combination of these capabilities allows designers to automate many of the circuit checks required to ensure reliability. At the same time, guidelines for circuit reliability checking are emerging, some proprietary to individual companies, and some defined collaboratively by open interest groups or standards bodies. Both of these trends are having a significant impact on our ability to identify and remove design flaws that reduce long-term IC reliability.
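To make the idea of linking a layout measurement with a calculation in a rule-driven check concrete, here is a deliberately simplified sketch of an electromigration-style current-limit check. The current-density limit, metal thickness, and wire data are invented for illustration; real values come from the foundry’s reliability rules.

```python
# Illustrative rule-driven check: flag wires whose measured width is too
# narrow for the current they carry, using I_max = J_max * width * thickness.
# The limit, thickness, and wire list below are made-up assumptions.

J_MAX = 1.0e10          # assumed maximum current density, A/m^2
THICKNESS_M = 100e-9    # assumed metal thickness, 100 nm

# (wire name, measured width in meters, estimated current in amperes)
wires = [
    ("net_clk", 200e-9, 1.5e-4),   # under the limit
    ("net_pwr", 500e-9, 8.0e-4),   # over the limit
]

def max_current(width_m):
    """Maximum allowed current for a wire of the given width."""
    return J_MAX * width_m * THICKNESS_M

violations = [name for name, width, current in wires
              if current > max_current(width)]
```

For `net_pwr`, the allowed current is 1.0e10 × 500e-9 × 100e-9 = 5.0e-4 A, so an estimated 8.0e-4 A trips the rule. A platform of the kind described above automates exactly this pattern, measurement plus calculation plus pass/fail rule, across millions of nets.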

The definition of IC reliability has not changed, but the expanding standards to which we hold our designs, combined with increased design complexity, require us to adopt new reliability verification processes. How much you care about the reliability of your designs determines the level of reliability verification you must perform, which in turn drives the selection of techniques and tools, and ultimately your time to market.


1. Electronics to account for 40% of automotive production costs by 2015, Electro to Auto Forum.

2. Medical Electronics Is Not so Niche Anymore: A primer on the surging sector, Smith MarketWatch Quarterly, Vol. 4 No. 2, 2010.

  • J Fjelstad

    A timely commentary, thanks for posting it.

    Reliability is, as pointed out, very important for military, aerospace, automotive, and medical products, but it is also crucial to the world’s poorest consumers. If one is making $2 a day, any purchase of an electronic device must come with certainty in terms of its long-term reliability, as such purchases are the rough equivalent of purchasing an automobile.

    Unfortunately, the coming generations of ICs will be more questionable in terms of their long term reliability prospects due to the reasons noted and simple physics.

    One of the big challenges in electronic assembly is enduring the high temperatures associated with soldering, as high temperatures can damage and degrade both components and substrates. One way around the problem is to embrace solderless assembly for electronics (SAFE) methods such as those prescribed by the Occam Process concept.