Multi-Physics Combats Commoditization

In a world of billion-gate designs, it is increasingly difficult to create a differentiated product without incorporating multi-physics elements.


The semiconductor industry has benefited greatly from developments around digital circuitry. Circuits have grown in size from a few logic gates in the 1980s to well over 1 billion today. In comparison, analog circuits have increased in size by a factor of 10. The primary reason is that digital logic managed to isolate many of the physical effects from functionality, and to provide abstractions that made fast analysis and automation possible.

But that progress has stalled. The advantages of each new node are shrinking, and raw transistor count is no longer a way to differentiate one design from another. Even moving to the latest node does not escape the need to take on multi-physics problems.


Fig. 1: Scaling issues. Source: Hewlett Packard Lab (Published in Computer magazine, 2015)

Multi-physics analysis becomes necessary whenever two technologies come together that do not utilize a common underlying set of physical principles. This has been the case at the system level for a long time, whenever non-digital components are integrated together. The most obvious example of this is analog/mixed-signal designs, but others include photonics, which is becoming a necessary technology in the data center, and MEMS, which is finding increasing usage in areas such as IoT, medical and automotive.

Even for pure digital systems, the abstractions have been breaking down with the newer geometries. Electromagnetic, thermal and other types of coupling have escalated from second- or third-order effects into first-order issues.

Designs that do not push these limits are likely to become commoditized very quickly, and that means that multi-physics will become a problem that all design teams have to face.

The bottom line is that the semiconductor industry needs to make handling multi-physics designs simpler and easier. If chip and system design are going to continue advancing at their traditional pace, there is no hiding from this reality.

New geometries are multi-physics
In the past, many multi-physics problems were avoided by guard-banding or by providing large enough margins. However, with the latest technologies, some of the voltages and noise margins are so small that those margins have to be trimmed.

“An example where employment of multi-physics analysis has become necessary is electro-thermal simulation for chip performance estimation,” says Valeriy Sukharev, technical lead and principal engineer in the D2S Calibre Division of Mentor, a Siemens Business. “Heat generated by switching transistors and by currents passing through interconnects creates a non-uniform temperature distribution, which affects the device characteristics and metal resistivity. A simultaneous solution of linked Kirchhoff’s and Fourier’s equations for the electrical and thermal circuits is required to resolve this problem.”
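A toy version of that coupled solve can be sketched as a fixed-point iteration between the two "circuits." This is an illustrative, lumped single-resistor model with made-up parameter values, not the full field solution Sukharev describes:

```python
# Toy coupled electro-thermal relaxation: one interconnect segment modeled
# as a resistor whose resistivity rises with temperature, and whose Joule
# heating raises that temperature through a lumped thermal resistance.
# All parameter values below are illustrative, not from any real process.

def solve_electrothermal(i_amps=0.05, r0_ohms=10.0, alpha=0.004,
                         r_th=500.0, t_amb=25.0, tol=1e-6):
    """Iterate between the electrical (Kirchhoff) and thermal (Fourier) solves."""
    t = t_amb
    for _ in range(1000):
        r = r0_ohms * (1.0 + alpha * (t - t_amb))  # electrical side: R(T)
        p = i_amps**2 * r                          # Joule heating, watts
        t_new = t_amb + r_th * p                   # thermal side: lumped R_th
        if abs(t_new - t) < tol:
            return t_new, r
        t = t_new
    raise RuntimeError("did not converge (thermal runaway?)")

temp, res = solve_electrothermal()
# Converges a few degrees above the uncoupled estimate, because the hotter
# wire has higher resistance, which in turn dissipates more heat.
```

A real flow solves the same feedback loop over full spatial distributions rather than a single lumped node, but the convergence structure is the same.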

And it continues beyond that. “Another issue is emerging,” says Norman Chang, vice president and senior product strategist for ANSYS. “That is thermal-induced stress. An example is with wafer-level packaging. With the die-first approach, you start with a known good die and you put a couple into a wafer-level package. In the interface area between the package and the die, and in particular when an extremely low-k dielectric layer is used, you may suffer from thermal-induced stress. For that you have to do some analysis. The wires are also getting thinner and narrower, and with an increase in current density there will be a lot more stress compared to the previous process generation.”

Advanced packaging has added a number of extra dimensions. “Evaluation of packaging effects in multiple physical domains is important to ensure proper device design,” says Pascal DeVincenzo, CEO for Open Engineering. “Stress induced by packaging on the device can cause serious problems for operation. Stress can cause misalignment in optical devices or faulty outputs in piezoresistive devices. Thermo-mechanical analysis is often required, as materials used in packaging expand at different rates due to their varying TCEs (thermal coefficients of expansion). As devices are stacked together in systems-in-package and 3D-ICs, thermal and mechanical analysis is becoming ever more critical.”
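The scale of that TCE-mismatch stress can be estimated with the standard thin-film biaxial approximation, sigma ≈ E/(1 − ν) · Δα · ΔT. The material numbers below are textbook ballpark values for copper on silicon, used purely for illustration:

```python
# Order-of-magnitude estimate of CTE-mismatch stress in a fully constrained
# layer (thin-film biaxial approximation). Material values are textbook
# ballparks, not tied to any particular packaging stack.

def cte_mismatch_stress(e_pa, nu, alpha_film, alpha_sub, delta_t):
    """sigma = E/(1-nu) * (alpha_film - alpha_sub) * delta_T, in pascals."""
    return e_pa / (1.0 - nu) * (alpha_film - alpha_sub) * delta_t

# Copper on silicon, cooled 200 C from its stress-free temperature.
sigma = cte_mismatch_stress(e_pa=110e9, nu=0.34,
                            alpha_film=17e-6, alpha_sub=2.6e-6,
                            delta_t=-200.0)
# The negative sign just reflects delta_t < 0; the magnitude, roughly
# 480 MPa (tensile in the copper), is what threatens low-k dielectrics.
```

Even this crude estimate lands in the hundreds of megapascals, which is why the interface regions DeVincenzo describes need full thermo-mechanical analysis rather than a hand calculation.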

Within the chip, electromigration (EM) also adds to Design for Manufacturing (DFM) and Design for Reliability (DFR) concerns. “An accurate power/ground grid EM assessment requires a solution of linked partial-differential equations describing mass transfer, electric current density, mechanical stress evolution, and temperature distribution across the grid,” adds Sukharev.
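A much coarser, lumped cousin of that coupled-PDE assessment is Black's equation for EM lifetime, MTTF = A · J⁻ⁿ · exp(Ea/kT). The prefactor and exponents below are illustrative assumptions, but the form shows why current density and temperature are coupled into the same reliability budget:

```python
import math

# Black's equation: a lumped EM lifetime estimate, far coarser than the
# coupled mass-transfer/stress PDEs described above. A, n and Ea below
# are illustrative assumptions, not qualified process parameters.

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def black_mttf(j_a_per_cm2, t_kelvin, a=1e12, n=2.0, ea_ev=0.9):
    """Mean time to failure in arbitrary units; only ratios are meaningful here."""
    return a * j_a_per_cm2**-n * math.exp(ea_ev / (K_BOLTZMANN_EV * t_kelvin))

# Raising junction temperature from 100 C to 125 C at fixed current density
# shortens the estimated lifetime several-fold:
mttf_cool = black_mttf(1e6, 373.15)
mttf_hot = black_mttf(1e6, 398.15)
```

The exponential temperature dependence is the reason the electro-thermal and EM analyses cannot be run independently: a hot spot found by one feeds directly into the other.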

Some of these problems are not new, but they have shifted left. “Thermal-induced stress is a problem that was previously analyzed using a TCAD tool,” explains Chang. “Today, the problem is coupled with design because the temperature is power-dependent and differs with the functional mode you are operating in. It needs to consider the power signature. Power, thermal and EM are all coupled, and they all affect stress and reliability. This is not only for SoCs but also for 3D ICs.”

Intertwined dependencies
Many aspects of design are becoming vector-dependent. “Noise simulations that cover different frequencies in the chip, package and the board should be performed,” says Arvind Vel, senior director for applications engineering at ANSYS. “Vectors that target the high frequency components on the chip typically are used to understand the intra-chip noise issues. Transition states that are medium frequencies in nature are used to understand the impact of the package and board noise coupling. Packaging technologies such as fan-out wafer-level packaging add another level of complexity to the simulations. The ability to simulate multiple dies at the same time is important to check the noise coupling between the different dies.”

Those seeking performance have another problem. “The minute the frequency reaches 2GHz, at technologies below 16nm, we start seeing companies having issues related to magnetic coupling,” says Magdy Abadir, vice president of corporate marketing for Helic. “After that the problem mushrooms, and we see it in all of the new technologies and in large SoCs that are trying to mix lots of analog and RF technologies.”

Noise analysis becomes more important as margins get smaller. “Total power and power integrity start to converge when you start scaling the supply voltage,” says Tobias Bjerregaard, CEO at Teklatech. “Switching power scales with the square of the supply voltage, which means that by reducing supply voltage by 10%, you gain 19% in switching power. This is particularly interesting with finFET devices because they have good performance at very low supply voltages, until a critical voltage is reached. Pushing supply voltage down puts increasing pressure on power noise margins.”
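The square-law arithmetic behind that quote is easy to verify: with dynamic power P ∝ C · V² · f and frequency held constant, a 10% supply reduction saves 1 − 0.9² = 19% of switching power:

```python
# Dynamic power scales as P ~ C * V^2 * f. With capacitance and frequency
# held constant, the fractional switching-power savings from scaling Vdd
# by a factor s is simply 1 - s^2.

def switching_power_savings(vdd_scale):
    """Fractional switching-power savings for a given Vdd scaling factor."""
    return 1.0 - vdd_scale**2

savings = switching_power_savings(0.9)  # 10% supply reduction -> ~0.19
```

The same square law is why the remaining noise margin shrinks faster than the supply itself: the designer trades a quadratic power win against a linear loss of headroom.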

Those looking for performance will crank the voltage up. “Anytime signals change at a fast rate, there is the potential for EM crosstalk,” adds Abadir. “The change causes a magnetic wave, and if that passes through any metal structure (and we have very complex metal structures these days), you have to analyze the capacitance, inductance, etc., and evaluate whether there is coupling as a result of the wave. So while it is a digital problem, phase-locked loops and voltage-controlled oscillators can be very sensitive to coupling.”

“Designs that have a mixture of analog and digital components especially need to pay careful attention to the coupling of noise from the digital to sensitive analog components,” says ANSYS’ Vel. “Shared power and ground domains are important noise pathways that can couple noise and cause issues during operation. Substrate noise injection is especially important to simulate for RF components integrated along with high speed digital cores.”

Adding new physics
MEMS have become ubiquitous and are powering the IoT (Internet of Things). “There is more to come as system integrators choose MEMS-based solutions and additional designers scale down existing devices to the MEMS scale for performance and cost reasons,” says Mary Ann Maher, CEO of SoftMEMS. “Typically, MEMS sensors and actuators operate on or translate signals from one physical domain to another, requiring coupled simulation in multiple physical domains. Since MEMS are moving structures, they require analysis in three dimensions. Coupled simulation in 3D, using finite element and boundary element solvers, is typically used to evaluate the performance of MEMS devices at the partial differential equation level. These solvers require thousands to millions of degrees of freedom to accurately analyze the behavior of MEMS devices. Coupling can occur in several domains, such as mechanical, thermal, electrostatic, magnetic and fluidic. Simulations of non-linear effects, such as contact of surfaces in RF switches and overpressure events in microphones, are important.”
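The simplest worked example of this electro-mechanical coupling is the parallel-plate actuator, whose pull-in voltage marks where the coupled electrostatic/spring system loses its stable equilibrium. The closed-form result is V_pi = sqrt(8·k·g₀³ / (27·ε₀·A)); the dimensions below are illustrative, not from any real device:

```python
import math

# Classic MEMS electro-mechanical coupling: a parallel-plate actuator where
# electrostatic attraction fights a linear spring. Beyond the pull-in
# voltage there is no stable equilibrium and the plates snap together.
# Device dimensions below are illustrative only.

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def pull_in_voltage(k_n_per_m, gap_m, area_m2):
    """V_pi = sqrt(8*k*g0^3 / (27*eps0*A)), in volts."""
    return math.sqrt(8.0 * k_n_per_m * gap_m**3 / (27.0 * EPS0 * area_m2))

# Spring stiffness 1 N/m, 2 um gap, 100 um x 100 um plate:
v_pi = pull_in_voltage(1.0, 2e-6, 100e-6 * 100e-6)  # ~5 V
```

Even this one-degree-of-freedom model is inherently multi-physics: neither the electrostatic nor the mechanical equation can be solved without the other, which is exactly why full devices need coupled 3D solvers.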

And MEMS are not the only new kind of device being integrated today. Semiconductor Engineering recently published an Experts at the Table series looking at integrated photonics. “Temperature and mechanical stress sensitivity of the optical elements presents a great challenge for integration in the case of silicon nanophotonics,” points out Sukharev.

Time constant complications
Timing, power, noise, EM and stress are all coupled, but with very different time constants, which creates an additional problem for analysis. “Multiscale modeling refers to a style of modeling in which multiple models at different scales are used simultaneously to describe a system,” says Sukharev. “This type of modeling is necessary in many cases where multi-physics analysis is involved.”

Maher explains the impact this has on MEMS analysis. “Multi-physics simulations may occur at the atomistic scale to capture phenomena such as stiction, and materials may be deposited with atomic layer deposition. In contrast, packaging and large-scale motion/wave phenomena may occur at the millimeter scale. The time scales of operation of the electronics may be picoseconds to nanoseconds, while thermal and other events may occur over milliseconds to seconds. These varying scales imply a hierarchical approach with multiple simulators and numerical techniques.”

Even within a single domain, timescales are creating problems. “Consider the time constants of thermal propagation,” says Steve Carlson, low power solutions architect for Cadence. “The timescales at the transistor level are picoseconds, but at the package level it could be up to seconds or even minutes. There are many orders of magnitude that may be spanned depending on what you are trying to do. Metal conducts a lot better than silicon. Some people consider using a diamond substrate because it is a great thermal conductor and a great electrical insulator. Unfortunately it is not easy to work with.”
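The spread of timescales Carlson describes follows directly from thermal diffusion, where tau ~ L²/α. Using a textbook diffusivity for bulk silicon (an assumption; real stacks mix many materials), the same physics spans roughly ten orders of magnitude between transistor and package:

```python
# Thermal diffusion time tau ~ L^2 / alpha, where alpha is thermal
# diffusivity (~8.8e-5 m^2/s for bulk silicon, a textbook value used
# here purely for illustration).

ALPHA_SI = 8.8e-5  # m^2/s, approximate thermal diffusivity of silicon

def thermal_time_constant(length_m, alpha=ALPHA_SI):
    """Characteristic time for heat to diffuse across a length scale."""
    return length_m**2 / alpha

for label, length in [("transistor (100 nm)", 100e-9),
                      ("block (100 um)", 100e-6),
                      ("package (1 cm)", 1e-2)]:
    print(f"{label:20s} tau ~ {thermal_time_constant(length):.1e} s")
```

Sub-nanosecond at the device level, around a second at the package level: no single time step can resolve both ends, which is what forces the multi-rate, multi-model approaches described below.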


Fig. 2: Sampling techniques and propagation times. Source: Cadence

Spanning these time constants often requires multiple models. “The challenge is to make the models consistent with each other and a good methodology must be put in place to utilize the models at the appropriate scale and to combine the results to develop accurate results that can be achieved in a timely fashion,” says Maher. “For MEMS, we have a model builder that automatically creates a model usable for system-level simulation from finite element and boundary element data using model order reduction techniques. Although there is no generalized reduction technique that works for all physical systems, many of the device behaviors that are of commercial interest can be captured. Model order reduction techniques are an active area of research especially for non-linear systems.”
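The simplest instance of the model order reduction Maher mentions is modal truncation of a linear system: project the full model onto its lowest-frequency eigenmodes and keep a handful of generalized coordinates. This is a generic linear MOR sketch, not SoftMEMS' actual flow:

```python
import numpy as np

# Modal truncation: the most basic linear model-order-reduction technique.
# We reduce a 100-DOF spring chain (M = I for simplicity) to 5 generalized
# coordinates by projecting onto its lowest-frequency eigenmodes.

def modal_truncation(k_full, n_keep):
    """Return the reduced stiffness matrix and the retained modal basis."""
    eigvals, eigvecs = np.linalg.eigh(k_full)  # eigh: ascending eigenvalues
    basis = eigvecs[:, :n_keep]                # columns = retained mode shapes
    k_red = basis.T @ k_full @ basis           # diagonal of kept eigenvalues
    return k_red, basis

# Uniform fixed-fixed spring chain: tridiagonal stiffness matrix.
n = 100
k = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
k_red, basis = modal_truncation(k, n_keep=5)
# 100 physical DOFs -> 5 generalized coordinates; a full response is
# recovered approximately as x ~ basis @ q.
```

As Maher notes, this only works cleanly for linear behavior; extending reduction to contact, squeeze-film damping and other non-linear MEMS effects remains an active research area.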

Conclusion
Design teams cannot rely on tools to solve everything for them.

“Designers have to know about the risk factors, such as EM crosstalk or thermal issues,” says Abadir. “They need a combination of design techniques that are preventive in nature, coupled with analysis tools that are smart. You cannot boil the ocean. The tools have to be able to tackle a very complex problem, and dig deeper in certain key areas of the design than in others. Where the tools do not see a concern, they can operate at a faster pace and create a hybrid model of the problem for analysis. The ultimate goal, before tapeout, is to do some kind of sign-off. This goes over the entire design and makes sure that we didn’t forget anything or leave any stone unturned. You need a combination of all of these. In the past you considered timing or power, but now you have to consider these other factors as well.”

Related Stories
Power Challenges At 10nm And Below
Dynamic power density and rising leakage power becoming more problematic at each new node.
Optimization Challenges For 10nm And 7nm (Part 2)
Heat is becoming a serious issue as thermal densities rise and this creates problems for industries such as automotive that require robust, long-lived components.
Electromigration: Not Just Copper Anymore
Advanced packaging is creating new stresses and contributing to reliability issues.
Noise Killed My Chip
Which kind of noise is likely to give your next project a headache, and what you can do about it.