Analyzing The Integrity Of Power

Making sure the power grid is strong enough to sustain the power delivery.

Power analysis is shifting much earlier in the chip design process, with power emerging as the top design constraint at advanced process nodes.

As engineering teams pack more functionality and content into bigger and more complex chips, they are having to deal with more complex interactions that affect everything from power to its impact on signal integrity and long-term reliability. That, in turn, has created a demand for more analysis tools that can be used earlier in the design cycle to make tradeoffs.

“One analysis that predicts all of the mechanisms of current flow in a chip is still a pipe dream,” said Aveek Sarkar, vice president of product engineering and support at Ansys. “Unlike timing, power is a global problem and a multi-physics problem. So it needs to work for all possible conditions. But what’s happened is that design teams are so pessimistic that they overdesign the board, the package and the chip.”

It’s easy to understand why this occurs. Complex chips contain hundreds of millions of instances and sometimes more than 100 power domains, which are handled by multiple teams working on related problems—but not necessarily at the same time. Each group is on a tight schedule, and those schedules are interlaced with various tools that may or may not keep up with rising complexity at each new node. Moreover, each group is increasingly challenged by low-power techniques, which until a couple of years ago were largely left to power engineering specialists.

“For example, power switching cells are very important in low-power designs because they have to control the power domains properly to make sure that the IR drop is not going to destroy the functionality,” said Jerry Zhao, product management director at Cadence.

Use of these design techniques demands careful scrutiny not just in digital designs, but increasingly in analog/mixed-signal designs as well. Possibly most important are the manufacturing technology challenges.

“Once you go to finFET, the individual generations of those nodes have different electromigration rules, and they are so complex that there is nearly a year-round activity between EDA vendors and foundries to cooperate to make sure that the foundry rules are obeyed in the EDA tools,” said Zhao. “FinFET is definitely causing a generation gap compared to previous technologies. Once you want to go from 16nm to 5nm, there are changes in terms of the rules, there are changes in terms of the power supplies. In very low power designs, we are not talking about 0.9 V or 0.8 V. We’re talking about 0.5 V. That definitely is going to pose challenges to the tools as far as how to make sure that the power grid integrity – the IR drop, for example – is not going to be over a certain threshold. If you determine that threshold is going to have to be considered with other aspects of your design, that brings methodology changes, namely that power integrity is not a standalone problem. It has to be considered in the implementation flow. This means when you want to start the physical design – even in the early days before physical design – you need to worry about how to make sure the power grid is strong enough, but not over-designed, to sustain the power delivery. That is so different from previous generations. Through innovations, you need to consider timing, area, power, and then it should be an integrated implementation flow all the way to tape out.”
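
To make that shrinking margin concrete, here is a minimal back-of-the-envelope sketch. The 5% droop budget and the 10 A load current are illustrative assumptions, not foundry rules, but they show why a 0.5 V supply leaves roughly half the absolute IR-drop headroom of a 0.9 V one:

```python
# Illustrative only: the 5% droop budget and 10 A load are assumptions.
def max_grid_resistance(vdd, budget_frac, current):
    """Largest effective grid resistance that keeps IR drop within budget."""
    allowed_drop = vdd * budget_frac   # volts of droop we can tolerate
    return allowed_drop / current      # ohms, from V = I * R

for vdd in (0.9, 0.5):
    r = max_grid_resistance(vdd, budget_frac=0.05, current=10.0)
    print(f"VDD = {vdd:.1f} V: budget {vdd * 0.05 * 1e3:.0f} mV, "
          f"grid must stay under {r * 1e3:.2f} mOhm at 10 A")
```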

To be sure, power integrity analysis has shifted from individual analysis of die, package and PCB, with on-die analysis limited to IR losses, to full system analysis where die, package and PCB are all analyzed together, noted David Bull, senior principal research engineer at ARM. “This has become increasingly important as the current trend for compute is toward higher power efficiency within the same power envelope. That results in lower operating voltages but higher currents, and subsequently higher losses due to Ldi/dt, and requires the entire power delivery network to be modeled and analyzed using a full-wave solver.”
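
A toy calculation illustrates the trend Bull describes: within a fixed power envelope, halving the supply voltage doubles the current, so the Ldi/dt droop doubles in absolute terms and quadruples as a fraction of the supply. The inductance, power envelope, and ramp time below are assumptions chosen for illustration, not measured values:

```python
# Assumed values, chosen only to show the Ldi/dt trend.
L_PKG = 10e-12   # 10 pH effective PDN loop inductance (assumption)
POWER = 5.0      # watts, fixed power envelope
T_STEP = 10e-9   # seconds for the load current to ramp (assumption)

for vdd in (1.0, 0.5):
    i = POWER / vdd                 # same power at lower voltage -> more current
    droop = L_PKG * (i / T_STEP)    # inductive droop: V = L * di/dt
    print(f"VDD = {vdd:.1f} V: I = {i:.0f} A, "
          f"Ldi/dt droop = {droop * 1e3:.0f} mV ({droop / vdd:.1%} of VDD)")
```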

At the same time, power is still being left on the table because we have to be so conservative about integrity in the early phases of the design, said Drew Wingard, CTO of Sonics. “If there was a way of doing integrity analysis or integrity prediction earlier in the design flow, we could make different architectural choices that would allow us to save more power. It’s like a lot of the rest of the architectural modeling stuff. I don’t need a perfect answer today. But I need a way of bounding what is reasonable to expect and trading off if I want something better. Also, how much is it going to cost me, and where is it going to cost me? To be able to do that kind of tradeoff analysis while I’m still in the architecture phase is not commonly available today.”

Dollars and sense
Economics plays a significant role here. As the difficulty of developing chips at advanced nodes increases, that has a direct impact on cost per transistor. To keep costs scaling with feature sizes, design teams have to approach problems such as optimizing dynamic power integrity and routing differently, said Tobias Bjerregaard, CEO of Teklatech.

“You can’t just cram the transistors or the cells closer together, because it’s not the fact that the cells take up the space,” said Bjerregaard. “It’s the fact that you can’t route the design when you put them together. That is the limitation in utilizing the area, because when you fabricate a chip you don’t pay per transistor. You pay per area. And it is up to you to put as many transistors in that area as you can.”

This directly impacts power integrity, because as power density increases and supply voltages go down, the power integrity and dynamic voltage drop margins shrink as well. Compounding the problem, rising metal resistance at advanced nodes makes IR drop and dynamic voltage drop worse. To deal with this, more metal is added to strengthen the power grid, which means that metal can’t be used for routing, thereby exacerbating routing congestion.
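
The metal-versus-routing tradeoff can be sketched with simple arithmetic. The strap width and pitch values below are assumptions for illustration, but they show how quickly a densifying power grid eats into routing resources:

```python
# Toy estimate (all numbers assumed): fraction of a metal layer consumed
# by power/ground straps. Tightening the strap pitch to fight IR drop
# directly removes tracks from the signal router.
def grid_share(strap_width_um, strap_pitch_um):
    """Fraction of a layer occupied by one VDD + one VSS strap per pitch."""
    return 2 * strap_width_um / strap_pitch_um

for pitch in (20.0, 10.0, 5.0):
    share = grid_share(strap_width_um=0.5, strap_pitch_um=pitch)
    print(f"strap pitch {pitch:4.1f} um -> {share:.0%} of the layer lost to the grid")
```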

To maximize area utilization of the latest process technologies, one approach is to improve the power integrity upfront so not as much metal needs to be added to the power grid, Bjerregaard contends. “Then you have that metal for the wires, you can route your design, cram the transistors close together and you can harvest the economic benefits of scaling.”

To achieve this, he proposed that the design should be optimized at an early stage using a technique called dynamic power shaping, which shapes the dynamic power signatures of the circuit. “This is a really important part of utilizing scaling and making sure Moore’s Law doesn’t stop, because in the end what is driving scaling is financial. You have to make money to make more money. There is scaling so you scale.”
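
As a minimal sketch of the intuition behind power shaping (this is not Teklatech’s actual algorithm, and the per-block current profiles are invented), staggering when blocks draw their peak current flattens the combined demand without changing the total charge drawn from the grid:

```python
# Toy illustration: three identical blocks, each with a per-cycle current
# profile in arbitrary units. We never mutate the profiles, only sum them.
blocks = [[0, 8, 2, 0]] * 3

def combined_peak(offsets):
    """Peak of the summed current profile when each block starts at its offset."""
    horizon = max(off + len(b) for off, b in zip(offsets, blocks))
    total = [0.0] * horizon
    for off, profile in zip(offsets, blocks):
        for t, amps in enumerate(profile):
            total[off + t] += amps
    return max(total)

print("aligned   peak:", combined_peak([0, 0, 0]))  # all blocks spike together -> 24
print("staggered peak:", combined_peak([0, 1, 2]))  # demand shaped in time     -> 10
```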

The bigger picture
This is more than just a chip problem, however. Dave Wiens, a Mentor Graphics business development manager, said it’s hard to think about power integrity at the chip level without also including the board.

“The signal integrity problem has been around for 30-plus years and the tools have been around for about the same time, and some of it has evolved dramatically,” Wiens said. “Devices have always needed power, but it has been within the last 10 years or so that the demand for clean power has really risen, and that’s in large part because of silicon integration and the high current that’s been available to these fast parts.”

Wiens pointed to such changes as the doubling of the number of voltage rails over the past 10 years. “The impact of this on design is a constant set of tradeoffs that have to be made. You’re sitting here with a design that’s always facing compression in layer count because of cost, primarily, and sometimes form factor, so you’re always facing that drive to reduce layers while at the same time the number of rails is going up. On good old-fashioned designs, we had a solid plane layer — an entire layer. Every layer now is a mixture of signal and power. That creates its own set of problems. In addition, you’ve got very high-density BGAs hanging out all over the top and bottom of the board, and they are fanning out. That fan-out is perforating the plane, creating a Swiss cheese effect, which further impacts the return path for the current on the plane.”

As such, this drive to density works against the increase in the number of voltages. It is a constant tradeoff: trying to get clean power while reducing layer count, deciding where to use different fan-out structures and where to move planes around within the signal layers, all while still delivering clean current to the devices.

Complicating matters are the DC and AC considerations of the PCB. The DC side is voltage drop across a plane; the AC side is figuring out the decoupling strategy, including where capacitors are placed and how many are needed.
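
On the AC side, the widely used target-impedance method gives a first-order way to size that decoupling strategy. Every value below is an illustrative assumption, and a real design would also have to account for mounting inductance and plane spreading:

```python
import math

# Back-of-the-envelope decoupling sizing via the target-impedance method.
# Supply, ripple budget, transient current, and capacitor parameters are
# all assumed values for illustration.
VDD, RIPPLE, I_STEP = 1.0, 0.05, 20.0        # volts, 5% budget, amps
z_target = VDD * RIPPLE / I_STEP             # ohms the PDN must stay under
print(f"target impedance: {z_target * 1e3:.1f} mOhm")

def cap_impedance(freq_hz, c=1e-6, esr=0.01):
    """Magnitude of one decap's impedance: ESR plus capacitive reactance."""
    return math.hypot(esr, 1 / (2 * math.pi * freq_hz * c))

freq = 1e6                                   # 1 MHz, below the cap's self-resonance
count = math.ceil(cap_impedance(freq) / z_target)   # identical caps in parallel
print(f"~{count} x 1 uF caps in parallel to hit target near {freq / 1e6:.0f} MHz")
```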

For both of those, Wiens said, it’s an engineering problem as well as a layout problem, and the layout is the part that’s complicating the design. “The engineer designed it nice and clean, and then the layout guy goes and messes it up for him. That drives who can access the tools, when, and how they’re used within the design process. You need integration so people can do those tradeoffs without the layout guy saying, ‘Mr. Power Integrity Specialist, I’m going to take a break for three hours or a day or two while you go analyze this whole thing.’”

Further, Dave Kohlmeier, product line director for high-speed tools in the PCB Division at Mentor Graphics, said the evolution of power integrity in the PCB realm has been driven by IC device characteristics. For instance, lower voltages and increased current requirements have driven demand for higher-performing PCB-level power distribution network design. The number of PCB voltage rails has increased due to the integration and aggregation of more and more functionality (including multiple signaling protocols) on silicon. Based on Mentor’s TLA data, the number of voltage rails has more than doubled over the last 10 years, and the focus on ‘design for power integrity’ has risen dramatically.

According to TLA statistics, there are fewer active devices in these designs, but they are huge. “In addition, the device count in the TLA statistics is increasing, but it’s mostly 2-pin components: termination resistors or bypass capacitors. As far as intelligence goes, it’s concentrated in these big, huge BGAs. The trend is to include these passive devices on-chip or on-package, so at some point we may see the on-board device count drop.”

“The IC geometries have been getting smaller and smaller, which has required them to go to lower and lower voltages — so as we’ve gone to lower and lower voltages, the margins that the designers have to meet with those lower voltages are smaller. At the same time, power requirements have stayed the same or have gone up, forcing current to go up and PCB target impedance to go down as well. So the bar has been raised for PCB power distribution network design. It drives the need for better stack-up design, power plane ‘routing,’ decoupling capacitor placement and mounting, and so on. Here, simulation provides insight as to what will happen on the board,” he explained.
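
The squeeze Kohlmeier describes falls directly out of the common target-impedance rule of thumb:

$$Z_{\text{target}} = \frac{V_{DD} \times r_{\text{ripple}}}{\Delta I_{\text{transient}}}$$

Lower supply voltage shrinks the numerator while higher transient current grows the denominator, so, for example, halving $V_{DD}$ while doubling $\Delta I$ cuts the allowed PDN impedance to one quarter.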

Ansys’ Sarkar agrees: “You’ve got to look at this from multiple angles. There are DC angles for gross connectivity, timing simulation, real-time effects. And that’s just for the functional operating mode. You also have to look at test modes. What happens when you power this device up and down? The challenge for design teams is getting sufficient signoff confidence. If you’ve done all of your checks, do you feel 95% confident it will all work?”

Where do we go from here?
In this market, R&D is ongoing. On the PCB side, Kohlmeier observed the next hill to climb is combining power integrity with signal integrity. “Right now that’s the big piece, because as the voltage supplies vary, and have some noise on them, those affect the signaling between the ICs themselves. So power-aware signal integrity is the important thing now. It’s being able to couple those together. The hard part there is getting a model for the link between the power to the IC and the I/O switching part of the IC, and it’s generally not something the IC guys supply. It’s part of their intellectual property. They’re not going to tell the general public what that looks like, so it’s becoming a modeling problem to combine the two.”

Wiens noted this is a problem that is geometrically predictable. “It’s multiple rails, it’s across the entire board, so we’re trying to map in appropriate engines for the problem. Things like simple geometric checkers that isolate the issues worth spending additional time on — those become more critical as well.”

Things are just as complicated on the IC side. “If I could wave my magic wand, we would have a reasonable environment for modeling the implications of the power supply network,” said Sonics’ Wingard. “Then we could make better choices about how quickly we could afford to shut something off or turn it back on, and use that to be able to more aggressively shut things off. Then the chip would still have bounded latencies for when I wanted to turn it back on.”

Related Stories
Why Power Modeling Is So Difficult
Demand is increasing for consistency in power modeling, but it has taken far longer than anyone would have guessed.
Powerful New Standard
The new version of IEEE 1801 enables complete power-aware flows to be constructed using a meet-in-the-middle concept. But when will these flows become available?
New Approaches To Low Power Design
There is work to be done in energy-efficient architectures, power modeling and near-threshold computing, but there are many more options available today.


