Managing Physical Effects

There’s no stopping Moore’s Law, but the accompanying physical effects from manufacturing are giving design teams plenty of trouble.


By Ann Steffora Mutschler
Managing the physical effects from manufacturing is becoming increasingly critical as designs grow in size and process geometries dive lower.

Just keeping track of these effects in a billion-gate design is a daunting task. At advanced manufacturing nodes, capacitance and inductance effects, both on-die and off-die, make the design much harder.

“In the past, when we talked about physical effects, we were very much talking about the transistors and above, such as the different layers of metals and resistance,” explained Dian Yang, Apache Design Solutions’ general manager and senior vice president of product management. “Now, it’s not only the metals. The substrate is becoming very important to consider in terms of physical effects impacting a design. With deep submicron design, simple resistance models are no longer sufficient to describe the vias; those simple models no longer hold true.”

There are also heat-resistance effects, high frequency resistance effects, and pattern-based effects.

Complicating matters is an explosion in the number of design rule checks. For 28nm, the simple decks contain about 3,000 checks, depending on the foundry, according to Joseph Davis, product manager for Calibre interactive and integration products at Mentor Graphics. In comparison, a simple fabless foundry deck for 90nm contained about 1,200 checks, while a 180nm deck had between 800 and 1,000. For 20nm, a deck runs between 3,000 and 7,000 checks.

“Determining which checks are critical to your design becomes tougher. And we are talking only about design rule checking. What’s happened now is that you’ve got DFM checking, litho checking, etc., and the real issue is you need to do all that but you still need to get your tapeout into the fab and yielding silicon by Christmas,” said Michael Buehler-Garcia, director of marketing for Calibre Design Solutions at Mentor Graphics. “Yes, the complexity is there. And yes, we think we are handling it. But how do you present all of this to the designer so he can get his tapeout done on time and make the right choices? One of the big discussions we’ve had is that one size doesn’t fit everything anymore because there are the basic rules. If you’re not pushing antenna rules, if you’re not pushing things, then you can relax. But how do I know which one to trade off? And oh, by the way, I still have to get the thing out in time.”
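The triage problem described above can be pictured as a simple filtering pass over a rule deck. This is a hypothetical sketch only: the rule names, categories, and criticality flags are invented for illustration and do not correspond to any real foundry deck or Calibre feature.

```python
# Hypothetical sketch of triaging a rule deck by criticality: with
# thousands of checks at 28nm and below, teams decide which rules must
# run in every iteration and which can wait for a full signoff run.
# All rule names, categories, and flags are invented for illustration.

rules = [
    {"name": "M1.S.1",  "category": "spacing", "critical": True},
    {"name": "ANT.2",   "category": "antenna", "critical": False},
    {"name": "VIA.E.3", "category": "via",     "critical": True},
    {"name": "LITHO.7", "category": "litho",   "critical": False},
]

def triage(rules):
    """Split a deck into rules for the fast inner loop and rules that
    can be deferred to a full signoff run."""
    fast = [r["name"] for r in rules if r["critical"]]
    deferred = [r["name"] for r in rules if not r["critical"]]
    return fast, deferred

fast, deferred = triage(rules)
print("inner loop:", fast)    # ['M1.S.1', 'VIA.E.3']
print("deferred:", deferred)  # ['ANT.2', 'LITHO.7']
```

The point of the sketch is the workflow, not the data model: the designer runs a small critical subset on every pass and pays for the full deck only at signoff.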

Consumer demand also plays a role in management of physical effects. Consumers want cool new applications and gadgets on their smartphones, which requires analog/mixed-signal content such as radios, sensors, I/O, controllers and power management. The following chart from International Business Strategies shows the percentage of designs that start with analog/mixed-signal IP and how fast this has grown. (See Fig. 1)

Fig. 1: Percentage of design starts containing analog/mixed-signal IP. (Source: International Business Strategies)

Traditionally, analog/mixed-signal designs have tended to run on older manufacturing technology nodes. That’s not true anymore, said Buehler-Garcia. “The thing that has really changed is the ramp for a technology. It used to be that you’d enter a new technology with digital and then a year to 18 months later, you’d start seeing mixed-signal designs with analog content. With the drivers for the semiconductor industry moving from desktop and servers (pure digital) to consumer products which need to have sensors, and radios and so forth on them, now we are seeing the very first designs for every technology having analog content on them, which changes what the foundry has to focus on; it changes the whole ecosystem of what needs to be delivered.”

Analog developers historically stayed away from the leading edge. Many fabs still use processes as old as 250nm, although the core market has shifted to the 130nm to 180nm range. But with CMOS RF, companies can now build those designs on 65nm processes and below, which is a massive technology jump. That puts increased pressure on the foundries, which are having trouble meeting those demands in terms of the accuracy of their PDKs and the types of devices they make. For these reasons, the EDA industry works very closely with the foundries on the development of reference flows and tools.

So are today’s design and verification tools keeping up? The answer is mixed.

Yes, in that process technology continues to advance, and designers want to use the advanced technology the foundries provide because it brings a lot of benefits. “If the tools don’t work, the designers somehow make the tools work using work-arounds and different ways to handle that. On the other hand, the EDA guys try to improve the tools to meet the challenges,” said Apache’s Yang. “On the other side, no, I don’t think our tools can keep up with the advanced technology as much as the designers want. The physical effects are very much process-dependent. If you don’t keep up, well, then you may be behind.”

Another serious consideration is the impact of physical effects on power in the design. “Power is very much related to the physical effects. When people didn’t care too much about power in the early days, they could give 2.5 volts as the operating voltage. That made the margin huge, based on the transistor’s threshold, giving around 1 volt to play with. With today’s operating voltages at 1.0, 0.9 or 0.8 volts, your margin is in the millivolt range. Because the margin is so small…if you don’t model and simulate it well, the noise margin can be either overestimated or underestimated,” he pointed out.
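The shrinking margin Yang describes is easy to quantify with back-of-the-envelope arithmetic. The threshold voltages below are assumed for illustration only; they are not taken from any specific process.

```python
# Back-of-the-envelope noise-margin arithmetic: as the supply drops from
# 2.5 V toward 0.8-1.0 V, the headroom above the transistor threshold
# collapses. Threshold values are assumed, not from any real process.

def headroom_mv(vdd_v, vth_v):
    """Voltage headroom between supply and threshold, in millivolts."""
    return (vdd_v - vth_v) * 1000.0

# Older process: 2.5 V supply, ~0.7 V threshold (assumed)
print(f"{headroom_mv(2.5, 0.7):.0f} mV")  # 1800 mV of room
# Modern node: 0.9 V supply, ~0.4 V threshold (assumed)
print(f"{headroom_mv(0.9, 0.4):.0f} mV")  # 500 mV of room

# If design margins consume, say, 90% of that room (assumed fraction),
# the remaining noise budget at 0.9 V is in the tens of millivolts:
print(f"{headroom_mv(0.9, 0.4) * 0.1:.0f} mV budget")  # 50 mV budget
```

With only tens of millivolts of budget, small modeling errors in substrate or via resistance translate directly into over- or under-estimated noise margins, which is exactly the simulation accuracy problem Yang raises.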

Multiple power domains are a very big issue to contend with. Some 30% to 35% of designs today use multiple power domains, said Dave Desharnais, group director of implementation and analysis/verification product management at Cadence Design Systems. “When you start messing around with models and abstractions, it is very, very important you have a way of modeling power domains in a very clean way so it carries the information along. Does it impact power? You wouldn’t be able to manage your power effectively unless you had some clean way of representing it.”

Mentor’s Buehler-Garcia believes more discussion on power will occur over the next year or so. “If I’m trying to adjust power down at the physical layout level, then I can fix 10% to 15% of the problem. If I’m thinking about it at the architectural level, then I have a much better ability to impact it. But if my architectural level doesn’t understand the complexities and nuances, then that hypothesis is shot. How do we get fast prototyping? How do we work with an Atrenta-type offering where it can quickly go down to Calibre and say, ‘Yep, that looks good,’ and pop back up?”

The importance of power
Power is an effect that now has to be dealt with at every process node, whether dynamic power or static leakage, noted Saleem Haider, senior director of marketing for physical design and DFM at Synopsys. Power has been one of the key drivers behind the different gate manufacturing technologies, and on the tools side, the efforts have focused on meeting specifications for both leakage and timing.

“From the design side, for most designs, timing is paramount,” said Haider. “If there is a certain spec for how fast the device has to run, that’s always been king. But now, with mobile devices and so forth, it’s not clear that’s always the case. We have some customers that will take a hit on timing if they can reduce their leakage power a little bit more. The tradeoff on the tool side for us has been doing everything we can to honor the timing spec while at the same time adding new algorithms to drive down leakage.”
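One common form of the "honor timing, then drive down leakage" tradeoff Haider describes is multi-threshold cell swapping. The sketch below is a toy illustration under stated assumptions: the cell flavors, delay and leakage numbers are invented, and real implementation tools use far more sophisticated algorithms.

```python
# Toy sketch of the timing-vs-leakage tradeoff: honor the timing spec
# first, then pick the lowest-leakage cell variant whose extra delay
# still fits in the path's slack. All cell flavors, delay and leakage
# numbers are invented for illustration.

# (delay_ns, leakage_nW) for three threshold flavors of one cell
VARIANTS = {
    "lvt": (0.10, 50.0),  # low threshold: fast but leaky
    "svt": (0.13, 20.0),
    "hvt": (0.17, 5.0),   # high threshold: slow but low-leakage
}
BASE_DELAY_NS = 0.10      # delay of the fastest (lvt) flavor

def pick_variant(path_slack_ns):
    """Return the lowest-leakage flavor whose added delay fits the slack."""
    by_leakage = sorted(VARIANTS.items(), key=lambda kv: kv[1][1])
    for name, (delay_ns, _leak) in by_leakage:
        if delay_ns - BASE_DELAY_NS <= path_slack_ns:
            return name
    return "lvt"  # no slack to spare: keep the fast, leaky flavor

print(pick_variant(0.10))  # plenty of slack -> 'hvt'
print(pick_variant(0.04))  # some slack      -> 'svt'
print(pick_variant(0.00))  # none            -> 'lvt'
```

The design choice mirrors the quote: the timing spec is a hard constraint, and leakage is recovered only from the slack left over after timing is met.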


