First of three parts: Accuracy vs. speed, advanced device parasitics, counting attofarads, plotting contact capacitance.
By Ed Sperling
Low-Power Engineering sat down to discuss parasitic extraction with Robert Hoogenstryd, director of marketing for design analysis and signoff at Synopsys, and Carey Robertson, product marketing director for Calibre Design Solutions at Mentor Graphics. What follows are excerpts of that conversation.
LPE: As we move into 32/28nm, are the parasitics getting worse, and is it getting harder to extract and model them?
Hoogenstryd: At a high level, it’s more of the same. It’s getting harder and more detailed with every generation, but the challenges are no more difficult than what we had to address at 65nm and 45nm. It’s those second- and third-order effects coming into play that have to be modeled. There’s a little more complexity in the structures the foundries are trying to make at these geometries, which leads to having to model things that may not have existed in those structures in previous technology generations. One thing we are seeing that’s new is more interaction between the devices and the interconnect, which has to be modeled. Those are often called advanced device parasitics. Effects that used to be built into the SPICE models, based on high-level device parameters, are now being modeled directly through extraction rather than living in the SPICE model. We’re seeing more of that from generation to generation.
Robertson: The interconnect lines are getting smaller and smaller, and now every effect is important. Previously we ignored effects around the device and they were lost in the wash. Now customers are concerned about every attofarad. What is the contact capacitance? What is the fringing capacitance of the gate? We’re trying to understand that a lot better, not necessarily modeling it in the device and then separating it out. Several technology generations ago, anything under a femtofarad was noise. Now customers are looking at tens or hundreds of attofarads to characterize their standard cells, SRAMs and analog IP very accurately. They’re looking at every little effect. In addition to looking at total capacitance values, they’re pushing very hard on coupling accuracy. They want to do timing as well as signal integrity and impedance matching, which means every component is now critical. The other thing we’re wrestling with is new devices. FinFETs have much different cross sections than traditional MOSFETs. We’re looking at what’s going to be in the SPICE model versus in the extracted parasitics, and those are things we didn’t have to consider before.
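To put a number on “every attofarad,” here is a back-of-the-envelope sketch (not any vendor’s extraction engine) of a contact-to-gate capacitance using an ideal parallel-plate formula. The dimensions and dielectric are illustrative assumptions, not foundry data:

```python
# Rough order-of-magnitude estimate of a contact-to-gate capacitance.
# All dimensions are illustrative assumptions, not foundry data.

EPS0 = 8.854e-12   # vacuum permittivity, F/m
K_OX = 3.9         # relative permittivity of SiO2 (assumed dielectric)

def parallel_plate_cap(area_m2, spacing_m, k=K_OX):
    """Ideal parallel-plate capacitance in farads (fringing ignored)."""
    return k * EPS0 * area_m2 / spacing_m

# Hypothetical contact face: 32nm x 32nm, 20nm from the gate sidewall.
side = 32e-9
cap = parallel_plate_cap(side * side, 20e-9)
print(f"contact-to-gate estimate: {cap * 1e18:.2f} aF")  # ~1.8 aF per face
```

A handful of such single-digit-attofarad components per transistor, summed across a cell, is exactly the tens-to-hundreds of attofarads Robertson describes.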
LPE: Is there too much data to deal with effectively?
Hoogenstryd: Yes. It’s not specific to 32/28nm or 22nm, though. We’ve been wrestling with this at every node. In my opinion, extraction is a means to an end. It’s a way to model the interconnect for analysis. That’s really the end goal. The customer needs to analyze their design accurately. The challenge is that they want to be able to do it in their lifetime. So you have to model the interconnect accurately and efficiently. At Synopsys, we’ve been wrestling with the right amount of detail for a specific problem, and then providing techniques to extract the right information, provide only the level of detail that’s needed, and take advantage of all the tools that are out there. Take rail analysis, for example. In rail analysis what’s important is that you model the capacitance on the signals and the resistance on the power rails. Those become the dominant factors. We saw situations where customers would try to extract everything on the signals and the power rails with brute force. You can’t simulate it. The solution is to extract the right level of detail from those structures, which might seem very similar, so you can drive an efficient analysis. Chasing the tenth decimal place doesn’t buy you much when the goal is accurately modeling the current through the power rails.
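A minimal sketch of the rail-analysis point: once rail resistance and tap currents are extracted, a first-order IR-drop number falls out of a one-dimensional sum. The segment resistances and tap currents below are made-up illustrative values:

```python
# First-order IR-drop along a power rail: resistance on the rail and
# current per tap dominate; exhaustive coupling data adds little here.

segment_r = [0.05, 0.05, 0.05, 0.05]      # ohms per rail segment, pad -> far end
tap_i     = [0.002, 0.004, 0.001, 0.003]  # amps drawn at the node after each segment

drop = 0.0
downstream = sum(tap_i)
for r, i in zip(segment_r, tap_i):
    # Each segment carries the current of every tap downstream of it.
    drop += r * downstream
    downstream -= i
print(f"IR drop at far end: {drop * 1000:.2f} mV")  # 1.25 mV for these values
```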
Robertson: There are different levels of abstraction. If you’re just worried about timing signoff, there’s more data, so you reduce it and you get accurate R’s and C’s and the smallest netlist possible to do simulation. But that alone isn’t what customers are looking for. They’re doing timing signoff, signal integrity signoff and power analysis. There is more data, but just reducing it to deal with higher levels of abstraction is not enough. You need to compartmentalize the problem. There is a timing signoff flow that employs very heavy RC reduction for total capacitances. But there’s also signal integrity, where that’s not going to work. You need to preserve coupling capacitances, and then you definitely need a different reduction paradigm to still achieve signal integrity. If you’re looking at power analysis or electromigration, the reduction techniques that were used for timing analysis don’t apply, because you need to find the current density for every segment. There are not only more elements, but more information needed to do some of these analysis tasks. You’ve got different flows downstream of extraction, and that drives different extraction needs, whether it’s for timing, current density or corners. The information has to be dealt with differently, depending upon the end goal, but it is still a means to an end. It tells you more about your circuit.
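To illustrate why one reduction doesn’t fit all, here is a toy sketch (illustrative values, not a real reduction algorithm): the same coupling capacitor is folded to ground for a timing netlist, but must stay separate, often Miller-scaled, for signal-integrity checks:

```python
# One coupling capacitor, two downstream views of it.

c_ground   = 12e-15   # net's capacitance to ground, farads
c_coupling = 5e-15    # capacitance to an aggressor net, farads

# Timing-style reduction: fold the coupling into a single grounded cap.
c_total_timing = c_ground + c_coupling

# SI-aware view: an aggressor switching opposite the victim can make the
# coupling look up to 2x larger (Miller factor), so it must stay separate.
miller = 2.0
c_effective_si = c_ground + miller * c_coupling

print(f"timing netlist sees {c_total_timing * 1e15:.0f} fF")   # 17 fF
print(f"SI worst case sees  {c_effective_si * 1e15:.0f} fF")   # 22 fF
```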
LPE: What you’re talking about is divide-and-conquer and layering strategies. Does that cause problems later on when you have to integrate all of this stuff back together?
Robertson: Absolutely. One of the things our customers are trying to avoid is double counting, which is a classic problem in parasitic extraction. We have this paradigm of building up, then trying to do some analysis at a higher level without double counting or missing effects inside. We’ve gotten more sophisticated. We have schematic simulation with estimates of post-layout effects in our device models. But when you get to complete post-layout, how do you turn that off effectively without missing anything or double counting? That’s going from device model to parasitic model, and then from the parasitic model of the transistor to a cell model and higher and higher levels. As people go from blocks to full chips to signoff, identifying what you captured the first time and how you account for the new effects is very, very difficult. The electric field doesn’t operate hierarchically. It permeates those boundaries, and understanding white-box, gray-box and black-box methodologies to get accurate simulation is very, very tricky.
Hoogenstryd: I agree.
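A toy illustration of the double-counting hazard Robertson describes (the function and numbers are hypothetical, not any tool’s API): a device model that carried a pre-layout estimate of its wiring load must have that estimate backed out once extracted parasitics are annotated:

```python
def annotated_load(model_cap, est_layout_cap, extracted_cap=None):
    """Capacitance the simulator should see on a pin (farads)."""
    if extracted_cap is None:
        # Pre-layout: the schematic estimate stands in for real wiring.
        return model_cap + est_layout_cap
    # Post-layout: use the extracted wire and *remove* the estimate;
    # otherwise the same wiring is counted twice.
    return model_cap + extracted_cap

print(annotated_load(0.8e-15, 0.5e-15))            # pre-layout view: 1.3 fF
print(annotated_load(0.8e-15, 0.5e-15, 0.9e-15))   # post-layout view: 1.7 fF
```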
LPE: Is the barrier to entry in this space getting higher for independent tool vendors?
Robertson: My biased answer is yes. Field solvers such as (Magma’s) QuickCap and Raphael have had difficulty in the past playing in the design flow space. Not only do you need accurate R’s and C’s, but extraction is a means to an end. The parasitics need to be hooked up to the device model, they need to avoid double counting, and they need to work hierarchically and flat. That means they need to work with an LVS infrastructure and a place-and-route infrastructure, leading into static timing or SPICE-like flows. It’s not enough to do R’s and C’s. You have to provide a circuit model to the downstream simulation tools, which means you have some integration to design environments, device extraction flows and simulation.
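As a concrete picture of “provide a circuit model to the downstream simulation tools,” this sketch emits a minimal DSPF-flavored netlist fragment for one hypothetical net. Real flows use full SPEF/DSPF writers tied into LVS and timing; the format details here are simplified for illustration:

```python
def write_net(name, ground_caps, resistors):
    """Emit a simplified, DSPF-flavored fragment for one net."""
    lines = [f"*|NET {name}"]
    for i, (node, cap_f) in enumerate(ground_caps, 1):
        lines.append(f"C{i} {node} 0 {cap_f:.3e}")   # capacitance to ground, farads
    for i, (n1, n2, ohms) in enumerate(resistors, 1):
        lines.append(f"R{i} {n1} {n2} {ohms:.3f}")   # wire resistance, ohms
    return "\n".join(lines)

print(write_net("clk_net",
                ground_caps=[("clk_net:1", 2.0e-15), ("clk_net:2", 1.5e-15)],
                resistors=[("clk_net:1", "clk_net:2", 3.2)]))
```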
Hoogenstryd: The challenge for any startup is that it’s easy to focus on a niche but tough to grow beyond it. It’s easy to attack a small problem or application and be very good at it, but to handle the next application and tool flow takes a lot of effort. If an extraction company doesn’t think about the analysis at the end, and is just focused on the most accurate R’s and C’s, then it’s going to have a problem.
Robertson: Historically the market has answered this question. Name the standalone extraction tools that have survived. It’s miserable as a business. Simplex did well because it was tied to VoltageStorm and some of its analysis. It was a means to an end. But standalone extraction entities like Ultima or Sequence couldn’t survive. Even QuickCap, the industry leader for reference software, has done okay in scientific communities but hasn’t gained much market share. And it’s even harder at the newer nodes.
LPE: As we move into power islands and multiple states, does it get harder to build a model?
Robertson: From a parasitic extraction standpoint, no. But extraction is a means to an end, so as a solutions provider the answer is yes. You can say it’s a simulator problem, but as a solutions provider we have to provide integration. It’s not just about R’s and C’s. We have to do something intelligent with them. It’s either integration with simulation or integration with verification. At different voltage nodes or in different power domains you potentially have more critical circuitry. You need higher-accuracy parasitics there, versus the digital domain where you may not need the coupling accuracy.
Hoogenstryd: We’ve had some customers say they want to do different extractions at different voltages. The real challenge is how you integrate that into your verification methodology. This is back-end performance verification to make sure you meet timing, power and signal integrity goals. People are doing more extraction and more analysis. They’re trying to be practical in their approaches. Some are doing it well; some are throwing up their hands and saying they have to run 25 corners times X number of modes. They run out of time. People are running more extraction across more environmental parameters and then trying to integrate that into the verification methodology.
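The arithmetic behind “25 corners times X number of modes” is a simple cross product, which is why run counts explode. The corner and mode names below are illustrative:

```python
from itertools import product

# Hypothetical extraction corners and operating modes.
corners = [f"corner_{i}" for i in range(25)]   # e.g., RC/temperature/voltage corners
modes = ["functional", "scan", "sleep", "turbo"]

runs = list(product(corners, modes))
print(f"{len(corners)} corners x {len(modes)} modes = {len(runs)} signoff runs")  # 100
```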