How Long Will FinFETs Last? (Part 3)

Experts at the table, part 3: Incorporating new techniques; gate-all-around FETs; cost challenges; EUV.


Semiconductor Engineering sat down to discuss how long FinFETs will last and where we will go next with Vassilios Gerousis, Distinguished Engineer at Cadence; Juan Rey, Sr. Director of Engineering for Calibre R&D at Mentor Graphics; Kelvin Low, Senior Director Foundry Marketing at Samsung; and Victor Moroz, Synopsys Scientist. What follows are excerpts of that conversation. For part one, click here. For part two, click here.

SE: How do new techniques get incorporated into the design tools themselves?

Rey: There are abstraction models that get captured into essentially geometric dimensions, vertical or horizontal, along with their variability in both directions. All of that gets translated, using a model, into what the other tools are going to do. There are tools used to measure and verify, and tools used to analyze and identify issues in the circuit, and that is what bridges the physical domain with the abstraction of the circuits. If we go back to what those [FinFET] devices are going to be, there were some changes with FinFETs, but the changes were not huge; they were add-ons, and they have been happening more and more. Look at how the technology has been evolving, and how that gets translated into the number of rules people need to apply when they are doing design and when they are doing verification: that number keeps growing exponentially. What are the alternatives here? If the alternatives are variations of the FinFET that progressively move toward gate-all-around, which seems to be where we are converging and is the most likely scenario, then when we eventually jump to vertical devices, several things are very likely going to change much more than the natural progression we have seen.

SE: Are the tools for Gate All Around going to be fundamentally different?

Rey: No, I don’t think the tools are going to be fundamentally different, but there will be very important new concepts that need to be incorporated, and those are very likely going to take much longer.

Gerousis: Again, it will be incremental changes.

Moroz: If you ask for a fundamentally different tool, that’s a problem, because how many years would it take to bring one to market? By then you are too late with your design, so it cannot work that way. There is an interdependence, so it has to be gradual and incremental: add new things one by one.

SE: At a certain point, won’t the tools need to be fundamentally re-architected to take full advantage of the process?

Moroz: Do you mean quantum computing?

SE: Sure. Or whatever it’s going to be. The tools that we use today can only be extended so much, right?

Moroz: If there is a breakthrough in how you do circuits, then yes. But so far, for the last 40 or 50 years, everything was incremental.

Rey: What Victor said is exactly right. If and when we jump, for at least some niches at this moment, to some of the quantum computing models, there you have some theories that go beyond the FinFET. But so far that type of model can apparently be applied only to some very specific problems, and will not replace general computing. So the niche areas seem to require, in some areas, some radically different tools, but not all of the tools. There are some new tools that need to be applied beyond what is ready for the FinFET.

Gerousis: Even for the evolution of manufacturing, it’s the same thing: you cannot just throw away everything you have learned about every one of the layers and do it all from scratch. It has to change gradually.

Low: For us it’s the billion-dollar investments. How do we make sure there’s tool reuse, and still get maximum yield?

Gerousis: FinFETs did provide very good performance; it’s easier now to close timing, much better than previous generations such as 20nm.

Moroz: Especially because of variability.

Gerousis: So it brings good stuff, plus the challenges, which we like as engineers.

Low: From the tool side, what we are seeing that’s really exciting is tool efficiency and reticle improvements. That helps with time to market.

SE: When you say ‘efficiency,’ what do you mean?

Low: Throughput and runtime. We saw drastic improvements from Mentor, Cadence and Synopsys — and that was important for customers to deliver their products on time.

Gerousis: Your next node brings you double the number of gates, bigger area, etc., so you will need some improvement in runtime. Not just in runtime, but also in how you do the design. The full design flow also needs to improve.

Low: This may not have to do with the process technology; the complexity of the design alone warrants it.

SE: Just looking at today, what are users experiencing as far as FinFET design challenges?

Low: I think if this discussion had happened two years ago, it would be different. Now, with more experience handling FinFETs, the process is more stable in the mass-production stage. There are still challenges. Maybe I would call them uncertainty, or lack of awareness: sometimes when you’re not familiar with something, you view it as a challenge.

Rey: I think most of the issues got resolved with the initial, early IP development, and it had a lot to do with the simultaneous introduction of FinFETs along with new technologies that are much more widely used, like double patterning and so on. That impacted several other things that people needed to address. Again, it was not something radically different; it was different in terms of a new breed of checks that had to be added. But then for the digital designers, the challenges seem to be much smaller in terms of growth and adoption.

Moroz: And people learn as they go. If you look at Intel, they now claim the FinFET process has the highest yield they have ever had, better than planar, which is surprising. And if you remember, there’s Koomey’s Law, with data going back to the 1940s, which is more general than Moore’s Law and tells you that people improve things as they go. And it applies totally to the semiconductor industry.
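Koomey’s observation can be made concrete with a back-of-the-envelope calculation. His published analysis found computations per joule doubling roughly every 1.57 years; treat that exact figure, and the two-year node cadence below, as illustrative assumptions rather than anything the panelists stated.

```python
# Assumed doubling period for computations per joule, per Koomey's
# historical analysis (roughly 1.57 years over 1946-2009 data).
DOUBLING_YEARS = 1.57

def efficiency_gain(years):
    """Factor by which energy efficiency improves over `years`,
    assuming a constant doubling period."""
    return 2 ** (years / DOUBLING_YEARS)

# Over a hypothetical two-year process-node cadence:
print(f"{efficiency_gain(2):.1f}x")  # ~2.4x more computations per joule
```

The point of the sketch is only that steady incremental learning compounds: a fixed doubling period turns into an exponential gain over any span of years.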

Gerousis: I believe from the customer’s adoption, from the tools, from the manufacturing side, all the challenges that we know of have been solved.

SE: Wow, that’s fantastic. So we’re done.

Moroz: We are never done.

Gerousis: What you get into at 14nm is totally different from what you get at 10nm, so the challenges will be totally different, and there’s a learning curve between 14nm and 10nm, like a 2X change, and that applies to manufacturing as well as tools.

Low: For 14nm, it’s about how to handle discrete device design. At 10nm there are certain effects, but there are also other process elements and enhancements that we are doing, beyond just the FinFET, to help the entire process flow. That could translate into new requirements on the EDA tools. Simple things like more design rules will be introduced.

Moroz: Again?

Low: Also, parasitics — that is a major effort on all sides.

Rey: That’s where multi-patterning is going to be more challenging. There I would expect the early designers are going to bring us interesting challenges.

Low: And as long as extreme ultraviolet (EUV) lithography doesn’t come online, we have to innovate in what we do from a lithographic point of view, based on available technologies.

Moroz: Can you say octuple?

Low: There’s another very important angle. The litho engineers can always do more patterning, and then variability becomes a big problem, but the more important problem is cost. Customers are not waiting for EUV, so what else can you do? Design tradeoffs, design rule restrictions. These days I realize that cost and time are king; R&D will be bounded by them.

Moroz: But the cost is not as much to do with FinFETs — it’s totally driven by back-end-of-line.

Rey: Once you start going beyond double patterning to triple and quadruple patterning, and we’ve been seeing a lot of five-exposure schemes as well, some of those few most critical layers become very expensive, too.
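Multi-patterning decomposition is commonly modeled as graph coloring: two features drawn closer together than the single-exposure pitch get an edge in a conflict graph, and each mask is a color. A minimal sketch of the double-patterning (two-mask) case, with hypothetical feature names, is just a bipartiteness check; an odd cycle of conflicts is exactly what forces a third mask, or a layout change.

```python
from collections import deque

def decompose_two_masks(conflict_edges, features):
    """Assign each feature to mask 0 or 1 so that no two conflicting
    features (those closer than the single-exposure pitch) share a
    mask. Returns the assignment, or None if the conflict graph is
    not 2-colorable, i.e., double patterning alone cannot print it."""
    adj = {f: [] for f in features}
    for a, b in conflict_edges:
        adj[a].append(b)
        adj[b].append(a)
    color = {}
    for start in features:
        if start in color:
            continue
        color[start] = 0
        queue = deque([start])
        while queue:  # BFS, alternating masks along conflict edges
            u = queue.popleft()
            for v in adj[u]:
                if v not in color:
                    color[v] = 1 - color[u]
                    queue.append(v)
                elif color[v] == color[u]:
                    return None  # odd cycle: needs a third mask
    return color

# A 4-cycle of conflicts splits cleanly across two masks...
print(decompose_two_masks([("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")],
                          ["a", "b", "c", "d"]))
# ...but a triangle of conflicts cannot: two of the three features
# would have to share a mask.
print(decompose_two_masks([("a", "b"), ("b", "c"), ("c", "a")],
                          ["a", "b", "c"]))  # prints None
```

Production decomposition tools handle far more than this (stitching, density balancing, triple/quadruple coloring, which is NP-hard in general), but the sketch shows why each added mask multiplies both the rule count and the verification burden the panelists describe.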

Low: How you manage variation changes, and signoff corners change. That’s why the tools become important again: handling more than just double patterning.


Kev says:

“Gerousis: I believe from the customer’s adoption, from
the tools, from the manufacturing side, all the challenges that we know
of have been solved”

There used to be a Cadence employee who turned up at the Accellera Verilog-AMS committee and would say there were no problems because the customers weren’t complaining. However, the customers I talked to were anything but happy.

Denial is the default option for people out of ideas, and the idea that a design flow that worked in 1990 is going to work as well in 2016 is deluded. For an analogy, you can consider the design methodology to be like DOS: at some point you need to transition to Windows-NT, but so far it has just been patched up.

The cure for variability in digital design is to move to an asynchronous methodology, but nobody has that (it’s the “Windows-NT”), and it was conspicuous by its absence in Synopsys’s recent rewrite of DC.
