Plugging Gaps In Advanced Packaging

Design-packaging-board flow getting more attention as multi-chip solutions proliferate.


The growing difficulty of cramming more features into an SoC is driving the entire chip industry to consider new packaging options, whether that is a more complex, integrated SoC or some type of advanced packaging that includes multiple chips.

Most of the work done in this area so far has been highly customized. But as advanced packaging heads into the mainstream, gaps are beginning to appear.

“IC packaging has made remarkable progress in the last 10 years, but the complexities that IC packaging can handle today are fairly limited,” said Herb Reiter, president of eda2asic Consulting. “As we are seeing more of the value creation shifting to the package and the cooperation between package and silicon, we are running into a big headache because there is no such thing yet as a die-package co-design flow. It’s very difficult to feed information from the silicon world into the packaging world, and even more difficult to feed information from the packaging world back into the silicon world to co-optimize these two domains.”


Fig. 1: More than Moore. Source: Cadence

To be successful, packaging has to deliver power, performance and area (PPA) benefits close to those of Moore’s Law scaling, with the added advantage of faster time to market.

“But as engineering teams are putting these multi-die heterogeneous designs together, they are struggling more and more as they pack more content into a package,” said Keith Felton, product marketing manager at Mentor, a Siemens Business. “On paper it makes a lot of sense. You can mix technology nodes, processes, you could put together disparate heterogeneous chips of different functions, combine them with interposers into a package, and then you basically have a very nice functioning subsystem that is almost as efficient as if you could do it in a single SoC. Theoretically it’s close to SoC performance with much lower cost and risk. The area is a little bit more, it is a little bit fatter in height, but they are doing it today.”

And the idea is catching on everywhere. “Even Intel got off the ‘classical’ Moore’s Law algorithm of smaller-smaller-smaller because the stuff just didn’t work,” said David Park, vice president of worldwide marketing at Optimal+. “What they did is they started going to multi-chip packages, multi-chip modules, because it’s much easier than trying to put everything on say 28 nm. ‘The digital logic we will put on 28nm, but let’s leave the analog stuff at 45nm because that works well, and we’ll connect them on a substrate.’ That’s how people have kind of gotten around the Moore’s Law thing — they are sticking multiple pieces of silicon together through some sort of interposer. That’s how they are combining small feature sizes where it’s beneficial and bigger feature sizes where that provides a higher level of robustness or quality.”


Fig. 2: Intel’s EMIB approach. Source: Intel

The integration gap
While the detail design and layout part of the process is fairly well understood, it’s not so easy to put chips together with an interposer and to figure out the right way to connect everything. “Typically that was done in the past on paper and the famous bump ball spreadsheet that everybody talks about, which is synonymous with this process,” said Felton. “But that’s static documentation. These are not really design tools. They don’t help you look at tradeoffs. They don’t give you any feedback as to whether you’ve connected things incorrectly. They merely document what you’ve done in order for you to provide it to someone else, so they are a static piece of documentation and that really isn’t working. While it may look good, it may not function from a signal integrity point of view or from a thermal analysis point of view.”
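To make that point concrete, consider what even a trivial scripted check can do that a static bump/ball spreadsheet cannot: flag die signals that end up mapped to more than one package net, or bumps left with no connection at all. The sketch below is purely illustrative; the CSV columns and row values are invented for the example and do not correspond to any particular tool’s format.

```python
# Illustrative sketch of a "live" consistency check on a hypothetical bump map.
# A static spreadsheet only records these rows; it cannot flag the problems below.
import csv, io
from collections import defaultdict

bump_csv = """bump,die_signal,package_net
A1,clk_in,PKG_CLK
A2,dq0,PKG_DQ0
A3,dq0,PKG_DQ1
B1,vss,
"""

nets_per_signal = defaultdict(set)
unconnected = []
for row in csv.DictReader(io.StringIO(bump_csv)):
    if not row["package_net"]:
        unconnected.append(row["bump"])          # bump with no package connection
    else:
        nets_per_signal[row["die_signal"]].add(row["package_net"])

# A die signal landing on two different package nets is a likely data-entry
# error (and would short those nets together through the die).
conflicts = {sig: nets for sig, nets in nets_per_signal.items() if len(nets) > 1}
print("die signals mapped to multiple package nets:", conflicts)
print("bumps with no package connection:", unconnected)
```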

Verifying the design once the physical layout is done adds another potential minefield. If a mistake is made in how things are connected logically at the beginning, discovering it during verification at the final design stage requires a massive amount of rework. “This can be very difficult if you have a locked-in time with your foundry or your OSAT,” said Felton. “You’re actually going to miss a slot in their production schedule.”

To avoid these problems, some design teams have started focusing on creating upfront prototypes of the design at a system level—basically creating a digital model of the entire design. That allows them to trace connectivity from the top die all the way through to either the package ball or even onto the printed circuit board, as well as to make design changes from that model, which can then be sent to simulation tools. But combining data from many sources to build a logical prototype of the entire package assembly is not a trivial exercise.
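A rough sense of what such a digital prototype enables: once the assembly is represented as a connectivity model, tracing a net from a die pad through the interposer to a package ball, or on to the board, becomes a simple graph traversal. The sketch below uses a hypothetical, hand-built edge list; a real prototype would be generated from netlists and package databases.

```python
# Minimal sketch (not a real tool flow): model the package assembly as a
# connectivity graph and trace a net from a die pad down to a package ball.
# All node names and edges are invented placeholders.
from collections import defaultdict

def trace(graph, start):
    """Return every node electrically reachable from 'start'."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        stack.extend(graph[node])
    return seen

# Hypothetical path: die pad -> microbump -> interposer route -> package ball -> PCB pin
edges = [
    ("die0.DDR_DQ0", "ubump_A12"),
    ("ubump_A12", "interposer.net_dq0"),
    ("interposer.net_dq0", "pkg_ball_B3"),
    ("pkg_ball_B3", "pcb.U1_B3"),
]
graph = defaultdict(list)
for a, b in edges:
    graph[a].append(b)
    graph[b].append(a)

path = trace(graph, "die0.DDR_DQ0")
assert "pkg_ball_B3" in path   # the die signal reaches a package ball
print(sorted(path))
```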

“Even if you are a fabless semiconductor company you’re not designing all the chips that are going into the device,” said Felton. “You’re probably getting a memory stack or a memory cube from someone, you are maybe using the processor you’re designing yourself or a microcontroller, but then you are using other off-the-shelf devices. Usually you need things like a Verilog netlist and you need some form of footprint model of the other die, which is normally supplied as GDS for the physical footprint. You probably get a SPICE netlist or a Verilog top level netlist, but you have to stitch all of that together and make sure that you have actually stitched it all together correctly.”
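One small example of the stitching problem Felton describes is checking that every top-level port in a die’s Verilog stub actually has a bump assignment in the footprint data. The regex-based port extraction and the bump_map structure below are simplifying assumptions made for illustration, not a substitute for a real netlist reader or GDS extraction.

```python
# Illustrative cross-check between a die's Verilog stub and hypothetical bump data.
import re

verilog_stub = """
module hbm_phy (input  wire        clk,
                input  wire [63:0] dq_in,
                output wire [63:0] dq_out,
                output wire        ready);
endmodule
"""

# Extract top-level port names from the stub (toy regex "parser")
ports = set(re.findall(r"(?:input|output)\s+wire\s*(?:\[[^\]]*\]\s*)?(\w+)", verilog_stub))

# Hypothetical footprint data: bump name -> die signal it lands on
bump_map = {"A1": "clk", "A2": "dq_in", "B1": "dq_out"}   # 'ready' is missing

unassigned = ports - set(bump_map.values())
if unassigned:
    print("Ports with no bump assignment:", sorted(unassigned))
```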

Further, a layout-versus-schematic (LVS) validation must be run to check this physical digital model against the logical models of the individual devices being pulled together.

Once a model has been created and verified to be logically correct, it’s possible to start making tradeoffs about how dies should be packaged: side by side or stacked vertically, and what type of package and interconnect should be used.

“Some people try to use schematic capture,” said Felton. “They try to draw symbols to represent the die and they try to import the Verilog netlist to be the body of the die. Then they connect external ports onto that. You’re bringing in another level of risk of connecting something incorrectly. It is extremely risky. The benefit might be that you can pictorially see this thing as a schematic, which can be very useful for engineers. But pulling together the disparate pieces of data, connecting them correctly and then driving them forward from there carries a huge amount of risk. Very often they find that when they get into physical design they have an LVS error, so they have to go back and try to debug exactly where that error comes from. Was it an incorrect Verilog netlist that came in originally, or did the Verilog ports get mapped incorrectly when the schematic was built? People have tried to use schematics, but a schematic doesn’t provide any physical understanding of how the thing is going to look in 3-D because it is flat and static.”

The human factor
Convincing design engineers to use a new methodology isn’t easy, either.

“If you looked at our customers today, I would say 30% are trying to [use a new methodology] but it’s a fundamental change in design process and design flows,” Felton noted. “First of all, you’ve got to get these designers to think that way, and that can take time. They know how to do something already. They know it’s not ideal, but it works. And they know that anytime they have to change, it just brings more risk, so they push back until they dial it in or have a catastrophic failure.”

It’s also the case that design and packaging teams often report to different parts of the organization. Often, teams are artificially pulled into one group, which doesn’t always work well because the people in that group frequently are geographically dispersed around the globe. The key here is being able to pass design changes and models back and forth securely.

The way that a lot of companies will be able to adopt this kind of methodology is not by forcing their teams to co-locate or sit in the same management hierarchy, but by defining a much better, more intelligent data exchange process between them. And that really comes down to product lifecycle management of all the data.

“On the IC side on the SoC design spectrum, we like to say it’s one big happy family,” said John Ferguson, technical marketing engineer at Mentor. “But in reality it is not. There are a lot of communication issues there. We’ve got some design teams that will be working on different blocks of IP, and at the same time somebody else is trying to work on how those IPs get connected together into the SoC. And even with the best of intentions, it never goes 100% right, largely because they are autonomous teams. They kind of come together in the beginning, they work out a spec, everybody seems to be on board and thinks they are doing what they agreed to. But then you find out after the fact that there was some miscommunication. I don’t think it will ever really change. There will always be a discontinuity between who does what, who’s responsible for what, and how it was communicated.”


Fig. 3: Existing flow for advanced packaging. Source: Cadence

And it gets even harder as advanced packaging progresses.

“The bigger the project gets the harder it is to have people working together,” said Geoff Tate, CEO of Flex Logix. “A single chip design team can be hundreds of people, and the packaging people are probably on a different floor or in another building—or in another city. So the ability to work closely is going down as everything gets so big and so specialized. There’s always a benefit from functional teams working close together. It’s just hard to do because the teams get so big. When you have a building full of 500 people, you can’t have them all close together. Some of them are close together and the others are multiple floors apart. It’s just an organizational challenge you have with big teams.”

Still, change may be coming. John Park, product management director for IC packaging and cross-platform solutions at Cadence, said chip design teams, package design teams and even board design teams have started to work more closely. He said this has been true at large semiconductor companies for the past seven or eight years.

What they have mostly been doing up until the very recent past is working together without EDA tools, sharing Microsoft Visio drawings, PowerPoint slides, Excel spreadsheets, emails and whiteboard drawings in informal co-design flows.

“But at least they recognize the fact that they can’t just design a chip, throw it over the wall to the package design team and have them deal with any issues, and then throw that over the wall to the board design team,” Park said. “All the big boys have realized that doesn’t fly anymore. In fact, Intel did a paper at a conference about seven or eight years ago that said they were using that approach, and it led to their package cost being higher than their chip cost. Obviously that killed the project. Intel has been one of the early adopters of this trend, which many in the industry call ‘pathfinding,’ whereby the chip people, the package people and even the board people start working early in the process on the chip, simultaneously planning out what package technology should be targeted.”

Looking ahead, the interest in and adoption of 3D stacking certainly will add value to the semiconductor system equation and provide opportunities for EDA innovation including the ability to do intelligent partitioning across the 3D stack, Park said. “It’s not just for planning a single chip at a single technology. It’s being able to do some intelligent floorplanning of three chips in a stack to figure out which block should be located on which chip in the stack, based on how much heat it generates, on electrical performance, etc. That’s an area where there is value. It’s almost like bringing in an RTL description of the chip, but being able to target two chips or three chips that reside in each vertical stack.”
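A toy illustration of the kind of partitioning Park describes is assigning functional blocks to one of three stacked dies while balancing estimated power, so that no single die becomes a thermal hot spot. The block names and power numbers below are invented, and a real flow would also weigh connectivity, area and process node; the sketch only shows the shape of the tradeoff.

```python
# Hypothetical greedy partitioning of blocks across a 3-die stack by power budget.
blocks = {"cpu_cluster": 4.0, "gpu": 6.0, "modem": 2.5, "npu": 3.0,
          "sram_cache": 1.5, "io_phy": 2.0}   # estimated watts, invented values

num_dies = 3
die_power = [0.0] * num_dies
assignment = {}

# Greedy heuristic: place the hottest blocks first, each on the coolest die so far
for block, power in sorted(blocks.items(), key=lambda kv: kv[1], reverse=True):
    die = min(range(num_dies), key=lambda d: die_power[d])
    assignment[block] = die
    die_power[die] += power

for die in range(num_dies):
    members = [b for b, d in assignment.items() if d == die]
    print(f"die {die}: {die_power[die]:.1f} W -> {members}")
```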



