3D DRAM Makers Inch Closer To Production

Mass-market devices are still at least a year away, but big-name supporters and advantages in power and speed point to a big uptick; most technical hurdles have been solved.


By Mark LaPedus
DRAM makers have been developing 3D memory chips for some time, but technical and cost issues mean commercial products are still some way off.

But the 3D DRAM era could be nearing a turning point, as two memory rivals have separately moved to bring their respective technologies closer to production. In one move, Micron Technology Inc. has disclosed the manufacturing flow for its recently announced Hybrid Memory Cube (HMC) technology, a 3D DRAM scheme geared for high-end servers and networking systems. Under the plan, IBM will manufacture the controller logic portion of the HMC in its own fab, while Micron will make the memory portions and will assemble and test the HMC devices within its own operations.

On another and more surprising front, Japanese DRAM maker Elpida Memory apparently has beaten its larger rivals to the punch by announcing the industry’s first commercial Wide I/O DRAMs. Elpida’s first device, dubbed Wide IO Mobile RAM, is a 4-Gbit part based on a 30nm process technology and a 3D structure using through-silicon vias (TSVs). Elpida plans to sample its first Wide I/O DRAM devices this month. The devices are geared for next-generation smartphones and tablets.

Samsung Electronics Co. Ltd. and Hynix Semiconductor Inc. are also separately developing 3D DRAMs. The idea behind a 3D device is to stack existing die and connect them with TSVs, shortening the interconnects, lowering resistance and boosting bandwidth. But 3D devices based on TSVs still face cost, technical and supply-chain headaches.

“There is a lot of attention and engineering resources being thrown at 3D right now by all DRAM developers, including Samsung, Micron, Elpida, and Hynix,” said Mike Howard, senior principal analyst for DRAM and memory at IHS iSuppli. “Wide I/O has yet to really reach a cost level that makes it competitive and we are likely still a few years away from mass adoption. Elpida may very well have a functioning part in the lab and may be able to produce test samples, but I think we’re still a few years away from this being used in anything but the most premium markets.”

Hank Lai, who handles product planning for memory marketing at Samsung Semiconductor Inc., said Wide I/O DRAMs are not expected to gain traction until sometime in 2013. At present, smartphones and tablets use plain-vanilla, low-power DDR3 DRAMs or mobile DRAMs based on the LPDDR2 interface standard. Before Wide I/O, the mobile market will move from LPDDR2 to the next-generation LPDDR3 interface standard, Lai said.
LPDDR2 has a maximum throughput of 8.5 Gbytes/second. LPDDR3 has a peak throughput of 12.8 Gbytes/second. Samsung claims its new LPDDR3 devices consume 20% less power than LPDDR2.

Elpida’s Wide IO Mobile RAM has 512 I/O pins. The device is said to achieve a data transfer rate of 12.8 Gbytes/second, roughly on par with LPDDR3. But Elpida’s Wide IO Mobile RAM has a height of 1.0mm, compared with 1.4mm for existing mobile DRAMs based on today’s package-on-package (PoP) technology.
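
Those throughput figures are straightforward to sanity-check: peak bandwidth is simply interface width times transfer rate. The short sketch below reproduces the numbers above, but the channel configurations and clock rates it assumes (dual 32-bit channels for LPDDR2/LPDDR3, a 200MHz single-data-rate interface for Wide I/O) are illustrative assumptions, not figures given by Elpida or Samsung.

# Back-of-the-envelope check of the peak-bandwidth figures above (Python).
# Bus widths and transfer rates are illustrative assumptions, not
# configurations confirmed in this article.

def peak_gbytes_per_sec(bus_width_bits, transfers_per_sec):
    # Peak bandwidth = interface width (bits) x transfer rate, in Gbytes/second.
    return bus_width_bits * transfers_per_sec / 8 / 1e9

print(peak_gbytes_per_sec(2 * 32, 1066e6))  # LPDDR2, two 32-bit channels: ~8.5
print(peak_gbytes_per_sec(2 * 32, 1600e6))  # LPDDR3, two 32-bit channels: 12.8
print(peak_gbytes_per_sec(512, 200e6))      # Wide I/O, 512 pins at 200MHz SDR: 12.8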

Elpida acknowledged that the Wide I/O market will take time to evolve. The 4 Gbit Wide I/O DRAM will sample next month, but production “will take place sometime in the second half of 2012,” according to officials from Elpida. “For volume production, it will be sometime in 2013.”

In March of 2012, Elpida plans to sample a 16-Gbit DRAM, which is based on stacking four 4-Gbit Wide IO Mobile RAM chips. Mass production is due sometime after 2013, according to Elpida.

On the other end of the spectrum, Micron and Samsung are moving full speed ahead with HMC. “This is a slightly different product than Elpida’s and is targeted at server customers. The specs are very promising, but again, this is still a few years from hitting the big time—2013 at the soonest,” Howard said. “Samsung is also a part of the HMC group, lending weight to the product’s chances.”

In October, Samsung and Micron announced the creation of a consortium to develop an open interface specification for HMC. Micron is the actual designer of the HMC technology. Micron and Samsung, as well as Open-Silicon, Altera and Xilinx, are the founding members of the Hybrid Memory Cube Consortium (HMCC).

HMC will incorporate DRAM arrays stacked on a logic chip. The stack is connected using 2,000 to 3,000 TSVs. HMC prototypes are said to deliver a bandwidth of 128 Gbytes/second.
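
Putting the prototype figure alongside the mobile interfaces discussed earlier gives a sense of the gap HMC is aiming at; the snippet below simply takes the ratios of the numbers already cited in this article.

# HMC prototype bandwidth versus the mobile DRAM interfaces cited above.
hmc = 128.0    # Gbytes/s, HMC prototype figure
lpddr3 = 12.8  # Gbytes/s, LPDDR3 peak
lpddr2 = 8.5   # Gbytes/s, LPDDR2 peak

print(hmc / lpddr3)  # 10x LPDDR3
print(hmc / lpddr2)  # ~15x LPDDR2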

It is not widely known, but fabless ASIC house Open-Silicon is developing the controller IP for HMC. Colin Baldwin, director of marketing and business development for Open-Silicon, said the HMC controller will be based on the company’s Interlaken controller IP. Interlaken is a high-speed, chip-to-chip interface protocol that builds on the channelization and per-channel flow control features of SPI4.2. The Interlaken controller will serve as the interface between the memory and the physical layer to help “boost the bandwidth” in the device, Baldwin said.
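
To illustrate what channelization and per-channel flow control buy in an interface like this, here is a minimal, hypothetical sketch of an XON/XOFF-style flow-control loop of the kind Interlaken-class links use. The channel count, buffer depth, thresholds and class names are made-up illustrative values; this is not a description of Open-Silicon’s controller IP.

# Hypothetical sketch of per-channel XON/XOFF-style flow control (Python).
# Illustrative only -- not Open-Silicon's Interlaken controller IP.

NUM_CHANNELS = 16     # assumed number of logical channels
XOFF_THRESHOLD = 48   # stop sending when a channel's buffer is this full (bursts)
XON_THRESHOLD = 16    # resume sending once it drains back below this

class Receiver:
    def __init__(self):
        self.fill = [0] * NUM_CHANNELS       # per-channel buffer occupancy
        self.xon = [True] * NUM_CHANNELS     # per-channel permission to send

    def accept_burst(self, ch):
        self.fill[ch] += 1
        if self.fill[ch] >= XOFF_THRESHOLD:
            self.xon[ch] = False             # advertise XOFF for this channel only

    def drain_burst(self, ch):
        if self.fill[ch] > 0:
            self.fill[ch] -= 1
        if self.fill[ch] <= XON_THRESHOLD:
            self.xon[ch] = True              # advertise XON again

class Transmitter:
    def send(self, ch, rx):
        # Only channels currently advertising XON may receive new bursts;
        # a stalled channel does not block traffic on the others.
        if rx.xon[ch]:
            rx.accept_burst(ch)
            return True
        return False

rx = Receiver()
tx = Transmitter()
tx.send(0, rx)   # channel 0 accepts a burst while other channels are unaffected

The per-channel state is the key point: backpressure on one logical channel throttles only that channel, so a single physical link can carry many independent memory streams without head-of-line blocking.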

On the manufacturing front, the HMC device itself will go through a two-step process. The controller logic portion of HMC will be manufactured at IBM’s semiconductor fab in East Fishkill, N.Y., using the company’s 32nm, high-k metal gate process technology. IBM also will handle the TSV creation process based on Micron’s specifications.

Micron will develop and make the DRAM arrays in-house based on a 3xnm process within its own fabs, said Mike Black, a technology strategist at Micron. Micron will then combine the logic controller from IBM with its in-house memory arrays, and will assemble and test the complete HMC device on its R&D production line in Boise, Idaho, Black said.

Micron is in the qualification stage with the device. “We are feeling pretty good about it,” he said. “Most of the learning is done.”


