Knowledge Center

High-Bandwidth Memory (HBM)

A dense, stacked version of memory with high-speed interfaces that can be used in advanced packaging.

Description

High-bandwidth memory (HBM) is a standardized stacked memory technology that provides very wide channels for data, both within the stack and between the memory and logic.

An HBM stack can contain up to eight DRAM dies, with two channels per die. Current implementations include up to four dies, which is roughly the equivalent of 40 DDR cores in a fraction of the space.

What makes this technology attractive is the bandwidth between the DRAM chips and between the memory and logic (via interposer technology), as well as the small form factor compared to DRAM DIMMs.
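The bandwidth advantage comes from the sheer width of the interface rather than raw clock speed. As an illustrative sketch (using representative JEDEC HBM2 figures, which are not stated in this article), the peak theoretical bandwidth of a stack is simply bus width times per-pin data rate:

```python
# Illustrative peak-bandwidth arithmetic for an HBM stack.
# Figures below are representative JEDEC HBM2 values, assumed for
# illustration; they do not come from this article.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s: width (bits) x per-pin rate / 8."""
    return bus_width_bits * data_rate_gbps / 8

# An HBM2 stack exposes a 1024-bit interface (8 channels x 128 bits).
HBM2_BUS_WIDTH_BITS = 8 * 128   # 1024 bits
HBM2_DATA_RATE_GBPS = 2.0       # per-pin rate, HBM2 baseline

print(peak_bandwidth_gb_s(HBM2_BUS_WIDTH_BITS, HBM2_DATA_RATE_GBPS))  # 256.0
```

That 1024-bit interface is why HBM needs an interposer: routing that many signals is impractical on a standard PCB, but straightforward across a silicon interposer in a 2.5D package.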

Fig. 1: HBM stack for maximum data throughput. Source: Rambus


JEDEC adopted the HBM standard in October 2013, and the HBM2 standard in January 2016. Both Samsung and SK Hynix are now commercially producing HBM chips, with others expected to follow.

HBM3, announced in the summer of 2022, is a high-performance memory that features reduced power consumption and a small form factor. It combines 2.5D packaging with a wider interface at a lower clock speed (compared to GDDR6) to deliver higher overall throughput with better bandwidth-per-watt efficiency for AI/ML and high-performance computing (HPC) applications.1

1. HBM3 definition courtesy of Rambus.
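The "wide and slow vs. narrow and fast" tradeoff above can be made concrete with a back-of-the-envelope comparison. The figures below are representative values for a JEDEC HBM3 stack and a typical GDDR6 device, assumed for illustration rather than taken from this article:

```python
# Illustrative throughput comparison: HBM3's wide, modestly clocked interface
# vs. GDDR6's narrow, fast one. Figures are representative assumptions.

def peak_gb_s(width_bits: int, rate_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return width_bits * rate_gbps / 8

hbm3_stack   = peak_gb_s(1024, 6.4)  # 1024-bit interface at 6.4 Gbps/pin
gddr6_device = peak_gb_s(32, 16.0)   # 32-bit interface at 16 Gbps/pin

print(hbm3_stack)    # 819.2 GB/s per stack
print(gddr6_device)  # 64.0 GB/s per device
```

Even at well under half the per-pin data rate, the HBM3 stack delivers an order of magnitude more bandwidth than a single GDDR6 device, and the lower clock is what drives the bandwidth-per-watt advantage.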


Multimedia

HBM3 In The Data Center
GDDR6 – HBM2 Tradeoffs
Building AI SoCs
Tech Talk: HBM vs. GDDR6
Using High-Bandwidth Memory
Tech Talk: 2.5D Issues

