High-bandwidth memory originally was conceived as a way to boost memory bandwidth and capacity for DRAM stacks attached to a 2.5D package. It has since become a staple of high-performance computing, in some cases replacing SRAM for L3 cache. Archana Cheruliyil, senior product marketing manager at Alphawave Semi, talks about how and where HBM is used today, how it will be used in the future, why it is essential for AI systems, and how the new HBM4 standard and custom HBM will impact PPA.