Top Tech Videos Of 2024


In 2024, hot topics included challenges involving chiplets and heterogeneous integration, AI, data management, MCUs, power semis, software-defined vehicles, sensors, adaptive test, yield tracking, safety monitoring, security, and much more. Top 5 most watched videos in 2024: Overlay Optimization In Advanced IC Substrates, How To Stop Row Hammer Attacks, What’s Changing In DRAM ... » read more

Design And Verification Issues In 2024


At the end of each year, I look back over the stories we published and those that topped the charts in readership. I concentrate on the stories about EDA tools and flows and the factors influencing them. These are good indicators of the problems design and verification teams are facing today, and where they are looking for answers. This year's leading categories... » read more

Baby Steps Toward 3D DRAM


Flash memory has made incredible capacity strides thanks to monolithic 3D processing enabled by the stacking of more than 200 layers, which is on its way to 1,000 layers in future generations.[1] But the equally important DRAM has yet to achieve a similar manufacturable 3D architecture. The need for a sufficiently large means of storing charge — such as a capacitor — has proved elusive. Severa... » read more

Is In-Memory Compute Still Alive?


In-memory computing (IMC) has had a rough go, with the most visible attempt at commercialization falling short. And while some companies have pivoted to digital and others have outright abandoned the technology, developers are still trying to make analog IMC a success. There is disagreement regarding the benefits of IMC (also called compute-in-memory, or CIM). Some say it’s all about reduc... » read more

Redefining XPU Memory For AI Data Centers Through Custom HBM4: Part 1


This is the first of a three-part series on HBM4 and gives an overview of the HBM standard. Part 2 will provide insights on HBM implementation challenges, and part 3 will introduce the concept of a custom HBM implementation. Relentless growth in data consumption: Recent advances in deep learning have had a transformative effect on artificial intelligence (AI) and the ever-increasing volume of ... » read more
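To make the standard's headline appeal concrete, here is a minimal sketch of how HBM's very wide interface translates into per-stack bandwidth. The figures are assumptions for illustration (roughly HBM3-class), not values taken from the article.

```python
# Minimal sketch: per-stack bandwidth = interface width * per-pin data rate.
# The 1024-bit width and 6.4 Gb/s pin rate below are assumed, HBM3-class figures.

def stack_bandwidth_gbps(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak per-stack bandwidth in GB/s."""
    return bus_width_bits * pin_rate_gbps / 8

print(stack_bandwidth_gbps(1024, 6.4))  # ~819 GB/s per stack
```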

HBM Options Increase As AI Demand Soars


High-bandwidth memory (HBM) sales are spiking as the amount of data that needs to be processed quickly by state-of-the-art AI accelerators, graphics processing units, and high-performance computing applications continues to explode. HBM inventories are sold out, driven by massive efforts and investments in developing and improving large language models such as ChatGPT. HBM is the memory of ch... » read more

Extending The DDR5 Roadmap With MRDIMM


Given the voracious memory bandwidth and capacity demands of Gen AI and other advanced workloads, we’ve seen a rapid progression through the generations of DDR5 memory. Multiplexed Registered DIMMs (MRDIMMs) offer a new memory module architecture capable of extending the DDR5 roadmap and expanding the capabilities of server main memory. MRDIMM reuses the lion’s share of existing DDR5 infras... » read more
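The gain from multiplexing two DDR5 ranks onto a faster host interface can be illustrated with a quick bandwidth comparison. The transfer rates below are assumptions for illustration (a DDR5-6400 RDIMM versus a first-generation MRDIMM target), not figures from the article.

```python
# Illustrative comparison of RDIMM vs. MRDIMM module bandwidth.
# Transfer rates are assumed examples; the 8-byte bus is the standard DDR5 data width.

def module_bandwidth_gbps(mt_per_s: int, bus_bytes: int = 8) -> float:
    """Peak module bandwidth in GB/s for a 64-bit (8-byte) data bus."""
    return mt_per_s * bus_bytes / 1000

print(f"RDIMM  (6400 MT/s): {module_bandwidth_gbps(6400):.1f} GB/s")   # ~51.2 GB/s
print(f"MRDIMM (8800 MT/s): {module_bandwidth_gbps(8800):.1f} GB/s")   # ~70.4 GB/s
```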

GDDR7 Memory Supercharges AI Inference


GDDR7 is the state-of-the-art graphics memory solution with a performance roadmap of up to 48 gigatransfers per second (GT/s) and memory throughput of 192 GB/s per GDDR7 memory device. The next generation of GPUs and accelerators for AI inference will use GDDR7 memory to provide the memory bandwidth needed for these demanding workloads. AI comprises two applications: training and inference. With tr... » read more
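As a sanity check on the quoted figures, the 192 GB/s per-device number follows directly from the 48 GT/s transfer rate, assuming the typical 32-bit interface per GDDR device (an assumption here, not stated in the excerpt).

```python
# Rough check of the per-device GDDR7 bandwidth figure quoted above.
# Assumption: each GDDR7 device exposes a 32-bit-wide interface.

def device_bandwidth_gbps(transfer_rate_gt_s: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s = transfers per second * bytes moved per transfer."""
    return transfer_rate_gt_s * (bus_width_bits / 8)

print(device_bandwidth_gbps(48, 32))  # -> 192.0 GB/s per device, matching the roadmap figure
```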

Research Bits: Sept. 24


Modeling negative capacitance: Researchers from Lawrence Berkeley National Laboratory developed an open-source 3D simulation framework capable of modeling the atomistic origins of negative capacitance in ferroelectric thin films at the device level. When a material has negative capacitance, it can store a greater amount of electrical charge at lower voltages. The team believes the FerroX fra... » read more

HBM4 Feeds Generative AI’s Hunger For More Memory Bandwidth


Generative AI (Gen AI), built on the exponential growth of Large Language Models (LLMs) and their kin, is one of today’s biggest drivers of computing technology. Leading-edge LLMs now exceed a trillion parameters and offer multimodal capabilities, so they can take a broad range of inputs, whether text, speech, images, video, or code, and generate an equally broa... » read more
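A back-of-envelope calculation shows why trillion-parameter models put so much pressure on memory bandwidth: the weights alone, streamed for every generated token, run to terabytes. The precision and bandwidth figures below are assumptions for illustration, not values from the article.

```python
# Back-of-envelope illustration of LLM memory pressure.
# Assumptions: 16-bit (2-byte) weights, 1 TB/s of aggregate memory bandwidth.

params = 1e12          # one trillion parameters (the lower bound cited above)
bytes_per_param = 2    # assumed FP16/BF16 weights
weight_bytes = params * bytes_per_param

memory_bandwidth = 1e12  # assumed 1 TB/s aggregate bandwidth
print(f"Weight footprint: {weight_bytes / 1e12:.1f} TB")
print(f"Time to stream weights once: {weight_bytes / memory_bandwidth:.1f} s")
```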
