The Implications Of AI Everywhere: From Data Center To Edge


Generative AI has upped the ante on AI’s transformative force, with profound implications across all aspects of our everyday lives. Over the past year, we have seen AI capabilities placed firmly in the hands of consumers. The recent news and product announcements emerging from MWC 2024 highlighted what we can expect to see from the next wave of generative AI applications. AI will be...

Memory’s Future Hinges On Reliability


Experts at the Table: Semiconductor Engineering sat down to talk about the impact of power and heat on off-chip memory, and what can be done to optimize performance, with Frank Ferro, group director, product management at Cadence; Steven Woo, fellow and distinguished inventor at Rambus; Jongsin Yun, memory technologist at Siemens EDA; Randy White, memory solutions program manager at Keysight...

DRAM Test And Inspection Just Gets Tougher


DRAM manufacturers continue to demand cost-effective solutions for screening and process improvement amid growing concerns over defects and process variability, but meeting that demand is becoming much more difficult with the rollout of faster interfaces and multi-chip packages. DRAM plays a key role in a wide variety of electronic devices, from phones and PCs to ECUs in cars and servers...

Memory Technologies Key To Advancing AI Applications


Memory is an integral component in every computer system, from the smartphones in our pockets to the giant data centers powering the world’s leading-edge AI applications. As AI continues to rise in reach and complexity, the demand for more memory from data center to endpoints is reshaping the industry’s requirements and traditional approaches to memory architectures. According to OpenAI,...

GDDR6 Delivers The Performance For AI/ML Inference


AI/ML is evolving at a lightning pace. Not a week goes by without new and exciting developments in the field, and applications like ChatGPT have brought generative AI capabilities firmly to the forefront of public attention. AI/ML is really two applications: training and inference. Each relies on memory performance, and each has a unique set of requirements that drive the...

From Data Center To End Device: AI/ML Inference With GDDR6


Created to support 3D gaming on consoles and PCs, GDDR packs performance that makes it an ideal solution for AI/ML inference. As inference migrates from the heart of the data center to the network edge, and ultimately to a broad range of AI-powered IoT devices, GDDR memory’s combination of high bandwidth, low latency, power efficiency and suitability for high-volume applications will be...

Choosing The Correct High-Bandwidth Memory


The number of options for how to build high-performance chips is growing, but the choices for attached memory have barely budged. To achieve maximum performance in automotive, consumer, and hyperscale computing, the choices come down to one or more flavors of DRAM, and the biggest tradeoff is cost versus speed. DRAM remains an essential component in any of these architectures, despite years...
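As a rough illustration of the cost-versus-speed tradeoff among DRAM flavors, the short Python sketch below computes peak per-device (or per-stack) bandwidth from per-pin data rate and interface width. The specific configurations are representative assumptions chosen for illustration, not figures taken from the article.

# Illustrative sketch: peak bandwidth (GB/s) = per-pin data rate (Gbps) x interface width (bits) / 8.
# The configurations below are representative assumptions, not vendor specifications.
dram_options = {
    "DDR5 DIMM (6400 MT/s, 64-bit)":   (6.4, 64),
    "LPDDR5X (8533 MT/s, 32-bit)":     (8.533, 32),
    "GDDR6 device (16 Gbps, x32)":     (16.0, 32),
    "HBM3 stack (6.4 Gbps, 1024-bit)": (6.4, 1024),
}
for name, (gbps_per_pin, width_bits) in dram_options.items():
    bandwidth_gb_s = gbps_per_pin * width_bits / 8
    print(f"{name:34s} ~{bandwidth_gb_s:6.1f} GB/s")

The arithmetic makes the tradeoff concrete: a standard DDR5 DIMM lands around 51 GB/s, a single GDDR6 device around 64 GB/s, and an HBM3 stack around 819 GB/s, with cost and integration complexity rising along the same axis.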

Choosing The Right Memory At The Edge


As the amount of data produced by sensors in cars and phones continues to grow, more of that data needs to be processed locally. It takes too much time and power to send it all to the cloud. But choosing the right memory for a particular application requires a series of tradeoffs involving cost, bandwidth, and power, which can vary greatly by device, application, and even the data itself. Frank Ferro...

How Memory Design Optimizes System Performance


Exponential increases in data, and the demand for improved performance to process that data, have spawned a variety of new approaches to processor design and packaging, but they also are driving big changes on the memory side. While the underlying technology still looks very familiar, the real shift is in the way those memories are connected to processing elements and various components within a...

Overcoming Signal, Power, And Thermal Challenges Implementing GDDR6 Interfaces


Graphics processing units (GPUs) and graphics double data rate (GDDR) memory interfaces are essential to graphics cards, game consoles, high-performance computing (HPC), and machine learning applications. These interfaces enable data transfer speeds of over 665 GB per second today, and next-generation GDDR interfaces will support well over a terabyte per second (TBps). Signal integrity...
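The 665+ GB/s figure is an aggregate across a wide memory bus rather than a single device. Below is a minimal sketch of that arithmetic, assuming a 384-bit GDDR6 bus (twelve x32 devices) running at 14 Gbps per pin; the configuration is an assumption chosen for illustration.

# Minimal sketch: aggregate GDDR6 bandwidth from per-pin data rate and total bus width.
# The 384-bit bus at 14 Gbps per pin is an assumed configuration, not a figure from the article.
data_rate_gbps = 14      # per-pin data rate, Gbps
bus_width_bits = 384     # e.g., twelve x32 GDDR6 devices
bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(f"Aggregate bandwidth: {bandwidth_gb_s:.0f} GB/s")  # 672 GB/s, consistent with "over 665 GB/s"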
