Verifying a DDR5 Memory Subsystem

Ensuring memory works as expected in the context of a complex design, and over time.


With the increasing complexity of DDR memory models and a vast set of configurations, it has become a daunting experience for verification engineers to verify memory subsystems. With the help of DDR5 Questa VIP and its unique features, engineers can maximize their debugging capabilities and achieve their verification goals quickly and efficiently. This paper introduces the Siemens EDA DDR5 and DDR5 DIMM Questa VIP (QVIP) components and their unique features that help in verifying DDR5 memory subsystems efficiently.

Why verify a DDR5 memory subsystem?
The latest technologies and applications often demand more speed and performance. With advances such as multi-core CPUs and GPUs, the need for faster data processing is becoming a bottleneck for system performance. Applications such as machine learning and data centers rely on high performance and low latency. These applications need a memory that offers high speed, better performance, high density, lower latency, and data integrity. Looking at the memory trends: DDR4 came out in 2014, starting at 1600 MT/s and eventually reaching 3200 MT/s at the very high end. At the time, that was enough. Fast forward to 2020: as the processing demand from the latest applications increased, 3200 MT/s memory suddenly wasn't enough, and with datasets for AI and machine learning growing ever larger, memory was becoming a real bottleneck. To address this, JEDEC (the Joint Electron Device Engineering Council) developed the DDR5 standard. DDR5 starts at 3200 MT/s and goes up to 6400 MT/s, and architects project it may reach 8400 MT/s over time. DDR5 offers up to 4 times the storage capacity of DDR4 in a single-die package, and it adds on-die ECC, which is used in servers for data integrity checks.
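The data rates above translate directly into peak transfer bandwidth. The sketch below is illustrative only, assuming a conventional 64-bit DIMM data width (in DDR5, split into two independent 32-bit channels); function and variable names are our own:

```python
def peak_bandwidth_gbps(data_rate_mts: int, bus_width_bits: int = 64) -> float:
    """Peak bandwidth in GB/s: (transfers/second) * (bytes per transfer)."""
    return data_rate_mts * 1e6 * (bus_width_bits // 8) / 1e9

# DDR4's top grade vs. DDR5's entry and projected top grades
for rate in (3200, 6400, 8400):
    print(f"{rate} MT/s -> {peak_bandwidth_gbps(rate):.1f} GB/s")
```

For a 64-bit wide module, 3200 MT/s yields 25.6 GB/s, 6400 MT/s yields 51.2 GB/s, and a projected 8400 MT/s would yield 67.2 GB/s.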

DRAM is expected to keep its parallel interface for the next generations of memory. Memory interfaces see changes over their product lifetime, such as constantly evolving protocol specifications with complex features. For example, 5G is developing quickly, and it's driving growth in a variety of exciting technologies – from cloud computing to artificial intelligence to IoT. All that data has to be stored somewhere, and it has to be accessed faster than ever, meaning faster memories like DDR5 have never been more important. DDR5 improves channel efficiency, density, and bandwidth, but faster signal speeds and higher data rates mean complex designs that push the boundaries of signal integrity. This in turn requires higher-performance, lower-voltage measurements for validation, compliance, and debugging. DDR memory validation involves testing complex features to ensure stable operation over the product lifetime.

High complexity and the need for verification
With advanced and complex features comes the need for meticulous verification. Memories have a vast set of configurations that allow them to operate at various data rates with different densities. Further, these can be combined with a vast set of features such as self-refresh, auto refresh, cyclic redundancy check (CRC), post-package repair, maximum power saving mode (MPSM), and training across different settings of latencies and speeds. The permutations and combinations of these variables grow exponentially across memory vendors, each of which offers hundreds of part numbers. As you can imagine, it can easily become a daunting experience for verification engineers to verify a memory subsystem. DDR5 DIMMs add more challenges. For example, to achieve higher power efficiency, the supply voltage is reduced from 1.2V to 1.1V, which brings additional complexity for DIMM vendors around noise immunity. Higher speeds raise data integrity concerns, which demand precise training results. All of this calls for careful verification, as modeling real-world scenarios and visualizing them down to wire-level toggling would otherwise consume a lot of time.
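To make one of the features listed above concrete, write CRC can be modeled in a few lines. This is a minimal reference sketch, not a VIP replacement: it assumes the CRC-8 polynomial x^8 + x^2 + x + 1 (0x07) that the DDR write-CRC feature is based on, and a real checker must also reproduce the exact bit ordering the JEDEC specification defines:

```python
def crc8(data: bytes) -> int:
    """Bitwise, MSB-first CRC-8 with polynomial 0x07 (x^8 + x^2 + x + 1),
    init 0x00, no reflection. Illustrative model of a write-CRC check."""
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            # Shift left; XOR in the polynomial when the top bit falls out
            crc = ((crc << 1) ^ 0x07) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc

burst = bytes(range(16))        # one hypothetical 16-byte slice of write data
print(hex(crc8(burst)))
```

A scoreboard comparing such a reference value against the code captured on the bus is one way a checker flags corrupted write bursts.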

Measuring verification progress through functional coverage is just as important as verifying the design features themselves. Performance analysis is also extremely important for memories. Both add extra cycles to the verification flow.
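The idea of coverage closure over the configuration space can be approximated outside the simulator as a cross of configuration bins. The following is a toy sketch with made-up bin names; a real QVIP covergroup is far richer:

```python
from itertools import product

# Hypothetical bins for illustration only
speed_grades = ["DDR5-3200", "DDR5-4800", "DDR5-6400"]
features     = ["self_refresh", "auto_refresh", "write_crc", "mpsm"]

cross_bins = set(product(speed_grades, features))   # 12 cross bins to hit
hit: set = set()

def sample(grade: str, feature: str) -> None:
    """Record one stimulated (speed grade, feature) combination as covered."""
    hit.add((grade, feature))

sample("DDR5-4800", "write_crc")
sample("DDR5-6400", "self_refresh")

coverage = 100.0 * len(hit & cross_bins) / len(cross_bins)
print(f"cross coverage: {coverage:.1f}%")   # 2 of 12 bins hit
```

Tracking which crosses remain unhit is what turns "we ran a lot of tests" into a measurable closure metric.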

