From Data Center To End Device: AI/ML Inferencing With GDDR6

How GDDR memory is ideally suited for the needs of AI/ML inferencing.


Created to support 3D gaming on consoles and PCs, GDDR memory delivers performance that makes it an ideal solution for AI/ML inferencing. As inferencing migrates from the heart of the data center to the network edge, and ultimately to a broad range of AI-powered IoT devices, GDDR memory’s combination of high bandwidth, low latency, power efficiency and suitability for high-volume applications will become increasingly important. The latest iteration of the standard, GDDR6 memory, pushes data rates to 18 gigabits per second per pin and per-device bandwidth to 72 gigabytes per second.
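The 72 GB/s figure follows directly from the per-pin data rate and the device's interface width. A minimal sketch of the arithmetic, assuming the standard 32-bit GDDR6 device interface (two independent 16-bit channels):

```python
# Sketch: derive GDDR6 per-device bandwidth from the per-pin data rate.
# Assumption: a standard GDDR6 device exposes a 32-bit interface
# (organized as two independent 16-bit channels).
PIN_RATE_GBPS = 18    # data rate per pin, in gigabits per second
BUS_WIDTH_BITS = 32   # device interface width, in bits

# Multiply rate by width to get total gigabits/s, divide by 8 for gigabytes/s.
bandwidth_gb_per_s = PIN_RATE_GBPS * BUS_WIDTH_BITS / 8
print(bandwidth_gb_per_s)  # 72.0 GB/s, matching the figure above
```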

