The Data Center In 2018 And Beyond

Memory systems play a central role in enabling new technologies.


As computing continues to evolve, a number of trends continue to challenge the design of conventional von Neumann architectures and, in turn, are driving the development of new architectural approaches and technologies. These include the growing adoption of artificial intelligence (AI), machine learning, AR/VR, IoT, high-speed financial transactions, self-driving vehicles, and blockchain/cryptocurrency mining.

In 2018, we expect to see the continued deployment of FPGAs, GPUs and specialized silicon in the data center to address the needs of these applications. In addition, we anticipate the industry will focus increasing attention and resources on post-Moore-era technologies such as cryogenic and quantum computing. Put simply, we expect continued industry attention on architectures and accelerators that address modern bottlenecks as the end of Moore's Law looms large on the horizon.

While the industry ramps up its focus on post-Moore-era solutions, there is also continued focus on advancing conventional architectures. Modern data center applications continue to drive the need for greater memory bandwidth and capacity, a demand that is not only accelerating the evolution and deployment of faster DDR memory but also creating opportunities for high-performance memories such as HBM and GDDR. In the near term, DDR4 buffer chip adoption will continue to ramp as the industry collectively awaits the launch and deployment of DDR5. According to JEDEC, DDR5 memory will offer improved performance with greater power efficiency compared to previous generations of DDR. As planned, DDR5 will provide double the bandwidth and density of DDR4, along with improved channel efficiency.
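The bandwidth doubling is straightforward to illustrate. A minimal sketch of the peak per-DIMM bandwidth arithmetic, assuming representative data rates of 3200 MT/s for DDR4 and 6400 MT/s for a planned DDR5 part (the exact speed grades shipped will vary):

```python
def peak_bandwidth_gbs(megatransfers_per_sec, bus_width_bytes=8):
    """Peak bandwidth in GB/s for a 64-bit (8-byte) DIMM data bus."""
    return megatransfers_per_sec * bus_width_bytes / 1000

ddr4 = peak_bandwidth_gbs(3200)  # 25.6 GB/s at DDR4-3200
ddr5 = peak_bandwidth_gbs(6400)  # 51.2 GB/s at an assumed DDR5-6400
print(ddr4, ddr5, ddr5 / ddr4)   # the ratio is 2.0: double the bandwidth
```

These are theoretical peaks; sustained bandwidth also depends on channel efficiency, which DDR5 aims to improve.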

In addition, the industry continues to explore the most effective ways of deploying non-volatile memories and upcoming storage-class memories in its relentless effort to improve performance, power efficiency, and cost. Hybrid DIMM technologies such as NVDIMM-P are expected to enable new memory solutions optimized for cost, energy consumption, and performance. NVDIMM-P is a new high-capacity persistent memory module for computing systems that seeks to bring larger amounts of data closer to the processor.

NVDIMMs offer persistence, which can improve fault tolerance and data integrity, while also potentially optimizing the performance of the memory hierarchy. Tasks where this technology shows promise include indexing, message queuing, logging, batch processing, online transactions, and storage applications. NVDIMMs can also benefit large in-memory computing workloads such as in-memory databases, which are integral parts of search engines and hyperscale computing applications.
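The programming model behind this is byte-addressable, memory-mapped persistence: software stores data directly through load/store instructions and flushes it to the persistent media, rather than going through a block I/O stack. A minimal sketch of the idea, using an ordinary memory-mapped temp file as a stand-in (on a real NVDIMM-backed system the mapping would typically come from a file on a DAX-mounted filesystem):

```python
import mmap
import os
import tempfile

# An ordinary file stands in for a persistent memory region.
path = os.path.join(tempfile.gettempdir(), "nvdimm_demo.bin")
with open(path, "wb") as f:
    f.truncate(4096)  # reserve one page of "persistent" memory

# Map the region and update it with direct, byte-addressable stores.
with open(path, "r+b") as f:
    buf = mmap.mmap(f.fileno(), 4096)
    buf[0:5] = b"hello"  # a store, not a write() system call
    buf.flush()          # push the update out to the backing media
    buf.close()

# The data is durable: it survives closing the mapping entirely.
with open(path, "rb") as f:
    print(f.read(5))  # b'hello'
```

On real persistent memory, the flush step maps to cache-line flush and fence instructions rather than a page writeback, but the software pattern (store, then make durable) is the same.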

For many IoT applications, we expect to see fog and edge computing gaining mindshare in 2018 and beyond. These paradigms take a different approach to computing on large quantities of data: rather than shipping data to centralized data centers, they move the processing closer to the data. Moving as much computing as is practical to the edge of the network, closer to the devices generating the data, avoids consuming precious network bandwidth to transport large volumes of data to centralized data centers, improving performance, cost, and power efficiency.

In conclusion, the recent shifts toward more data-centric computing have driven the adoption of technologies that improve the memory hierarchy and alleviate data movement bottlenecks. In 2018 and beyond, we will see a continued focus on memory hierarchies, as advances in computing are increasingly limited by the ability of memory systems to keep processing units fed with data. The potential for large-scale architecture changes and the increasing adoption of newer, non-von Neumann architectures are generating excitement in the industry, with memory systems once again taking center stage as a critical area of innovation.
