
Data Centers

A data center is a physical building or room that houses networked servers for remote data storage and processing.

Description

A data center is a physical building complex, or even just a single room, that houses servers, switches, CPUs, memory, routers, cooling systems, and a lot of high-speed cabling for the purpose of remote data storage and computing. The servers are networked together within the data center. Data centers may be part of a private or public cloud and are the fundamental building blocks of cloud systems: users access the data center through Internet connections or other network connections.

Scalability is a major challenge today for data centers, says Synopsys' Rita Horner in a video. Data centers are physical infrastructure: they can be enlarged with more powerful, upgraded equipment and more floor space. However, enlarging them is capital-intensive, and space can run out.

Private data centers grew out of the need for semiconductor manufacturers and other industries to consolidate their server farms. Server farms, which used to be maintained at each design center, were consolidated into a limited number of data centers that could still be distributed worldwide. For example, a company that had 20 design centers across America and Europe may have found it more cost-effective to consolidate all of its servers into four data centers, geographically distributed across the two continents. The consolidation reduced IT maintenance costs and improved utilization.

Hyperscale data centers are very large data centers run by large companies that produce huge amounts of data and also sell space in their data centers — Google, Microsoft, IBM, and Facebook, for example. According to Synergy Research Group (July 2023), hyperscale providers operate almost 900 large data centers, representing 37% of worldwide data center capacity. Non-hyperscale colocation data centers (where companies share a facility) account for 23% of capacity, and on-premise data centers account for the remaining 40%.1

The main semiconductor functions in a data center are carried out by servers, which are built from memory and compute ICs that handle the workloads, together with the high-speed interconnect infrastructure that moves data between them. Generative AI has significantly increased the amount of data that needs to be processed, resulting in higher utilization of processors and memories.

Data centers are also undergoing a fundamental shift to boost server utilization and improve efficiency, optimizing architectures so that available compute resources can be leveraged wherever they are needed. The amount of data that needs to be processed continues to grow rapidly, and data center operators are well aware of this pending data deluge.

Traditionally, data centers were built with racks of servers, each server providing computing, memory, interconnect, and possibly acceleration resources. But when a server is assigned a workload, some of those resources go unused, even though they are needed elsewhere in the data center. With this model, those stranded resources cannot be leveraged because the server blade is the basic unit of partition.
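
To make the stranded-resource problem concrete, the following is a minimal Python sketch. The class names, resource figures, and the blade-level allocation rule are illustrative assumptions, not a description of any particular data center or scheduler.

from dataclasses import dataclass

@dataclass
class Server:
    cpus: int          # CPU cores on this blade (hypothetical figures)
    memory_gb: int     # DRAM on this blade

@dataclass
class Workload:
    cpus: int
    memory_gb: int

def stranded_after_placement(server, workload):
    # Resources left idle on a blade once a workload is pinned to it;
    # under the server-as-unit model, other jobs cannot use them.
    return server.cpus - workload.cpus, server.memory_gb - workload.memory_gb

# A memory-heavy job pinned to one blade leaves most of its cores idle,
# even if a compute-heavy job elsewhere in the data center needs them.
blade = Server(cpus=64, memory_gb=512)
memory_heavy_job = Workload(cpus=8, memory_gb=480)

idle_cpus, idle_mem = stranded_after_placement(blade, memory_heavy_job)
print(f"Stranded on this blade: {idle_cpus} cores, {idle_mem} GB of memory")

In a disaggregated model, compute, memory, and accelerators are instead allocated independently from shared pools, so the 56 idle cores in this example could serve another workload rather than sitting stranded on the blade.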

This has led to a complete reorganization in hyperscale data centers to use compute resources more efficiently, and the idea is now beginning to percolate through other data centers.

References:
Synergy Research Group, “On-Premise Data Center Capacity Being Increasingly Dwarfed by Hyperscalers and Colocation Companies,” July 2023.

 

Multimedia

Next-Gen High-Speed Communication In Data Centers
Reducing Power In Data Centers
Very Short Reach SerDes In Data Centers
Silent Data Corruption
Co Packaged Optics In The Data Center
New Challenges For Data Centers