A new technical paper titled “Scaling Intelligence: Designing Data Centers for Next-Gen Language Models” was published by Intel Corporation and Georgia Tech.
An excerpt from the paper’s abstract:
“Our work provides a comprehensive co-design framework that jointly explores FLOPS, HBM bandwidth and capacity, multiple network topologies (two-tier vs. FullFlat optical), the size of the scale-out domain, and popular parallelism/optimization strategies used in LLMs. We introduce and evaluate FullFlat network architectures, which provide uniform high-bandwidth, low-latency connectivity between all nodes, and demonstrate their transformative impact on performance and scalability.”
The technical paper was posted to arXiv in June 2025:
Tithi, Jesmin Jahan, Hanjiang Wu, Avishaii Abuhatzera, and Fabrizio Petrini. “Scaling Intelligence: Designing Data Centers for Next-Gen Language Models.” arXiv preprint arXiv:2506.15006 (2025).