A new technical paper titled "Hecaton: Training and Finetuning Large Language Models with Scalable Chiplet Systems" was published by researchers at Tsinghua University.
Abstract
"Large Language Models (LLMs) have achieved remarkable success in various fields, but their training and finetuning require massive computation and memory, necessitating parallelism which introduces heavy communicat...