
DES-LOC: Desynced Low Communication Adaptive Optimizers for Training Foundation Models

Abstract

Scaling foundation model training with Distributed Data Parallel (DDP) methods is bandwidth-limited. Existing infrequent-communication methods such as Local SGD were designed to synchronize only model parameters and cannot be trivially applied to adaptive optimizers because of their additional optimizer states. Current approaches extending Local SGD either lack convergence guarantees or require synchronizing all optimizer states, tripling communication costs. We propose Desynced Low Communication Adaptive Optimizers (DES-LOC), a family of optimizers that assign independent synchronization periods to parameters and momenta, lowering communication costs while preserving convergence. Through extensive experiments on language models of up to 1.7B parameters, we show that DES-LOC can communicate 170x less than DDP and 2x less than the previous state-of-the-art Local ADAM. Furthermore, unlike previous heuristic approaches, DES-LOC is suited to practical training scenarios prone to system failures. DES-LOC offers a scalable, bandwidth-efficient, and fault-tolerant solution for foundation model training.
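To make the core idea concrete, the following is a minimal PyTorch sketch of a DES-LOC-style synchronization hook for Adam, not the paper's implementation. The function name desloc_step and the period values tau_params, tau_m1, and tau_m2 are illustrative assumptions; the point is only that parameters, first moments, and second moments are averaged across workers on independent schedules rather than every step (DDP) or all together (Local ADAM).

```python
import torch
import torch.distributed as dist


def average_across_workers(tensor):
    # All-reduce the tensor, then divide by world size to get the element-wise mean.
    dist.all_reduce(tensor, op=dist.ReduceOp.SUM)
    tensor.div_(dist.get_world_size())


def desloc_step(model, optimizer, step, tau_params=64, tau_m1=128, tau_m2=256):
    """Hypothetical DES-LOC-style hook: call once per local optimizer step.

    Parameters, first moments (exp_avg), and second moments (exp_avg_sq)
    are each averaged on their own period instead of a shared one.
    """
    for p in model.parameters():
        state = optimizer.state.get(p, {})
        if step % tau_params == 0:
            average_across_workers(p.data)
        if "exp_avg" in state and step % tau_m1 == 0:
            average_across_workers(state["exp_avg"])
        if "exp_avg_sq" in state and step % tau_m2 == 0:
            average_across_workers(state["exp_avg_sq"])
```

Choosing momentum periods longer than the parameter period (as in this sketch) would be one way to communicate less than Local ADAM, which synchronizes all states on the same schedule; the actual period choices and guarantees are given in the paper.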

@article{iacob2025_2505.22549,
  title={DES-LOC: Desynced Low Communication Adaptive Optimizers for Training Foundation Models},
  author={Alex Iacob and Lorenzo Sani and Mher Safaryan and Paris Giampouras and Samuel Horváth and Andrej Jovanovic and Meghdad Kurmanji and Preslav Aleksandrov and William F. Shen and Xinchi Qiu and Nicholas D. Lane},
  journal={arXiv preprint arXiv:2505.22549},
  year={2025}
}