Constant Stepsize Local GD for Logistic Regression: Acceleration by Instability

Existing analyses of Local (Stochastic) Gradient Descent for heterogeneous objectives require stepsizes $\eta \lesssim 1/K$, where $K$ is the communication interval, which ensures monotonic decrease of the objective. In contrast, we analyze Local Gradient Descent for logistic regression with separable, heterogeneous data using any stepsize $\eta > 0$. With $R$ communication rounds and $M$ clients, we show convergence at a rate $\mathcal{O}(1/\eta K R)$ after an initial unstable phase lasting for $\widetilde{\mathcal{O}}(\eta K M)$ rounds. This improves upon the existing $\mathcal{O}(1/R)$ rate for general smooth, convex objectives. Our analysis parallels the single-machine analysis of~\cite{wu2024large}, in which instability is caused by extremely large stepsizes, but in our setting another source of instability is large local updates with heterogeneous objectives.
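To make the setting concrete, below is a minimal NumPy sketch of the Local GD procedure the abstract describes: each client runs $K$ full-batch gradient steps on its local logistic loss with a constant stepsize $\eta$, and the server averages the client iterates once per round. This is an illustration under stated assumptions, not the authors' code; the function name local_gd, the data layout, the zero initialization, and the toy data in the usage example are all hypothetical.

import numpy as np

def local_gd(client_data, eta, K, R):
    # client_data: list of (X_m, y_m) pairs, one per client, labels in {-1, +1}
    d = client_data[0][0].shape[1]
    w = np.zeros(d)                        # shared initial model (assumed)
    for _ in range(R):                     # R communication rounds
        local_models = []
        for X, y in client_data:           # each of the M clients
            v = w.copy()
            for _ in range(K):             # K local full-batch GD steps
                # gradient of (1/n) * sum_i log(1 + exp(-y_i * x_i^T v))
                margins = y * (X @ v)
                grad = -(X.T @ (y / (1.0 + np.exp(margins)))) / len(y)
                v -= eta * grad            # constant stepsize, no decay
            local_models.append(v)
        w = np.mean(local_models, axis=0)  # server averages client iterates
    return w

# Example: two heterogeneous clients whose pooled data is (almost surely) separable
rng = np.random.default_rng(0)
X1 = rng.normal(2.0, 0.5, (50, 2))
X2 = rng.normal(-2.0, 0.5, (50, 2))
data = [(X1, np.ones(50)), (X2, -np.ones(50))]
w = local_gd(data, eta=10.0, K=8, R=100)   # large eta: early rounds may be unstable

With a large stepsize, the averaged objective need not decrease monotonically in the early rounds; that non-monotone burn-in is the unstable phase referred to above.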
@article{crawshaw2025_2506.13974,
  title={Constant Stepsize Local GD for Logistic Regression: Acceleration by Instability},
  author={Michael Crawshaw and Blake Woodworth and Mingrui Liu},
  journal={arXiv preprint arXiv:2506.13974},
  year={2025}
}