
Constant Stepsize Local GD for Logistic Regression: Acceleration by Instability

Main: 8 pages, 6 figures, 1 table
Bibliography: 3 pages
Appendix: 17 pages
Abstract

Existing analyses of Local (Stochastic) Gradient Descent for heterogeneous objectives require stepsizes $\eta \leq 1/K$, where $K$ is the communication interval, which ensures monotonic decrease of the objective. In contrast, we analyze Local Gradient Descent for logistic regression with separable, heterogeneous data using any stepsize $\eta > 0$. With $R$ communication rounds and $M$ clients, we show convergence at a rate $\mathcal{O}(1/\eta K R)$ after an initial unstable phase lasting $\widetilde{\mathcal{O}}(\eta K M)$ rounds. This improves upon the existing $\mathcal{O}(1/R)$ rate for general smooth, convex objectives. Our analysis parallels the single-machine analysis of~\cite{wu2024large}, in which instability is caused by extremely large stepsizes; in our setting, another source of instability is large local updates with heterogeneous objectives.
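The setting above can be illustrated with a minimal sketch of Local GD: $M$ clients each run $K$ local gradient steps on their own logistic loss with a constant stepsize $\eta$, and the server averages the local models once per round, for $R$ rounds. This is an illustrative toy implementation, not the paper's code; the synthetic data generation (a shared separating direction `w_star` with client-dependent feature shifts to induce heterogeneity) is an assumption for the demo.

```python
import numpy as np

def logistic_grad(w, X, y):
    """Gradient of the average logistic loss; labels y are in {-1, +1}."""
    z = y * (X @ w)
    return -(X.T @ (y / (1.0 + np.exp(z)))) / len(y)

def local_gd(client_data, eta, K, R, d):
    """Local GD: K local steps per client, then server averaging, R rounds."""
    w = np.zeros(d)                       # global model
    for _ in range(R):                    # communication rounds
        local_models = []
        for X, y in client_data:
            v = w.copy()
            for _ in range(K):            # K local steps, constant stepsize eta
                v -= eta * logistic_grad(v, X, y)
            local_models.append(v)
        w = np.mean(local_models, axis=0) # averaging step
    return w

# Toy separable, heterogeneous data for M = 2 clients: labels come from a
# common direction w_star, but each client's features are shifted differently.
rng = np.random.default_rng(0)
w_star = np.array([1.0, -0.5])
clients = []
for shift in (-2.0, 2.0):
    X = rng.normal(shift, 1.0, size=(20, 2))
    y = np.sign(X @ w_star)
    clients.append((X, y))

w = local_gd(clients, eta=0.1, K=10, R=100, d=2)
```

The stepsize here is kept small for a stable demo; the paper's point is that much larger constant stepsizes still converge after a transient unstable phase.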

@article{crawshaw2025_2506.13974,
  title={Constant Stepsize Local GD for Logistic Regression: Acceleration by Instability},
  author={Michael Crawshaw and Blake Woodworth and Mingrui Liu},
  journal={arXiv preprint arXiv:2506.13974},
  year={2025}
}