ResearchTrend.AI

FEDNEST: Federated Bilevel, Minimax, and Compositional Optimization

4 May 2022
Davoud Ataee Tarzanagh
Mingchen Li
Christos Thrampoulidis
Samet Oymak
    FedML
ArXiv (abs) · PDF · HTML · GitHub (16★)
Abstract

Standard federated optimization methods successfully apply to stochastic problems with single-level structure. However, many contemporary ML problems -- including adversarial robustness, hyperparameter tuning, and actor-critic -- fall under nested bilevel programming that subsumes minimax and compositional optimization. In this work, we propose FedNest: a federated alternating stochastic gradient method to address general nested problems. We establish provable convergence rates for FedNest in the presence of heterogeneous data and introduce variations for bilevel, minimax, and compositional optimization. FedNest introduces multiple innovations including federated hypergradient computation and variance reduction to address inner-level heterogeneity. We complement our theory with experiments on hyperparameter & hyper-representation learning and minimax optimization that demonstrate the benefits of our method in practice. Code is available at https://github.com/mc-nya/FedNest.
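To make the "alternating stochastic gradient for nested problems" idea concrete, here is a minimal single-machine sketch of alternating bilevel optimization on a toy quadratic problem. This is an illustration of the general technique, not the paper's FedNest algorithm: it has no clients, no federation, and no variance reduction, and all problem functions (f, g) and step sizes are made up for the example. The inner problem is solved approximately by a few gradient steps, and the outer variable is then updated with the implicit-function hypergradient.

```python
# Toy bilevel problem (illustrative only; not FedNest itself):
#   outer:  min_x  f(x, y*(x)),  f(x, y) = 0.5*y^2 + 0.5*x^2
#   inner:  y*(x) = argmin_y g(x, y),  g(x, y) = 0.5*(y - x)^2  =>  y*(x) = x
# The composed objective is F(x) = x^2, minimized at x = 0.

def inner_grad(x, y):
    # ∇_y g(x, y) = y - x
    return y - x

def hypergrad(x, y):
    # Implicit-function hypergradient:
    #   ∇F = ∇_x f - ∇_{xy} g * (∇_{yy} g)^{-1} * ∇_y f
    # Here ∇_x f = x, ∇_y f = y, ∇_{xy} g = -1, ∇_{yy} g = 1,
    # so the hypergradient is x + y (≈ 2x once y ≈ y*(x) = x).
    return x - (-1.0) * 1.0 * y

def alternating_bilevel_sgd(x0=5.0, y0=0.0, outer_steps=200,
                            inner_steps=5, lr_in=0.5, lr_out=0.1):
    x, y = x0, y0
    for _ in range(outer_steps):
        # Inner loop: approximately solve the lower-level problem.
        for _ in range(inner_steps):
            y -= lr_in * inner_grad(x, y)
        # Outer step: descend along the (approximate) hypergradient.
        x -= lr_out * hypergrad(x, y)
    return x, y

x, y = alternating_bilevel_sgd()
print(x, y)  # both converge toward 0, the bilevel optimum
```

Because the inner solution is only approximated between outer steps, the hypergradient is biased; the paper's contributions concern controlling exactly this kind of error under federated, heterogeneous data.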
