Decentralized Gossip-Based Stochastic Bilevel Optimization over Communication Networks

22 June 2022
Shuoguang Yang
Xuezhou Zhang
Mengdi Wang
Abstract

Bilevel optimization has gained growing interest, with numerous applications in meta-learning, minimax games, reinforcement learning, and nested composition optimization. This paper studies distributed bilevel optimization over a network where agents can only communicate with neighbors, a setting that includes examples from multi-task learning, multi-agent learning, and federated learning. We propose a gossip-based distributed bilevel learning algorithm that allows networked agents to solve both the inner and outer optimization problems on a single timescale and to share information via network propagation. We show that our algorithm enjoys a per-agent sample complexity of $\mathcal{O}(\frac{1}{K\epsilon^2})$ for general nonconvex bilevel optimization and $\mathcal{O}(\frac{1}{K\epsilon})$ for strongly convex objectives, achieving a speedup that scales linearly with the network size $K$. These sample complexities are optimal in both $\epsilon$ and $K$. We test our algorithm on hyperparameter tuning and decentralized reinforcement learning. Simulation experiments confirm that our algorithm achieves state-of-the-art training efficiency and test accuracy.
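To make the algorithmic idea concrete, below is a minimal sketch of the gossip-based single-timescale pattern the abstract describes: each agent takes local stochastic steps on both the inner and outer problems in the same round, then averages its iterates with neighbors through a doubly stochastic mixing matrix. The toy quadratic objectives, the helper names inner_grad and outer_hypergrad, the ring topology, and the step sizes are all illustrative assumptions, not the paper's exact update rules.

import numpy as np

rng = np.random.default_rng(0)
n_agents, dim = 4, 3

# Ring-graph mixing matrix W (doubly stochastic): each agent
# averages with itself and its two neighbors.
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i + 1) % n_agents] = 0.25
    W[i, (i - 1) % n_agents] = 0.25

# Per-agent outer variables x and inner variables y.
x = rng.normal(size=(n_agents, dim))
y = rng.normal(size=(n_agents, dim))

def inner_grad(xi, yi):
    # Stochastic gradient in y of a toy inner objective 0.5*||y - x||^2.
    return yi - xi + 0.01 * rng.normal(size=dim)

def outer_hypergrad(xi, yi):
    # Hypothetical stochastic hypergradient estimate for the outer objective.
    return xi + yi + 0.01 * rng.normal(size=dim)

alpha, beta = 0.1, 0.05  # inner and outer step sizes, same timescale

for k in range(100):
    # Local stochastic updates on both levels within a single round.
    y = y - alpha * np.stack([inner_grad(x[i], y[i]) for i in range(n_agents)])
    x = x - beta * np.stack([outer_hypergrad(x[i], y[i]) for i in range(n_agents)])
    # Gossip step: each agent mixes its iterates with its neighbors' via W.
    y = W @ y
    x = W @ x

The gossip step is what replaces a central server: information propagates across the network through repeated neighbor averaging, which is what enables the linear speedup in the network size $K$ claimed above.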
