Personalized Decentralized Bilevel Optimization over Random Directed Networks
Personalization and decentralization are two major lines of study toward practical federated learning. The aim of this work is to establish a general, unified approach that solves both problems simultaneously. We first propose a bilevel problem that adapts to various personalization scenarios by allowing an arbitrary choice of two parameters: a client-wise outer-parameter representing heterogeneity, and a shared inner-parameter representing homogeneity across client data distributions. We then present an algorithm that solves this bilevel problem in a decentralized manner by estimating the gradients of clients' outer-costs with respect to their outer-parameters. We show that the proposed algorithm extends to random directed networks, one of the most robust classes of decentralized communication. The proposed method achieves state-of-the-art performance on a personalization benchmark across various communication settings.
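To make the bilevel structure concrete, the following is a minimal toy sketch, not the paper's algorithm: the cost functions, closed-form inner solution, and all variable names (`a`, `lam`, `theta`, etc.) are illustrative assumptions. Each client i holds a personal outer-parameter theta_i, all clients share an inner-parameter w solving the lower-level problem, and each client descends the hypergradient of its outer-cost with respect to its own outer-parameter:

```python
import numpy as np

# Assumed toy instance (not from the paper):
#   inner:  w*(theta) = argmin_w sum_i 0.5*||w - theta_i||^2  ->  w* = mean_i theta_i
#   outer:  client i minimizes
#           f_i(theta_i, w*) = 0.5*||theta_i - a_i||^2 + 0.5*lam*||theta_i - w*||^2
rng = np.random.default_rng(0)
n, d, lam = 4, 3, 0.5
a = rng.normal(size=(n, d))      # client-specific targets (heterogeneity)
theta = np.zeros((n, d))         # client-wise outer-parameters

def inner_solution(theta):
    # closed form for this toy quadratic inner problem
    return theta.mean(axis=0)

def hypergradient(i, theta):
    w = inner_solution(theta)
    # direct partial derivative of f_i with respect to theta_i
    direct = (theta[i] - a[i]) + lam * (theta[i] - w)
    # implicit term through w*: dw*/dtheta_i = (1/n) * I here,
    # and df_i/dw = -lam * (theta_i - w)
    implicit = (1.0 / n) * (-lam) * (theta[i] - w)
    return direct + implicit

lr = 0.2
for _ in range(300):
    grads = np.stack([hypergradient(i, theta) for i in range(n)])
    theta = theta - lr * grads   # simultaneous outer descent step

w = inner_solution(theta)
for i in range(n):
    assert np.linalg.norm(hypergradient(i, theta)) < 1e-6
```

In the decentralized setting of the paper, no client can compute `inner_solution` exactly; the shared inner-parameter and the hypergradients are instead estimated through communication over the (random, directed) network.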