Think Locally, Act Globally: Federated Learning with Local and Global Representations

Abstract

Federated learning is an emerging research paradigm for training models on private data distributed over multiple devices. A key challenge is keeping device data private while training a global model only by communicating parameters and updates. Given the recent trend towards building larger models, deploying models in federated settings on real-world tasks is becoming increasingly difficult. To this end, we propose to augment federated learning with local representation learning on each device to learn useful and compact representations from raw data. As a result, the global model can be smaller since it only operates on local representations, reducing the number of communicated parameters. In addition, we show that local models provide flexibility in dealing with heterogeneous data and can be modified to learn fair representations that obfuscate protected attributes such as race, age, and gender. Finally, we support our empirical results with a theoretical analysis showing that a combination of local and global models reduces both the variance in the data and the variance in data distributions across devices.
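The core idea of the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes linear local encoders and a linear global head (all names and dimensions here are hypothetical) and shows that only the small global model is communicated and averaged, while each device's encoder stays private.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 devices, raw inputs of dim 10, local
# representations of dim 4, regression target from a global head.
NUM_DEVICES, RAW_DIM, REP_DIM = 3, 10, 4

# Each device keeps a private local encoder (never communicated).
local_encoders = [
    rng.normal(size=(RAW_DIM, REP_DIM)) / np.sqrt(RAW_DIM)
    for _ in range(NUM_DEVICES)
]

# The global model operates only on compact local representations,
# so only REP_DIM parameters (not RAW_DIM) are communicated per round.
global_head = np.zeros(REP_DIM)

def local_update(encoder, head, x, y, lr=0.05, steps=10):
    """One device's round: encode raw data locally, then take a few
    gradient steps on a squared loss w.r.t. the global head only."""
    h = x @ encoder                      # local representation
    w = head.copy()
    for _ in range(steps):
        grad = h.T @ (h @ w - y) / len(y)
        w -= lr * grad
    return w

# One communication round: server averages only the global heads.
updates = []
for enc in local_encoders:
    x = rng.normal(size=(20, RAW_DIM))   # device-private raw data
    y = rng.normal(size=20)
    updates.append(local_update(enc, global_head, x, y))
global_head = np.mean(updates, axis=0)

print(global_head.shape)  # only the small global model moves
```

Under these assumptions, each round communicates a REP_DIM-sized vector instead of a model over the raw input, matching the abstract's claim that the global model can be smaller.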
