
SecEmb: Sparsity-Aware Secure Federated Learning of On-Device Recommender System with Large Embedding

Abstract

Federated recommender systems (FedRec) have emerged as a solution to protect user data through collaborative training techniques. A typical FedRec transmits the full model and the entire weight update between edge devices and the server, placing a significant burden on devices with limited bandwidth and computational power. While the sparsity of embedding updates offers an opportunity for payload optimization, existing sparsity-aware federated protocols generally sacrifice privacy for efficiency. A key challenge in designing a secure and efficient sparsity-aware protocol is to hide the rated item indices from the server. In this paper, we propose SecEmb, a lossless secure federated recommendation protocol that operates on sparse embedding updates. SecEmb reduces user payload while ensuring that the server learns nothing about either the rated item indices or individual updates beyond the aggregated model. The protocol consists of two correlated modules: (1) a privacy-preserving embedding retrieval module that allows users to download relevant embeddings from the server, and (2) an update aggregation module that securely aggregates updates at the server. Empirical analysis demonstrates that SecEmb reduces both download and upload communication costs by up to 90x and decreases user-side computation time by up to 70x compared with secure FedRec protocols. Additionally, it offers non-negligible utility advantages over lossy message compression methods.
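To make the sparsity the abstract refers to concrete, the sketch below (not SecEmb's actual protocol; all names and sizes are illustrative assumptions) shows why each user's true embedding update touches only the few rows of rated items, and why a naive pairwise-mask secure aggregation hides those indices only by padding every payload to the full embedding table, which is exactly the communication overhead a sparsity-aware secure protocol aims to avoid.

```python
import numpy as np

# Toy illustration only: a baseline pairwise-mask secure aggregation over a
# full embedding table, contrasted with the sparse structure of local updates.

NUM_ITEMS, DIM = 1000, 8          # hypothetical embedding table: NUM_ITEMS x DIM
rng = np.random.default_rng(0)

def sparse_update(rated_items):
    """Local update: nonzero rows only at the user's rated item indices."""
    upd = np.zeros((NUM_ITEMS, DIM))
    upd[rated_items] = rng.normal(size=(len(rated_items), DIM))
    return upd

def pairwise_mask(seed, shape, sign):
    """Mask derived from a seed shared by a user pair; cancels in the sum."""
    return sign * np.random.default_rng(seed).normal(size=shape)

users = {0: [3, 17, 42], 1: [42, 99], 2: [5, 17]}   # rated item indices
updates = {u: sparse_update(items) for u, items in users.items()}

# Each pair (u, v) with u < v shares a seed; u adds the mask, v subtracts it.
masked = {u: upd.copy() for u, upd in updates.items()}
for u in users:
    for v in users:
        if u < v:
            seed = 1000 * u + v   # stand-in for a securely agreed pairwise seed
            masked[u] += pairwise_mask(seed, (NUM_ITEMS, DIM), +1)
            masked[v] += pairwise_mask(seed, (NUM_ITEMS, DIM), -1)

# The server sums the masked payloads: masks cancel, leaving only the aggregate.
aggregate = sum(masked.values())
assert np.allclose(aggregate, sum(updates.values()))

# Cost of the naive approach: each masked payload is dense (NUM_ITEMS x DIM
# floats) even though each true update touches only 2-3 rows.
```

The sketch only conveys the tension between hiding rated indices and exploiting sparsity; SecEmb's retrieval and aggregation modules resolve it without padding to the full table, as described in the paper.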

@article{mai2025_2505.12453,
  title={SecEmb: Sparsity-Aware Secure Federated Learning of On-Device Recommender System with Large Embedding},
  author={Peihua Mai and Youlong Ding and Ziyan Lyu and Minxin Du and Yan Pang},
  journal={arXiv preprint arXiv:2505.12453},
  year={2025}
}