FedNano: Toward Lightweight Federated Tuning for Pretrained Multimodal Large Language Models

12 June 2025
Yao Zhang
Hewei Gao
Haokun Chen
Weiguo Li
Yunpu Ma
Volker Tresp
Main: 8 pages · 3 figures · 7 tables · Bibliography: 3 pages · Appendix: 1 page
Abstract

Multimodal Large Language Models (MLLMs) excel in tasks like multimodal reasoning and cross-modal retrieval but face deployment challenges in real-world scenarios due to distributed multimodal data and strict privacy requirements. Federated Learning (FL) offers a solution by enabling collaborative model training without centralizing data. However, realizing FL for MLLMs presents significant challenges, including high computational demands, limited client capacity, substantial communication costs, and heterogeneous client data. Existing FL methods assume client-side deployment of full models, an assumption that breaks down for large-scale MLLMs due to their massive size and communication demands. To address these limitations, we propose FedNano, the first FL framework that centralizes the LLM on the server while introducing NanoEdge, a lightweight module for client-specific adaptation. NanoEdge employs modality-specific encoders, connectors, and trainable NanoAdapters with low-rank adaptation. This design eliminates the need to deploy the LLM on clients, reducing client-side storage by 95% and limiting communication overhead to only 0.01% of the model parameters. By transmitting only compact NanoAdapter updates, FedNano handles heterogeneous client data and resource constraints while preserving privacy. Experiments demonstrate that FedNano outperforms prior FL baselines, bridging the gap between MLLM scale and FL feasibility, and enabling scalable, decentralized multimodal AI systems.
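The abstract describes trainable NanoAdapters based on low-rank adaptation, with only the compact adapter updates exchanged between clients and server. The sketch below illustrates that idea under stated assumptions: a LoRA-style residual adapter and plain FedAvg over adapter weights. The class names, dimensions, and aggregation rule are illustrative, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


class NanoAdapter(nn.Module):
    """Minimal low-rank (LoRA-style) adapter; illustrative only.

    Only the two small factor matrices are trained and communicated,
    which is where the parameter/communication savings come from.
    """

    def __init__(self, dim: int, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.down = nn.Linear(dim, rank, bias=False)  # dim -> rank
        self.up = nn.Linear(rank, dim, bias=False)    # rank -> dim
        self.scale = alpha / rank
        nn.init.normal_(self.down.weight, std=0.02)
        nn.init.zeros_(self.up.weight)  # start as a zero update (identity mapping)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual low-rank update added to the frozen backbone activation.
        return x + self.scale * self.up(self.down(x))


def aggregate_adapters(client_states: list[dict]) -> dict:
    """Server-side FedAvg over NanoAdapter weights only (hypothetical rule)."""
    merged = {}
    for key in client_states[0]:
        merged[key] = torch.stack([s[key] for s in client_states]).mean(dim=0)
    return merged
```

In a FedNano-style round, each client would train only its NanoAdapter against frozen encoders and connectors, upload the small low-rank matrices, and the server would merge them before the next round; the centralized LLM itself never leaves the server.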

@article{zhang2025_2506.14824,
  title={FedNano: Toward Lightweight Federated Tuning for Pretrained Multimodal Large Language Models},
  author={Yao Zhang and Hewei Gao and Haokun Chen and Weiguo Li and Yunpu Ma and Volker Tresp},
  journal={arXiv preprint arXiv:2506.14824},
  year={2025}
}