
FLuID: Mitigating Stragglers in Federated Learning using Invariant Dropout

5 July 2023
Irene Wang, Prashant J. Nair, Divyat Mahajan

Papers citing "FLuID: Mitigating Stragglers in Federated Learning using Invariant Dropout"

3 papers
Workload-Aware Hardware Accelerator Mining for Distributed Deep Learning Training
Muhammad Adnan, Amar Phanishayee, Janardhan Kulkarni, Prashant J. Nair, Divyat Mahajan
23 Apr 2024
Federated Learning Challenges and Opportunities: An Outlook
Jie Ding, Eric W. Tramel, Anit Kumar Sahu, Shuang Wu, Salman Avestimehr, Tao Zhang
FedML
01 Feb 2022
FjORD: Fair and Accurate Federated Learning under heterogeneous targets with Ordered Dropout
Samuel Horváth, Stefanos Laskaridis, Mario Almeida, Ilias Leondiadis, Stylianos I. Venieris, Nicholas D. Lane
26 Feb 2021