ResearchTrend.AI
Private Heterogeneous Federated Learning Without a Trusted Server Revisited: Error-Optimal and Communication-Efficient Algorithms for Convex Losses

12 July 2024
Changyu Gao
Andrew Lowy
Xingyu Zhou
Stephen J. Wright
    FedML

Papers citing "Private Heterogeneous Federated Learning Without a Trusted Server Revisited: Error-Optimal and Communication-Efficient Algorithms for Convex Losses"

Private Stochastic Optimization With Large Worst-Case Lipschitz Parameter: Optimal Rates for (Non-Smooth) Convex Losses and Extension to Non-Convex Losses
Andrew Lowy
Meisam Razaviyayn
15 Sep 2022