SHED: A Newton-type algorithm for federated learning based on incremental Hessian eigenvector sharing
Nicolò Dal Fabbro, S. Dey, M. Rossi, Luca Schenato
arXiv: 2202.05800 (v2) · 11 February 2022 · FedML
Papers citing "SHED: A Newton-type algorithm for federated learning based on incremental Hessian eigenvector sharing" (9 papers shown)

Accelerated Training of Federated Learning via Second-Order Methods
Mrinmay Sen, Sidhant R Nair, C Krishna Mohan · FedML · 29 May 2025

FLeNS: Federated Learning with Enhanced Nesterov-Newton Sketch
Sunny Gupta, Mohit Jindal, Pankhi Kashyap, Pranav Jeevan, Amit Sethi · FedML · 23 Sep 2024

Federated Cubic Regularized Newton Learning with Sparsification-amplified Differential Privacy
Wei Huo, Changxin Liu, Kemi Ding, Karl H. Johansson, Ling Shi · FedML · 08 Aug 2024

FedNS: A Fast Sketching Newton-Type Algorithm for Federated Learning
Jian Li, Yong Liu, Wei Wang, Haoran Wu, Weiping Wang · FedML · 05 Jan 2024

VREM-FL: Mobility-Aware Computation-Scheduling Co-Design for Vehicular Federated Learning
Luca Ballotta, Nicolò Dal Fabbro, Giovanni Perin, Luca Schenato, Michele Rossi, Giuseppe Piro · 30 Nov 2023

Distributed Adaptive Greedy Quasi-Newton Methods with Explicit Non-asymptotic Convergence Bounds
Yubo Du, Keyou You · 30 Nov 2023

FedZeN: Towards superlinear zeroth-order federated learning via incremental Hessian estimation
A. Maritan, S. Dey, Luca Schenato · FedML · 29 Sep 2023

Q-SHED: Distributed Optimization at the Edge via Hessian Eigenvectors Quantization
Nicolò Dal Fabbro, M. Rossi, Luca Schenato, S. Dey · 18 May 2023

Distributed Newton-Type Methods with Communication Compression and Bernoulli Aggregation
Rustem Islamov, Xun Qian, Slavomír Hanzely, M. Safaryan, Peter Richtárik · 07 Jun 2022