A Distributed Newton Method for Large Scale Consensus Optimization

21 June 2016
Rasul Tutunov, Haitham Bou-Ammar, Ali Jadbabaie

Papers citing "A Distributed Newton Method for Large Scale Consensus Optimization"

6 citing papers shown:
1. Newton Method over Networks is Fast up to the Statistical Precision
   Amir Daneshmand, G. Scutari, Pavel Dvurechensky, Alexander Gasnikov
   12 Feb 2021

2. Efficient Semi-Implicit Variational Inference
   Vincent Moens, Hang Ren, A. Maraval, Rasul Tutunov, Jun Wang, H. Ammar
   15 Jan 2021

3. Compositional ADAM: An Adaptive Compositional Solver
   Rasul Tutunov, Minne Li, Alexander I. Cowen-Rivers, Jun Wang, Haitham Bou-Ammar
   10 Feb 2020

4. DINGO: Distributed Newton-Type Method for Gradient-Norm Optimization
   Rixon Crane, Fred Roosta
   16 Jan 2019

5. Optimal algorithms for smooth and strongly convex distributed optimization in networks
   Kevin Scaman, Francis R. Bach, Sébastien Bubeck, Y. Lee, Laurent Massoulié
   28 Feb 2017

6. Learning Task Grouping and Overlap in Multi-task Learning
   Abhishek Kumar, Hal Daumé
   27 Jun 2012