ResearchTrend.AI
A Distributed Flexible Delay-tolerant Proximal Gradient Algorithm
arXiv: 1806.09429

25 June 2018
Konstantin Mishchenko, F. Iutzeler, J. Malick

Papers citing "A Distributed Flexible Delay-tolerant Proximal Gradient Algorithm"

3 papers shown:
Optimization for Supervised Machine Learning: Randomized Algorithms for Data and Parameters
Filip Hanzely
26 Aug 2020
99% of Distributed Optimization is a Waste of Time: The Issue and How to Fix it
Konstantin Mishchenko, Filip Hanzely, Peter Richtárik
27 Jan 2019
Surpassing Gradient Descent Provably: A Cyclic Incremental Method with Linear Convergence Rate
Aryan Mokhtari, Mert Gurbuzbalaban, Alejandro Ribeiro
01 Nov 2016