Convergence Analysis of Inexact Randomized Iterative Methods

19 March 2019
Nicolas Loizou, Peter Richtárik

Papers citing "Convergence Analysis of Inexact Randomized Iterative Methods"

8 / 8 papers shown

Single-Call Stochastic Extragradient Methods for Structured Non-monotone Variational Inequalities: Improved Analysis under Weaker Conditions
S. Choudhury, Eduard A. Gorbunov, Nicolas Loizou
27 Feb 2023

Deep Unlearning via Randomized Conditionally Independent Hessians
Ronak R. Mehta, Sourav Pal, Vikas Singh, Sathya Ravi
15 Apr 2022

Stochastic Gradient Descent-Ascent: Unified Theory and New Efficient Methods
Aleksandr Beznosikov, Eduard A. Gorbunov, Hugo Berard, Nicolas Loizou
15 Feb 2022

Stochastic Extragradient: General Analysis and Improved Rates
Eduard A. Gorbunov, Hugo Berard, Gauthier Gidel, Nicolas Loizou
16 Nov 2021

Stochastic Mirror Descent: Convergence Analysis and Adaptive Variants via the Mirror Stochastic Polyak Stepsize
Ryan D'Orazio, Nicolas Loizou, I. Laradji, Ioannis Mitliagkas
28 Oct 2021

SGD for Structured Nonconvex Functions: Learning Rates, Minibatching and Interpolation
Robert Mansel Gower, Othmane Sebbouh, Nicolas Loizou
18 Jun 2020

Stochastic Polyak Step-size for SGD: An Adaptive Learning Rate for Fast Convergence
Nicolas Loizou, Sharan Vaswani, I. Laradji, Simon Lacoste-Julien
24 Feb 2020

Momentum and Stochastic Momentum for Stochastic Gradient, Newton, Proximal Point and Subspace Descent Methods
Nicolas Loizou, Peter Richtárik
27 Dec 2017