Inexact SARAH Algorithm for Stochastic Optimization
arXiv:1811.10105, 25 November 2018
Lam M. Nguyen, K. Scheinberg, Martin Takáč
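For context, SARAH maintains a recursive stochastic gradient estimate: each outer loop starts from a full gradient, and inner steps update the estimate via v_t = ∇f_i(w_t) − ∇f_i(w_{t−1}) + v_{t−1}. Below is a minimal sketch of this recursion, assuming a finite-sum objective given through a per-component gradient oracle; the function name `sarah`, the oracle `grad_i`, and all step-size/loop parameters are illustrative choices, not taken from the paper. The inexact variant studied in the paper replaces the exact full-gradient step with an inexact (e.g., large mini-batch) estimate.

```python
import numpy as np

def sarah(grad_i, w0, n, step_size=0.01, outer_iters=10, inner_iters=100, rng=None):
    """Sketch of the SARAH recursion.

    grad_i(w, i) returns the gradient of the i-th component f_i at w;
    n is the number of components in the finite sum.
    """
    rng = np.random.default_rng() if rng is None else rng
    w = w0.copy()
    for _ in range(outer_iters):
        # Full gradient at the start of each outer loop; the inexact
        # variant would use a stochastic estimate here instead.
        v = np.mean([grad_i(w, i) for i in range(n)], axis=0)
        w_prev = w.copy()
        w = w - step_size * v
        for _ in range(inner_iters):
            i = rng.integers(n)
            # SARAH recursion: v_t = grad f_i(w_t) - grad f_i(w_{t-1}) + v_{t-1}
            v = grad_i(w, i) - grad_i(w_prev, i) + v
            w_prev = w.copy()
            w = w - step_size * v
    return w
```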

Papers citing "Inexact SARAH Algorithm for Stochastic Optimization" (8 papers)
Variance Reduction Methods Do Not Need to Compute Full Gradients: Improved Efficiency through Shuffling
Daniil Medyakov, Gleb Molodtsov, S. Chezhegov, Alexey Rebrikov, Aleksandr Beznosikov
21 Feb 2025
Sarah Frank-Wolfe: Methods for Constrained Optimization with Best Rates and Practical Features
Aleksandr Beznosikov, David Dobre, Gauthier Gidel
23 Apr 2023
Zeroth-Order Alternating Gradient Descent Ascent Algorithms for a Class of Nonconvex-Nonconcave Minimax Problems
Zi Xu, Ziqi Wang, Junlin Wang, Y. Dai
24 Nov 2022
SARAH-based Variance-reduced Algorithm for Stochastic Finite-sum Cocoercive Variational Inequalities
Aleksandr Beznosikov, Alexander Gasnikov
12 Oct 2022
Training Structured Neural Networks Through Manifold Identification and Variance Reduction
Zih-Syuan Huang, Ching-pei Lee
05 Dec 2021
Random-reshuffled SARAH does not need a full gradient computations
Aleksandr Beznosikov, Martin Takáč
26 Nov 2021
ProxSARAH: An Efficient Algorithmic Framework for Stochastic Composite Nonconvex Optimization
Nhan H. Pham, Lam M. Nguyen, Dzung Phan, Quoc Tran-Dinh
15 Feb 2019
Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition
Hamed Karimi, J. Nutini, Mark W. Schmidt
16 Aug 2016