MixedGrad: An O(1/T) Convergence Rate Algorithm for Stochastic Smooth Optimization

arXiv:1307.7192 · 26 July 2013
M. Mahdavi, R. L. Jin
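
The title places MixedGrad in the family of epoch-based methods that mix occasional full (deterministic) gradients with cheap stochastic gradients, the same family as the SVRG, semi-stochastic, and SAG papers cited below. As a hedged illustration of that shared idea only, here is a minimal SVRG-style sketch; it is not the MixedGrad update itself (which is specified in arXiv:1307.7192), and the names grad_i, inner_steps, and lr are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def svrg_style_epochs(grad_i, x0, n, epochs=20, inner_steps=100, lr=0.1):
    """Epoch-based variance-reduction sketch (SVRG-style).

    grad_i(x, i): gradient of the i-th component function at x.
    n: number of component functions in the finite sum.
    Illustrates the "mix full and stochastic gradients" idea shared by
    MixedGrad, SVRG, and related methods; NOT the MixedGrad algorithm.
    """
    x = x0.copy()
    rng = np.random.default_rng(0)
    for _ in range(epochs):
        anchor = x.copy()
        # Full (deterministic) gradient, computed once per epoch.
        full_grad = np.mean([grad_i(anchor, i) for i in range(n)], axis=0)
        for _ in range(inner_steps):
            i = rng.integers(n)
            # Stochastic gradient corrected at the anchor: unbiased, and
            # its variance shrinks as x approaches the anchor point.
            g = grad_i(x, i) - grad_i(anchor, i) + full_grad
            x = x - lr * g
    return x
```

The per-epoch full gradient is what distinguishes this family from plain SGD: the correction term lets the method use a constant step size while still converging, at the cost of periodic full passes over the data.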

Papers citing "MixedGrad: An O(1/T) Convergence Rate Algorithm for Stochastic Smooth Optimization" (5 of 5 papers shown)

SVRG Meets AdaGrad: Painless Variance Reduction
Benjamin Dubois-Taine, Sharan Vaswani, Reza Babanezhad, Mark W. Schmidt, Simon Lacoste-Julien
18 Feb 2021

Semi-Stochastic Gradient Descent Methods
Jakub Konecný, Peter Richtárik
05 Dec 2013

Minimizing Finite Sums with the Stochastic Average Gradient
Mark W. Schmidt, Nicolas Le Roux, Francis R. Bach
10 Sep 2013

Stochastic Gradient Descent for Non-smooth Optimization: Convergence Results and Optimal Averaging Schemes
Ohad Shamir, Tong Zhang
08 Dec 2012 (see the averaging sketch after this list)

Optimal Distributed Online Prediction using Mini-Batches
O. Dekel, Ran Gilad-Bachrach, Ohad Shamir, Lin Xiao
07 Dec 2010

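Of the papers above, Shamir and Zhang's concerns how the choice of iterate-averaging scheme affects SGD's convergence rate. To my recollection, one scheme they analyze is polynomial-decay averaging, which weights later iterates more heavily than a uniform average; the sketch below is a hedged reconstruction of that recurrence, and the function name and the eta default are illustrative assumptions, not taken from the paper.

```python
def polynomial_decay_average(iterates, eta=3.0):
    """Polynomial-decay running average of SGD iterates (sketch).

    Recurrence: avg_t = (1 - w_t) * avg_{t-1} + w_t * x_t,
    with w_t = (eta + 1) / (t + eta). Setting eta = 0 recovers the
    standard uniform running average; eta > 0 upweights late iterates.
    """
    avg = None
    for t, x in enumerate(iterates, start=1):
        w = (eta + 1.0) / (t + eta)
        # At t = 1, w = 1, so the average starts at the first iterate.
        avg = x if avg is None else (1.0 - w) * avg + w * x
    return avg
```

Upweighting late iterates matters because, for strongly convex non-smooth problems, a uniform average over all iterates can be dominated by the poor early ones, while decayed weights recover the faster rate.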