Optimal Rates for Learning with Nyström Stochastic Gradient Methods

21 October 2017
Junhong Lin
Lorenzo Rosasco
arXiv:1710.07797

Papers citing "Optimal Rates for Learning with Nyström Stochastic Gradient Methods"

6 / 6 papers shown

  • NYTRO: When Subsampling Meets Early Stopping
    Tomás Angles, Raffaello Camoriano, Alessandro Rudi, Lorenzo Rosasco (19 Oct 2015)
  • Minimizing Finite Sums with the Stochastic Average Gradient
    Mark Schmidt, Nicolas Le Roux, Francis R. Bach (10 Sep 2013)
  • Revisiting the Nystrom Method for Improved Large-Scale Machine Learning
    Alex Gittens, Michael W. Mahoney (07 Mar 2013)
  • Stochastic Gradient Descent for Non-smooth Optimization: Convergence Results and Optimal Averaging Schemes
    Ohad Shamir, Tong Zhang (08 Dec 2012)
  • On Some Extensions of Bernstein's Inequality for Self-adjoint Operators
    Stanislav Minsker (22 Dec 2011)
  • Online Learning as Stochastic Approximation of Regularization Paths
    P. Tarres, Yuan Yao (29 Mar 2011)