HiGrad: Uncertainty Quantification for Online Learning and Stochastic Approximation
Weijie J. Su, Yuancheng Zhu
13 February 2018

Papers citing "HiGrad: Uncertainty Quantification for Online Learning and Stochastic Approximation"

5 / 5 papers shown

  1. An Analysis of Constant Step Size SGD in the Non-convex Regime: Asymptotic Normality and Bias
     Lu Yu, Krishnakumar Balasubramanian, S. Volgushev, Murat A. Erdogdu
     79 · 50 · 0 · 14 Jun 2020

  2. Statistical Inference for Model Parameters in Stochastic Gradient Descent
     Xi Chen, Jason D. Lee, Xin T. Tong, Yichen Zhang
     51 · 138 · 0 · 27 Oct 2016

  3. SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives
     Aaron Defazio, Francis R. Bach, Simon Lacoste-Julien
     ODL
     112 · 1,817 · 0 · 01 Jul 2014

  4. Degrees of freedom in lasso problems
     Robert Tibshirani, Jonathan E. Taylor
     100 · 363 · 0 · 02 Nov 2011

  5. Information-theoretic lower bounds on the oracle complexity of stochastic convex optimization
     Alekh Agarwal, Peter L. Bartlett, Pradeep Ravikumar, Martin J. Wainwright
     155 · 248 · 0 · 03 Sep 2010