ResearchTrend.AI

Accelerating Minibatch Stochastic Gradient Descent using Stratified Sampling (arXiv:1405.3080)

13 May 2014
P. Zhao, Tong Zhang
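The title above names the technique: draw each minibatch by stratified sampling rather than uniformly, so that gradient-estimate variance drops when strata are internally homogeneous. As a rough illustration only (not the authors' algorithm from the paper), here is a minimal sketch for a 1-D least-squares objective; the function name, toy data, and all hyperparameters are made up for this example.

```python
import random

def stratified_minibatch_sgd(data, strata, lr=0.01, batch_size=8, epochs=50, seed=0):
    """Minibatch SGD for 1-D least squares (loss (w*x - y)^2).

    Each minibatch takes a proportional share from every stratum instead of
    sampling uniformly over the whole dataset. With proportional allocation
    the gradient estimate stays unbiased, and its variance is lower whenever
    points within a stratum are more alike than points across strata.
    (Illustrative sketch, not the paper's algorithm.)
    """
    rng = random.Random(seed)
    n = len(data)
    w = 0.0
    for _ in range(epochs):
        batch = []
        for stratum in strata:
            # proportional allocation: stratum of size m contributes ~m/n of the batch
            k = max(1, round(batch_size * len(stratum) / n))
            batch.extend(rng.choices(stratum, k=k))
        grad = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
        w -= lr * grad
    return w

# toy data: y = 3x, with two strata split by magnitude of x
small = [(x / 10, 3 * x / 10) for x in range(1, 11)]
large = [(float(x), 3.0 * x) for x in range(1, 11)]
w_hat = stratified_minibatch_sgd(small + large, [small, large])
```

On this toy problem the iterate contracts toward the true slope `w = 3`; stratifying by `|x|` keeps every minibatch from being dominated by either the large- or small-magnitude points.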

Papers citing "Accelerating Minibatch Stochastic Gradient Descent using Stratified Sampling"

5 of 5 citing papers shown:
Optimizing Chain-of-Thought Reasoners via Gradient Variance Minimization in Rejection Sampling and RL
Jiarui Yao, Yifan Hao, Hanning Zhang, Hanze Dong, Wei Xiong, Nan Jiang, Tong Zhang
LRM
05 May 2025
Importance Sampling for Minibatches
Dominik Csiba, Peter Richtárik
06 Feb 2016
Stochastic Optimization with Importance Sampling
P. Zhao, Tong Zhang
13 Jan 2014
Stochastic Gradient Descent for Non-smooth Optimization: Convergence Results and Optimal Averaging Schemes
Ohad Shamir, Tong Zhang
08 Dec 2012
Making Gradient Descent Optimal for Strongly Convex Stochastic Optimization
Alexander Rakhlin, Ohad Shamir, Karthik Sridharan
26 Sep 2011