A Resizable Mini-batch Gradient Descent based on a Multi-Armed Bandit
arXiv:1711.06424 · 17 November 2017
S. Cho, Sunghun Kang, Chang D. Yoo
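The title suggests treating the choice of mini-batch size as a multi-armed bandit problem. Below is a minimal, hypothetical sketch of that general idea only — an ε-greedy bandit selecting among candidate batch sizes on a toy quadratic objective, with a cost-aware reward (loss decrease per example processed). The arm set, reward definition, and objective are illustrative assumptions, not the paper's actual algorithm.

```python
import random

def train(steps=500, batch_sizes=(32, 64, 128, 256), epsilon=0.1, lr=0.1, seed=0):
    """Toy sketch: pick a batch size per step with an epsilon-greedy bandit.

    Assumptions (not from the paper): reward is the loss decrease divided
    by the batch size, so larger batches must 'pay' for their extra cost.
    """
    rng = random.Random(seed)
    w = 5.0                                   # scalar parameter; loss = w**2
    counts = {b: 0 for b in batch_sizes}      # pulls per arm
    values = {b: 0.0 for b in batch_sizes}    # running mean reward per arm

    for _ in range(steps):
        # Epsilon-greedy arm selection over candidate batch sizes.
        if rng.random() < epsilon:
            b = rng.choice(batch_sizes)
        else:
            b = max(batch_sizes, key=lambda a: values[a])

        loss_before = w * w
        # Noisy gradient of w**2; noise shrinks as the batch size grows,
        # mimicking lower-variance gradient estimates from larger batches.
        grad = 2 * w + rng.gauss(0.0, 4.0 / (b ** 0.5))
        w -= lr * grad
        loss_after = w * w

        # Cost-aware reward: improvement per example processed.
        reward = (loss_before - loss_after) / b
        counts[b] += 1
        values[b] += (reward - values[b]) / counts[b]

    return w, counts

final_w, pulls = train()
print(final_w, pulls)
```

Under these assumptions the bandit trades off gradient noise against per-step cost; the actual paper should be consulted for its real arm set, reward, and update rule.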

Papers citing "A Resizable Mini-batch Gradient Descent based on a Multi-Armed Bandit"

5 papers shown:
Three Factors Influencing Minima in SGD
Stanislaw Jastrzebski, Zachary Kenton, Devansh Arpit, Nicolas Ballas, Asja Fischer, Yoshua Bengio, Amos Storkey
13 Nov 2017
Don't Decay the Learning Rate, Increase the Batch Size
Samuel L. Smith, Pieter-Jan Kindermans, Chris Ying, Quoc V. Le
01 Nov 2017
Striving for Simplicity: The All Convolutional Net
Jost Tobias Springenberg, Alexey Dosovitskiy, Thomas Brox, Martin Riedmiller
21 Dec 2014
Practical recommendations for gradient-based training of deep architectures
Yoshua Bengio
24 Jun 2012
Hybrid Deterministic-Stochastic Methods for Data Fitting
M. Friedlander, Mark Schmidt
13 Apr 2011