A Novel Stochastic Stratified Average Gradient Method: Convergence Rate and Its Complexity

21 October 2017
Aixiang Chen, Bingchuan Chen, Xiaolong Chai, Rui-Ling Bian, Hengguang Li
arXiv: 1710.07783
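
This listing does not reproduce the method itself. As a rough orientation, the title suggests combining a stochastic-average-gradient update (as in the SAG and SAGA papers cited below) with stratified sampling of the training set (as in Zhao and Zhang below). The sketch that follows is only an illustration under those assumptions, not the paper's actual algorithm; the function names, the choice of strata, and the step size are hypothetical.

```python
import numpy as np

def stratified_sag_step(w, X, y, grad_fn, strata, memory, lr):
    """One illustrative update: draw one index per stratum, refresh the
    stored per-sample gradient for each drawn index, then step along the
    average of all stored gradients (a SAG-style running average)."""
    for idx in strata:                       # strata: list of index arrays
        i = np.random.choice(idx)            # one sample from this stratum
        memory[i] = grad_fn(w, X[i], y[i])   # refresh its stored gradient
    return w - lr * memory.mean(axis=0)      # average of stored gradients

# --- tiny least-squares demo (hypothetical setup, not from the paper) ---
rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=n)

def sq_grad(w, x_i, y_i):
    # gradient of 0.5 * (x_i^T w - y_i)^2 with respect to w
    return (x_i @ w - y_i) * x_i

# Stratify indices by the magnitude of the target (one simple choice).
order = np.argsort(y)
strata = np.array_split(order, 4)

w = np.zeros(d)
memory = np.zeros((n, d))                    # stored per-sample gradients
for _ in range(500):
    w = stratified_sag_step(w, X, y, sq_grad, strata, memory, lr=0.05)

print("distance to w_true:", np.linalg.norm(w - w_true))
```

The stratification here only determines which per-sample gradients get refreshed at each step; the update direction is the plain average of the gradient memory, which is what distinguishes SAG-style methods from vanilla mini-batch SGD.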

Papers citing "A Novel Stochastic Stratified Average Gradient Method: Convergence Rate and Its Complexity"

10 papers shown:

  1. DeepSign: Deep Learning for Automatic Malware Signature Generation and Classification
     Omid David, N. Netanyahu (21 Nov 2017)
  2. Predicting Deeper into the Future of Semantic Segmentation
     Pauline Luc, Natalia Neverova, Camille Couprie, Jakob Verbeek, Yann LeCun (22 Mar 2017)
  3. An overview of gradient descent optimization algorithms
     Sebastian Ruder (15 Sep 2016)
  4. Very Deep Convolutional Networks for Text Classification
     Alexis Conneau, Holger Schwenk, Loïc Barrault, Yann LeCun (06 Jun 2016)
  5. Very Deep Multilingual Convolutional Neural Networks for LVCSR
     Tom Sercu, Christian Puhrsch, Brian Kingsbury, Yann LeCun (29 Sep 2015)
  6. Very Deep Convolutional Networks for Large-Scale Image Recognition
     Karen Simonyan, Andrew Zisserman (04 Sep 2014)
  7. SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives
     Aaron Defazio, Francis R. Bach, Simon Lacoste-Julien (01 Jul 2014)
  8. Accelerating Minibatch Stochastic Gradient Descent using Stratified Sampling
     P. Zhao, Tong Zhang (13 May 2014)
  9. Minimizing Finite Sums with the Stochastic Average Gradient
     Mark Schmidt, Nicolas Le Roux, Francis R. Bach (10 Sep 2013)
  10. Better Mini-Batch Algorithms via Accelerated Gradient Methods
      Andrew Cotter, Ohad Shamir, Nathan Srebro, Karthik Sridharan (22 Jun 2011)