Exploiting Adam-like Optimization Algorithms to Improve the Performance of Convolutional Neural Networks
arXiv:2103.14689 · 26 March 2021
L. Nanni, Gianluca Maguolo, A. Lumini
Topics: ODL, MedIm
Papers citing "Exploiting Adam-like Optimization Algorithms to Improve the Performance of Convolutional Neural Networks" (7 of 7 papers shown)
Towards Theoretically Understanding Why SGD Generalizes Better Than ADAM in Deep Learning
Pan Zhou, Jiashi Feng, Chao Ma, Caiming Xiong, Guosheng Lin, E. Weinan
12 Oct 2020 · 231 citations

Deep Ensembles: A Loss Landscape Perspective
Stanislav Fort, Huiyi Hu, Balaji Lakshminarayanan
Topics: OOD, UQCV
05 Dec 2019 · 624 citations

diffGrad: An Optimization Method for Convolutional Neural Networks
S. Dubey, Soumendu Chakraborty, Swalpa Kumar Roy, Snehasis Mukherjee, S. Singh, B. B. Chaudhuri
Topics: ODL
12 Sep 2019 · 186 citations

Improving Generalization Performance by Switching from Adam to SGD
N. Keskar, R. Socher
Topics: ODL
20 Dec 2017 · 522 citations

Cyclical Learning Rates for Training Neural Networks
L. Smith
Topics: ODL
03 Jun 2015 · 2,515 citations

FaceNet: A Unified Embedding for Face Recognition and Clustering
Florian Schroff, Dmitry Kalenichenko, James Philbin
Topics: 3DH
12 Mar 2015 · 13,079 citations

ADADELTA: An Adaptive Learning Rate Method
Matthew D. Zeiler
Topics: ODL
22 Dec 2012 · 6,619 citations