LipschitzLR: Using theoretically computed adaptive learning rates for fast convergence
arXiv:1902.07399 · 20 February 2019
Rahul Yedida, Snehanshu Saha, Tejas Prashanth
Available as: arXiv abstract · PDF · HTML
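As the title indicates, the paper derives the learning rate from a theoretical Lipschitz bound on the gradient rather than tuning it by hand. For context, here is a minimal illustrative sketch (not the paper's code) of that general idea on a least-squares problem, where the gradient of f(w) = (1/2n)‖Xw − y‖² is Lipschitz with constant L = λ_max(XᵀX)/n and the step size η = 1/L is a standard safe choice:

```python
import numpy as np

# Illustrative sketch only: for least-squares loss f(w) = (1/2n)||Xw - y||^2,
# the gradient X^T (Xw - y)/n is Lipschitz with constant L = lambda_max(X^T X)/n,
# so eta = 1/L is a theoretically justified step size (no manual tuning).

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
w_true = rng.normal(size=5)
y = X @ w_true

n = X.shape[0]
L = np.linalg.eigvalsh(X.T @ X / n).max()  # Lipschitz constant of the gradient
eta = 1.0 / L                              # computed, not hand-tuned, learning rate

w = np.zeros(5)
for _ in range(500):
    grad = X.T @ (X @ w - y) / n
    w -= eta * grad
```

With this step size, plain gradient descent recovers `w_true` on this noiseless problem; the same principle (bounding the gradient's Lipschitz constant to set the step size) is what the listed citing papers build on or compare against.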
Papers citing "LipschitzLR: Using theoretically computed adaptive learning rates for fast convergence" (5 papers):
1. Machine learning tools to improve nonlinear modeling parameters of RC columns
   Hamid Khodadadi Koodiani, Elahe Jafari, Arsalan Majlesi, Mohammad Shahin, A. Matamoros, A. Alaeddini (09 Mar 2023)

2. AdaSwarm: Augmenting Gradient-Based optimizers in Deep Learning with Swarm Intelligence
   Rohan Mohapatra, Snehanshu Saha, C. Coello, Anwesh Bhattacharya, S. Dhavala, S. Saha (19 May 2020)

3. CProp: Adaptive Learning Rate Scaling from Past Gradient Conformity
   Konpat Preechakul, B. Kijsirikul (24 Dec 2019)

4. Evolution of Novel Activation Functions in Neural Network Training with Applications to Classification of Exoplanets
   Snehanshu Saha, N. Nagaraj, Mathur Archana, Rahul Yedida (01 Jun 2019)

5. A disciplined approach to neural network hyper-parameters: Part 1 -- learning rate, batch size, momentum, and weight decay
   L. Smith (26 Mar 2018)