LipschitzLR: Using theoretically computed adaptive learning rates for fast convergence

20 February 2019
Rahul Yedida, Snehanshu Saha, Tejas Prashanth
ODL

Papers citing "LipschitzLR: Using theoretically computed adaptive learning rates for fast convergence"

5 / 5 papers shown

Machine learning tools to improve nonlinear modeling parameters of RC columns
Hamid Khodadadi Koodiani, Elahe Jafari, Arsalan Majlesi, Mohammad Shahin, A. Matamoros, A. Alaeddini
09 Mar 2023

AdaSwarm: Augmenting Gradient-Based optimizers in Deep Learning with Swarm Intelligence
Rohan Mohapatra, Snehanshu Saha, C. Coello, Anwesh Bhattacharya, S. Dhavala, S. Saha
ODL
19 May 2020

CProp: Adaptive Learning Rate Scaling from Past Gradient Conformity
Konpat Preechakul, B. Kijsirikul
ODL
24 Dec 2019

Evolution of Novel Activation Functions in Neural Network Training with Applications to Classification of Exoplanets
Snehanshu Saha, N. Nagaraj, Mathur Archana, Rahul Yedida
01 Jun 2019

A disciplined approach to neural network hyper-parameters: Part 1 -- learning rate, batch size, momentum, and weight decay
L. Smith
26 Mar 2018