The convergence of the Stochastic Gradient Descent (SGD): a self-contained proof

26 March 2021
Gabriel Turinici
LRM
arXiv: 2103.14350 (PDF, HTML)
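
The paper gives a self-contained proof that SGD converges when the step sizes decrease at a suitable rate. As a hedged illustration of the iteration the title refers to, here is a minimal sketch, assuming a least-squares objective and a 1/k step schedule; the model and the schedule are chosen for the example, not taken from the paper.

```python
import numpy as np

# Minimal SGD sketch on f(w) = E[(x.w - y)^2] / 2.
# Assumptions for illustration (not from the paper): Gaussian data and
# a 1/k step schedule satisfying the classical Robbins-Monro conditions
# (steps sum to infinity, squared steps sum to a finite value).

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])   # target parameters
w = np.zeros(2)                  # SGD iterate

for k in range(1, 10_001):
    x = rng.normal(size=2)                # draw one sample
    y = x @ w_true + 0.1 * rng.normal()   # noisy observation
    grad = (x @ w - y) * x                # stochastic gradient at w
    w -= (1.0 / k) * grad                 # decreasing step size

print(w)  # tends toward w_true as the number of iterations grows
```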

Papers citing "The convergence of the Stochastic Gradient Descent (SGD): a self-contained proof"

6 papers shown:

1. Optimal time sampling in physics-informed neural networks
   Gabriel Turinici · PINN · 29 Apr 2024
2. Learning Continually on a Sequence of Graphs -- The Dynamical System Way
   Krishnan Raghavan, Prasanna Balaprakash · 19 May 2023
3. A Scale-Independent Multi-Objective Reinforcement Learning with Convergence Analysis
   Mohsen Amidzadeh · 08 Feb 2023
4. Prediction intervals for neural network models using weighted asymmetric loss functions
   Milo Grillo, Yunpeng Han, A. Werpachowska · 09 Oct 2022
5. Analysis of Kinetic Models for Label Switching and Stochastic Gradient Descent
   Martin Burger, Alex Rossi · 01 Jul 2022
6. A Bregman Learning Framework for Sparse Neural Networks
   Leon Bungert, Tim Roith, Daniel Tenbrinck, Martin Burger · 10 May 2021