ResearchTrend.AI
Generalization in Supervised Learning Through Riemannian Contraction

17 January 2022
L. Kozachkov, Patrick M. Wensing, Jean-Jacques E. Slotine

Papers citing "Generalization in Supervised Learning Through Riemannian Contraction"

5 / 5 papers shown
Generalization Bounds for Label Noise Stochastic Gradient Descent
Jung Eun Huh, Patrick Rebeschini
01 Nov 2023
Uniform-in-Time Wasserstein Stability Bounds for (Noisy) Stochastic Gradient Descent
Lingjiong Zhu, Mert Gurbuzbalaban, Anant Raj, Umut Simsekli
20 May 2023
Neural Networks Efficiently Learn Low-Dimensional Representations with SGD
Alireza Mousavi-Hosseini, Sejun Park, M. Girotti, Ioannis Mitliagkas, Murat A. Erdogdu
29 Sep 2022
Beyond Lipschitz: Sharp Generalization and Excess Risk Bounds for Full-Batch GD
Konstantinos E. Nikolakakis, Farzin Haddadpour, Amin Karbasi, Dionysios S. Kalogerias
26 Apr 2022
RNNs of RNNs: Recursive Construction of Stable Assemblies of Recurrent Neural Networks
L. Kozachkov, Michaela Ennis, Jean-Jacques E. Slotine
16 Jun 2021