Stability and Sharper Risk Bounds with Convergence Rate O(1/n^2)

13 October 2024
Bowei Zhu
Shaojie Li
Yong Liu
arXiv:2410.09766
Abstract

The sharpest known high-probability excess risk bounds are of order up to O(1/n) for empirical risk minimization and projected gradient descent via algorithmic stability (Klochkov & Zhivotovskiy, 2021). In this paper, we show that high-probability excess risk bounds of order up to O(1/n^2) are possible. We discuss how high-probability excess risk bounds reach O(1/n^2) under strong convexity, smoothness, and Lipschitz continuity assumptions for empirical risk minimization, projected gradient descent, and stochastic gradient descent. Moreover, to the best of our knowledge, our high-probability results on the generalization gap measured by gradients for nonconvex problems are also the sharpest.
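
As a rough illustration of the setting the abstract describes (not code from the paper), here is a minimal sketch of projected gradient descent on a smooth, strongly convex empirical risk; the projection set, toy objective, and step size are illustrative assumptions.

```python
import numpy as np

def project_onto_ball(w, radius=1.0):
    """Euclidean projection onto the L2 ball of the given radius."""
    norm = np.linalg.norm(w)
    return w if norm <= radius else w * (radius / norm)

def projected_gradient_descent(grad, w0, step_size, n_steps, radius=1.0):
    """Iterate w_{t+1} = Proj_C(w_t - eta * grad(w_t)) over the ball C."""
    w = np.array(w0, dtype=float)
    for _ in range(n_steps):
        w = project_onto_ball(w - step_size * grad(w), radius)
    return w

# Toy empirical risk over n = 100 samples in R^5:
# R_hat(w) = (1/n) * sum_i ||w - x_i||^2, which is 2-strongly convex and
# 2-smooth, so the assumptions named in the abstract hold on the ball.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
grad = lambda w: 2.0 * (w - X.mean(axis=0))   # gradient of R_hat at w
w_hat = projected_gradient_descent(grad, np.zeros(5), step_size=0.1, n_steps=200)
print(w_hat)  # approximately the projection of the sample mean onto the ball
```

The step size 0.1 is below 1/L for the smoothness constant L = 2 of this toy objective, so the iterates converge to the constrained empirical risk minimizer.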
