`local' vs. `global' parameters -- breaking the gaussian complexity barrier

9 April 2015
S. Mendelson

Papers citing "`local' vs. `global' parameters -- breaking the gaussian complexity barrier"

4 / 4 papers shown
Do we really need the Rademacher complexities?
Daniel Bartl, S. Mendelson
24 Feb 2025

Convergence rates of least squares regression estimators with heavy-tailed errors
Q. Han, J. Wellner
07 Jun 2017

Learning without Concentration for General Loss Functions
S. Mendelson
13 Oct 2014

Learning without Concentration
S. Mendelson
01 Jan 2014