Concentration of Non-Isotropic Random Tensors with Applications to Learning and Empirical Risk Minimization

4 February 2021
Mathieu Even
Laurent Massoulié
Abstract

Dimension is an inherent bottleneck in some modern learning tasks, where optimization methods suffer from the size of the data. In this paper, we study non-isotropic distributions of data and develop tools that aim at reducing these dimensional costs by a dependency on an effective dimension rather than the ambient one. Based on non-asymptotic estimates of the metric entropy of ellipsoids -- which prove to generalize to infinite dimensions -- and on a chaining argument, our uniform concentration bounds involve an effective dimension instead of the global dimension, improving over existing results. We show the importance of taking advantage of non-isotropic properties in learning problems with the following applications: i) we improve state-of-the-art results in statistical preconditioning for communication-efficient distributed optimization, ii) we introduce a non-isotropic randomized smoothing for non-smooth optimization. Both applications cover a class of functions that encompasses empirical risk minimization (ERM) for linear models.
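To illustrate the gap between ambient and effective dimension that the abstract refers to, the sketch below computes one common notion of effective dimension, tr(Σ)/‖Σ‖_op, for an isotropic covariance and for a covariance with a fast-decaying spectrum. Note this particular formula is an illustrative assumption and may differ from the precise definition used in the paper.

```python
import numpy as np

def effective_dimension(cov):
    """One common notion of effective dimension: tr(Sigma) / ||Sigma||_op.

    Equals the ambient dimension when Sigma is isotropic, and can be
    much smaller when the spectrum decays quickly.
    """
    eigvals = np.linalg.eigvalsh(cov)  # symmetric eigendecomposition
    return eigvals.sum() / eigvals.max()

d = 1000

# Isotropic covariance: every direction carries the same variance,
# so the effective dimension coincides with the ambient dimension d.
iso = np.eye(d)

# Non-isotropic covariance with eigenvalues 1/k^2: the trace stays
# bounded (≈ pi^2/6) while the ambient dimension grows, so the
# effective dimension remains O(1).
decay = np.diag(1.0 / (1.0 + np.arange(d)) ** 2)

print(effective_dimension(iso))    # 1000.0
print(effective_dimension(decay))  # ≈ 1.64, independent of d
```

Bounds that scale with this quantity rather than with d are what make the data's non-isotropy exploitable: for heavily anisotropic data the effective dimension can stay constant while d grows.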

@article{even2021_2102.04259,
  title={Concentration of Non-Isotropic Random Tensors with Applications to Learning and Empirical Risk Minimization},
  author={Mathieu Even and Laurent Massoulié},
  journal={arXiv preprint arXiv:2102.04259},
  year={2021}
}