Statistical Theory for High-Dimensional Models

30 September 2014
Sara van de Geer
arXiv:1409.8557
Abstract

These lecture notes consist of three chapters. In the first chapter we present oracle inequalities for the prediction error of the Lasso and square-root Lasso and briefly describe the scaled Lasso. In the second chapter we establish asymptotic linearity of a de-sparsified Lasso. This implies asymptotic normality under certain conditions and therefore can be used to construct confidence intervals for parameters of interest. A similar line of reasoning can be invoked to derive bounds in sup-norm for the Lasso and asymptotic linearity of de-sparsified estimators of a precision matrix. In the third chapter we consider chaining and the more general generic chaining method developed by Talagrand. This allows one to bound suprema of random processes. Concentration inequalities are refined probability inequalities, mostly again for suprema of random processes. We combine the two. We prove a deviation inequality directly using (generic) chaining.
