Sharp Oracle inequalities for square root regularization

14 September 2015
Benjamin Stucky, Sara van de Geer
Abstract

We study a set of regularization methods for high dimensional linear regression models. These penalized estimators use the square root of the residual sum of squared errors as the loss function and any weakly decomposable norm as the penalty function. This loss is chosen because the resulting estimator does not depend on the unknown standard deviation of the noise. A generalized weakly decomposable norm penalty is very useful for dealing with different underlying sparsity structures: we can choose a different sparsity-inducing norm depending on how we want to interpret the unknown parameter vector β. Structured sparsity norms, as defined in Micchelli et al. [12], are special cases in the set of weakly decomposable norms, so this collection also includes the Square Root LASSO (Belloni et al. [2]), the Group Square Root LASSO (Bunea et al. [8]) and the Square Root SLOPE (Bogdan et al. [4]). For this collection of estimators we provide sharp oracle inequalities, derive the Karush-Kuhn-Tucker conditions, and discuss some examples of estimators.
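
A minimal sketch of the estimator class the abstract describes, under assumed notation that the abstract itself does not fix (n × p design matrix X, response vector Y, tuning parameter λ₀): the square-root regularized estimator with a weakly decomposable penalty norm Ω is

\hat{\beta} \in \arg\min_{\beta \in \mathbb{R}^p} \left\{ \frac{\|Y - X\beta\|_2}{\sqrt{n}} + \lambda_0 \, \Omega(\beta) \right\}

Because the square-root loss scales linearly with the noise level, a theoretically valid λ₀ can be chosen without knowing the noise standard deviation, which is the property the abstract highlights; taking Ω to be the ℓ₁-norm recovers the Square Root LASSO as a special case.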
