ResearchTrend.AI

arXiv:1912.11398 (v2, latest)

An error bound for Lasso and Group Lasso in high dimensions

21 December 2019
Antoine Dedieu
Abstract

We leverage recent advances in high-dimensional statistics to derive new $\ell_2$ estimation upper bounds for Lasso and Group Lasso in high dimensions. For Lasso, our bounds scale as $(k^*/n)\log(p/k^*)$, where $n \times p$ is the size of the design matrix and $k^*$ the dimension of the ground truth $\boldsymbol{\beta}^*$, and match the optimal minimax rate. For Group Lasso, our bounds scale as $(s^*/n)\log(G/s^*) + m^*/n$, where $G$ is the total number of groups and $m^*$ the number of coefficients in the $s^*$ groups which contain $\boldsymbol{\beta}^*$, and improve over existing results. We additionally show that when the signal is strongly group-sparse, Group Lasso is superior to Lasso.
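As a rough illustration of the Lasso rate in the abstract (not the paper's own experiment), the sketch below generates a $k^*$-sparse ground truth, fits scikit-learn's `Lasso`, and compares the $\ell_2$ estimation error to the scale $\sqrt{(k^*/n)\log(p/k^*)}$. The design, noise level, and regularization choice are illustrative assumptions.

```python
# Hedged sketch: empirically compare the Lasso l2 estimation error to the
# minimax-rate scale sqrt((k*/n) log(p/k*)) from the abstract.
# All problem sizes and the regularization level are assumptions for illustration.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, k_star = 200, 1000, 10            # design matrix is n x p; k* nonzeros

X = rng.standard_normal((n, p))
beta_star = np.zeros(p)
beta_star[:k_star] = 1.0                # sparse ground truth beta*
sigma = 0.5
y = X @ beta_star + sigma * rng.standard_normal(n)

# Common theory-inspired choice: alpha ~ sigma * sqrt(log(p) / n)
alpha = sigma * np.sqrt(np.log(p) / n)
lasso = Lasso(alpha=alpha, max_iter=50_000).fit(X, y)

l2_error = np.linalg.norm(lasso.coef_ - beta_star)
rate = np.sqrt((k_star / n) * np.log(p / k_star))
print(f"||beta_hat - beta*||_2 = {l2_error:.3f}")
print(f"sqrt((k*/n) log(p/k*)) = {rate:.3f}")
```

The error and the rate should be of comparable magnitude here; the constant in front of the rate depends on the design and the chosen regularization.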
