σ-Ridge: group regularized ridge regression via empirical Bayes noise level cross-validation

29 October 2020
Nikolaos Ignatiadis
Panagiotis Lolas
arXiv:2010.15817
Abstract

Features in predictive models are not exchangeable, yet common supervised models treat them as such. Here we study ridge regression when the analyst can partition the features into $K$ groups based on external side-information. For example, in high-throughput biology, features may represent gene expression, protein abundance or clinical data, and so each feature group represents a distinct modality. The analyst's goal is to choose optimal regularization parameters $\lambda = (\lambda_1, \dotsc, \lambda_K)$ -- one for each group. In this work, we study the impact of $\lambda$ on the predictive risk of group-regularized ridge regression by deriving limiting risk formulae under a high-dimensional random effects model with $p \asymp n$ as $n \to \infty$. Furthermore, we propose a data-driven method for choosing $\lambda$ that attains the optimal asymptotic risk: the key idea is to interpret the residual noise variance $\sigma^2$ as a regularization parameter to be chosen through cross-validation. An empirical Bayes construction maps the one-dimensional parameter $\sigma$ to the $K$-dimensional vector of regularization parameters, i.e., $\sigma \mapsto \widehat{\lambda}(\sigma)$. Beyond its theoretical optimality, the proposed method is practical and runs as fast as cross-validated ridge regression without feature groups ($K=1$).
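To make the idea concrete, here is a minimal Python sketch of group-regularized ridge regression tuned through a single noise-level parameter, following the structure described in the abstract. The map $\sigma \mapsto \widehat{\lambda}(\sigma)$ below uses the textbook Bayes correspondence $\lambda_k = \sigma^2 / \tau_k^2$ with a crude moment plug-in for $\tau_k^2$ from a pilot fit; this stands in for the paper's empirical Bayes construction, and the function names and the $\widehat{\tau}_k^2$ estimator are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def group_ridge(X, y, groups, lams):
    """Closed-form group-regularized ridge:
    beta_hat = argmin ||y - X b||^2 + sum_k lams[k] * ||b_{G_k}||^2,
    i.e. solve (X'X + diag(penalty)) beta = X'y, where penalty assigns
    lams[k] to every feature in group k."""
    penalty = lams[groups]  # per-feature penalty vector
    return np.linalg.solve(X.T @ X + np.diag(penalty), X.T @ y)

def eb_lambda(sigma, X, y, groups, K, eps=1e-8):
    """Illustrative map sigma -> lambda_hat(sigma). Under the random effects
    model beta_j ~ N(0, tau_k^2) for features j in group k, the Bayes-optimal
    ridge penalty is lambda_k = sigma^2 / tau_k^2. Here tau_k^2 is estimated
    by a crude method-of-moments plug-in from a pilot ridge fit; the paper's
    empirical Bayes construction is more careful than this."""
    beta_pilot = group_ridge(X, y, groups, np.full(K, 1.0))  # pilot fit
    tau2 = np.array([max(np.mean(beta_pilot[groups == k] ** 2), eps)
                     for k in range(K)])
    return sigma ** 2 / tau2

def cv_sigma(X, y, groups, K, sigma_grid, n_folds=5, seed=0):
    """Cross-validate over the one-dimensional parameter sigma: each
    candidate sigma expands to all K penalties via eb_lambda, so only a
    1-D grid is searched instead of a K-dimensional one."""
    rng = np.random.default_rng(seed)
    folds = rng.integers(0, n_folds, size=len(y))
    errs = np.zeros(len(sigma_grid))
    for i, sigma in enumerate(sigma_grid):
        for f in range(n_folds):
            tr, te = folds != f, folds == f
            lams = eb_lambda(sigma, X[tr], y[tr], groups, K)
            beta = group_ridge(X[tr], y[tr], groups, lams)
            errs[i] += np.mean((y[te] - X[te] @ beta) ** 2)
    best = sigma_grid[np.argmin(errs)]
    return best, eb_lambda(best, X, y, groups, K)
```

Usage would look like `sigma_hat, lams_hat = cv_sigma(X, y, groups, K, np.geomspace(0.1, 10, 25))`. The point mirrored from the abstract is that cross-validation searches over one scalar $\sigma$ rather than a $K$-dimensional grid of penalties, which is why the method can run as fast as ordinary cross-validated ridge regression.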
