
arXiv:2005.03220
Fractional ridge regression: a fast, interpretable reparameterization of ridge regression

7 May 2020
Ariel S. Rokem
Kendrick Norris Kay
Abstract

Ridge regression (RR) is a regularization technique that penalizes the L2-norm of the coefficients in linear regression. One of the challenges of using RR is the need to set a hyperparameter (α) that controls the amount of regularization. Cross-validation is typically used to select the best α from a set of candidates. However, efficient and appropriate selection of α can be challenging, particularly where large amounts of data are analyzed. Because the selected α depends on the scale of the data and predictors, it is not straightforwardly interpretable. Here, we propose to reparameterize RR in terms of the ratio γ between the L2-norms of the regularized and unregularized coefficients. This approach, called fractional RR (FRR), has several benefits: the solutions obtained for different γ are guaranteed to vary, guarding against wasted calculations, and automatically span the relevant range of regularization, avoiding the need for arduous manual exploration. We provide an algorithm to solve FRR, as well as open-source software implementations in Python and MATLAB (https://github.com/nrdg/fracridge). We show that the proposed method is fast and scalable for large-scale data problems, and delivers results that are straightforward to interpret and compare across models and datasets.
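To make the reparameterization concrete, here is a minimal sketch of the idea in NumPy. It is not the fracridge package's algorithm (which the paper describes): it simply expresses the ridge solution through the SVD of the design matrix and uses bisection on α to hit each target fraction γ = ‖β(α)‖₂ / ‖β_OLS‖₂. The function name `fractional_ridge` and the bisection scheme are illustrative assumptions.

```python
import numpy as np

def fractional_ridge(X, y, fracs):
    """Illustrative sketch (not the fracridge algorithm): for each target
    fraction gamma in `fracs`, find the ridge penalty alpha whose solution
    has L2-norm equal to gamma times the norm of the OLS solution."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    uty = U.T @ y  # rotated targets; ridge norm depends only on these and s

    # Via the SVD, beta(alpha) = V diag(s / (s^2 + alpha)) U^T y,
    # so its norm is a monotonically decreasing function of alpha.
    def coef_norm(alpha):
        return np.linalg.norm((s / (s**2 + alpha)) * uty)

    ols_norm = coef_norm(0.0)  # alpha = 0 recovers the unregularized fit
    alphas = []
    for frac in fracs:
        lo, hi = 0.0, 1.0
        # Grow the upper bracket until the solution shrinks below target.
        while coef_norm(hi) > frac * ols_norm:
            hi *= 10.0
        for _ in range(100):  # bisection on alpha
            mid = 0.5 * (lo + hi)
            if coef_norm(mid) > frac * ols_norm:
                lo = mid
            else:
                hi = mid
        alphas.append(0.5 * (lo + hi))

    coefs = np.stack(
        [Vt.T @ ((s / (s**2 + a)) * uty) for a in alphas], axis=-1
    )
    return coefs, np.array(alphas)
```

Because the user specifies γ directly on the interpretable [0, 1] scale (0 = fully shrunk, 1 = unregularized), a grid such as `fracs = np.linspace(0.1, 1.0, 10)` is guaranteed to produce distinct solutions spanning the whole regularization path, which is the benefit the abstract highlights.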
