Horseshoe Regularization for Feature Subset Selection

23 February 2017 · arXiv:1702.07400
A. Bhadra
J. Datta
Nicholas G. Polson
Brandon T. Willard
Abstract

Feature subset selection arises in many high-dimensional applications of statistics, such as compressed sensing and genomics. The $\ell_0$ penalty is ideal for this task, the caveat being that it requires the NP-hard combinatorial evaluation of all models. A recent area of considerable interest is the development of efficient algorithms to fit models with a non-convex $\ell_\gamma$ penalty for $\gamma \in (0,1)$, which yields sparser models than the convex $\ell_1$ or lasso penalty but is harder to fit. We propose an alternative, termed the horseshoe regularization penalty for feature subset selection, and demonstrate its theoretical and computational advantages. The distinguishing feature from existing non-convex optimization approaches is a full probabilistic representation of the penalty as the negative of the logarithm of a suitable prior, which in turn enables efficient expectation-maximization and local linear approximation algorithms for optimization and MCMC for uncertainty quantification. On synthetic and real data, the resulting algorithms provide better statistical performance, and the computation requires a fraction of the time of state-of-the-art non-convex solvers.
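
The abstract's key computational idea — writing the penalty as the negative log of a prior and optimizing via local linear approximation (LLA), so that each outer step reduces to a weighted lasso — can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the closed-form surrogate penalty pen(β) = −log log(1 + 2/β²) (motivated by known density bounds for the horseshoe prior), the regularization weight `lam`, and the coordinate-descent subproblem solver are all assumptions.

```python
# Minimal sketch: LLA for regression with a horseshoe-like penalty.
# Assumption: pen(b) = -log(log(1 + 2/b^2)), a closed-form surrogate
# suggested by the horseshoe prior's density bounds; the paper's exact
# penalty, tuning of lam, and solver details may differ.
import numpy as np


def hs_penalty_grad(t, eps=1e-8):
    """Derivative of pen(t) = -log(log(1 + 2/t^2)) w.r.t. t = |beta| > 0."""
    t = np.maximum(np.abs(t), eps)
    inner = np.log1p(2.0 / t**2)
    return (4.0 / (t**3 + 2.0 * t)) / inner


def weighted_lasso_cd(X, y, weights, lam, n_iter=200):
    """Coordinate descent for 0.5*||y - X b||^2 + lam * sum_j w_j |b_j|."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y.copy()                       # maintained residual y - X @ beta
    col_sq = (X**2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * beta[j]     # remove coordinate j's contribution
            rho = X[:, j] @ r
            thr = lam * weights[j]
            beta[j] = np.sign(rho) * max(abs(rho) - thr, 0.0) / col_sq[j]
            r -= X[:, j] * beta[j]
    return beta


def horseshoe_lla(X, y, lam=1.0, n_outer=10):
    """Outer LLA loop: re-linearize the non-convex penalty at the current
    iterate, then solve the resulting weighted-lasso subproblem."""
    p = X.shape[1]
    beta = np.ones(p)                  # warm start away from zero
    for _ in range(n_outer):
        w = hs_penalty_grad(beta)      # LLA weights pen'(|beta_j|)
        beta = weighted_lasso_cd(X, y, w, lam)
    return beta


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p, k = 100, 50, 5
    X = rng.standard_normal((n, p))
    true = np.zeros(p)
    true[:k] = 3.0
    y = X @ true + 0.5 * rng.standard_normal(n)
    print(np.nonzero(horseshoe_lla(X, y))[0])  # indices of selected features
```

Note the mechanism driving sparsity here: the LLA weight pen'(|β|) blows up as |β| → 0, so near-zero coefficients face an enormous soft-threshold and are set exactly to zero, while the weight vanishes for large |β|, leaving strong signals essentially unshrunk — mirroring the horseshoe prior's pole at the origin and heavy tails.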
