Bayesian variable selection for high dimensional generalized linear models: convergence rates of the fitted densities

18 October 2007
Wenxin Jiang
arXiv:0710.3458
Abstract

Bayesian variable selection has gained much empirical success recently in a variety of applications when the number $K$ of explanatory variables $(x_1, \dots, x_K)$ is possibly much larger than the sample size $n$. For generalized linear models, if most of the $x_j$'s have very small effects on the response $y$, we show that it is possible to use Bayesian variable selection to reduce overfitting caused by the curse of dimensionality $K \gg n$. In this approach a suitable prior can be used to choose a few out of the many $x_j$'s to model $y$, so that the posterior will propose probability densities $p$ that are "often close" to the true density $p^*$ in some sense. The closeness can be described by a Hellinger distance between $p$ and $p^*$ that scales at a power very close to $n^{-1/2}$, which is the "finite-dimensional rate" corresponding to a low-dimensional situation. These findings extend some recent work of Jiang [Technical Report 05-02 (2005), Dept. Statistics, Northwestern Univ.] on consistency of Bayesian variable selection for binary classification.
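For readers unfamiliar with the metric, the following is a minimal sketch of the Hellinger distance under its standard definition (the $1/2$ normalization is one common convention), together with a schematic form of the rate claim in the abstract; the constant $C$ and the precise posterior-probability sense of "often close" are placeholders for the paper's exact statement, not reproduced here.

% Hellinger distance between densities p and p^* with respect to a
% dominating measure \mu:
\[
d_H(p, p^*) \;=\; \left( \tfrac{1}{2} \int \left( \sqrt{p} - \sqrt{p^*} \right)^2 \, d\mu \right)^{1/2}
\]
% Schematic rate statement: for small \epsilon > 0, the posterior
% concentrates (with high posterior probability) on densities p with
\[
d_H(p, p^*) \;\le\; C \, n^{-1/2 + \epsilon},
\]
% i.e. a power of n arbitrarily close to the finite-dimensional
% rate n^{-1/2}, despite K possibly growing much faster than n.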
