How many needles in the haystack? Adaptive inference and uncertainty
quantification for the horseshoe
We investigate the frequentist properties of Bayesian procedures for estimation and uncertainty quantification based on the horseshoe prior. In the sparse multivariate mean model, we consider both the hierarchical Bayes method, which puts a prior on the unknown sparsity level, and the empirical Bayes method, which estimates the sparsity level by maximum marginal likelihood. We show that both techniques lead to rate-adaptive, optimal posterior contraction. We also investigate the frequentist coverage of Bayesian credible sets resulting from the horseshoe prior, both when the sparsity level is set by an oracle and when it is set by hierarchical or empirical Bayes. Credible balls and marginal credible intervals have good frequentist coverage and optimal size if the sparsity level of the prior is set correctly. By general theory, however, honest confidence sets cannot adapt in size to an unknown sparsity level; accordingly, the hierarchical and empirical Bayes credible sets based on the horseshoe prior are not honest over the full parameter space. We show that this is due to over-shrinkage of certain parameters, and we characterise the set of parameters for which credible balls and marginal credible intervals do give correct uncertainty quantification. In particular, we show that the fraction of false discoveries made by the marginal Bayesian procedure is controlled by a correct choice of cut-off.
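As an illustrative sketch (not code from the paper), the two ingredients of the empirical Bayes approach can be computed by one-dimensional quadrature in the sparse means model Y_i = θ_i + ε_i with ε_i ~ N(0, 1) and the horseshoe prior θ_i | λ_i ~ N(0, τ²λ_i²), λ_i ~ C⁺(0, 1). The function names, the use of Tweedie's formula for the posterior mean, and the numerical settings below are my own choices for illustration; the maximum marginal likelihood estimator of τ is restricted to [1/n, 1] as in the empirical Bayes setup described above.

```python
import numpy as np
from scipy import integrate, optimize, stats


def marginal_density(y, tau):
    """Marginal density m_tau(y) of one observation under the horseshoe:
    integrate N(y; 0, 1 + tau^2 lam^2) against the half-Cauchy density of lam."""
    def integrand(lam):
        sd = np.sqrt(1.0 + (tau * lam) ** 2)
        return stats.norm.pdf(y, scale=sd) * (2.0 / np.pi) / (1.0 + lam ** 2)
    val, _ = integrate.quad(integrand, 0.0, np.inf, limit=200)
    return val


def posterior_mean(y, tau, h=1e-4):
    """Posterior mean via Tweedie's formula for unit-variance Gaussian noise:
    E[theta | y] = y + d/dy log m_tau(y), with a central finite difference."""
    log_m = lambda z: np.log(marginal_density(z, tau))
    return y + (log_m(y + h) - log_m(y - h)) / (2.0 * h)


def mmle_tau(ys):
    """Empirical Bayes: maximise the marginal likelihood over tau in [1/n, 1]."""
    n = len(ys)
    neg_ll = lambda t: -sum(np.log(marginal_density(y, t)) for y in ys)
    res = optimize.minimize_scalar(neg_ll, bounds=(1.0 / n, 1.0), method="bounded")
    return res.x


if __name__ == "__main__":
    tau = 0.1
    # Small observations are shrunk heavily towards zero ...
    print(posterior_mean(0.05, tau))
    # ... while large observations are left almost untouched (bounded shrinkage).
    print(posterior_mean(8.0, tau))
    # Estimate the global scale tau from data with a few large "needles".
    ys = [0.1, -0.2, 0.05, 6.0, -5.5, 0.3, 0.0, 7.2, 0.15, -0.1]
    print(mmle_tau(ys))
```

The bounded-shrinkage behaviour visible here (small signals collapse to near zero, large signals survive) is exactly what drives both the adaptive contraction rates and the over-shrinkage that breaks honesty for intermediate-sized parameters.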