Smoothed Quantile Regression with Large-Scale Inference

9 December 2020 · arXiv:2012.05187
Xuming He, Xiaoou Pan, Kean Ming Tan, Wen-Xin Zhou
Abstract

Quantile regression is a powerful tool for learning the relationship between a response variable and a multivariate predictor while exploring heterogeneous effects. In this paper, we consider statistical inference for quantile regression with large-scale data in the "increasing dimension" regime. We provide a comprehensive and in-depth analysis of a convolution-type smoothing approach that yields an adequate approximation for both computation and inference in quantile regression. This method, which we refer to as conquer, turns the non-differentiable quantile loss function into a twice-differentiable, convex, and locally strongly convex surrogate, which admits a fast and scalable Barzilai-Borwein gradient-based algorithm for optimization and a multiplier bootstrap for statistical inference. Theoretically, we establish explicit non-asymptotic bounds on both the estimation and Bahadur-Kiefer linearization errors, from which we show that asymptotic normality of the conquer estimator holds under a weaker requirement on the number of regressors than is needed for conventional quantile regression. Moreover, we prove the validity of the multiplier bootstrap confidence construction. Our numerical studies confirm that the conquer estimator is a practical and reliable approach to large-scale inference for quantile regression. Software implementing the methodology is available in the R package conquer.
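
To make the recipe in the abstract concrete: with a Gaussian kernel, the convolution-smoothed check loss has derivative Phi(u/h) - (1 - tau), so the estimator can be fitted by plain gradient descent with Barzilai-Borwein step sizes, and mean-one random multiplier weights give bootstrap draws. The Python sketch below is a minimal illustration under those assumptions, not the authors' R package conquer; the bandwidth default, the least-squares warm start, the exponential multipliers, and the names conquer_fit and multiplier_bootstrap are choices made here for the example.

    import numpy as np
    from scipy.stats import norm

    def conquer_fit(X, y, tau=0.5, h=None, w=None, max_iter=500, tol=1e-8):
        """Gaussian-kernel convolution-smoothed quantile regression, fitted
        by gradient descent with Barzilai-Borwein step sizes.  Optional
        mean-one multiplier weights `w` give the bootstrap estimator.
        A minimal sketch, not the authors' R package `conquer`."""
        n, p = X.shape
        w = np.ones(n) if w is None else w
        if h is None:
            # illustrative bandwidth of order ((p + log n)/n)^(2/5)
            h = max(0.05, ((p + np.log(n)) / n) ** 0.4)

        def grad(beta):
            r = y - X @ beta  # residuals
            # derivative of the smoothed check loss: Phi(r/h) - (1 - tau)
            return -X.T @ (w * (norm.cdf(r / h) - (1.0 - tau))) / n

        beta = np.linalg.lstsq(X, y, rcond=None)[0]  # least-squares warm start
        g, step = grad(beta), 1.0
        for _ in range(max_iter):
            beta_new = beta - step * g
            g_new = grad(beta_new)
            if np.linalg.norm(g_new) < tol:
                return beta_new
            s, d = beta_new - beta, g_new - g
            # Barzilai-Borwein step size for the next iteration
            step = (s @ s) / max(s @ d, 1e-12)
            beta, g = beta_new, g_new
        return beta

    def multiplier_bootstrap(X, y, tau=0.5, B=200, seed=0):
        """Draws of the smoothed estimator under random exponential
        multiplier weights; percentiles yield confidence intervals."""
        rng = np.random.default_rng(seed)
        return np.array([conquer_fit(X, y, tau, w=rng.exponential(1.0, len(y)))
                         for _ in range(B)])

Coordinate-wise percentiles of the bootstrap draws (e.g., the 2.5% and 97.5% quantiles of each coefficient across the B fits) then give per-coefficient confidence intervals, in the spirit of the multiplier bootstrap described above.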
