  3. 1210.1092

Nearly root-n approximation for regression quantile processes

3 October 2012
S. Portnoy
Abstract

Traditionally, assessing the accuracy of inference based on regression quantiles has relied on the Bahadur representation. This provides an error of order $n^{-1/4}$ in normal approximations, and suggests that inference based on regression quantiles may not be as reliable as that based on other (smoother) approaches, whose errors are generally of order $n^{-1/2}$ (or better in special symmetric cases). Fortunately, extensive simulations and empirical applications show that inference for regression quantiles shares the smaller error rates of other procedures. In fact, the "Hungarian" construction of Komlós, Major and Tusnády [Z. Wahrsch. Verw. Gebiete 32 (1975) 111-131; Z. Wahrsch. Verw. Gebiete 34 (1976) 33-58] provides an alternative expansion for the one-sample quantile process with nearly the root-$n$ error rate (specifically, to within a factor of $\log n$). Such an expansion is developed here to provide a theoretical foundation for more accurate approximations for inference in regression quantile models. One specific application of independent interest is a result establishing that for conditional inference, the error rate for coverage probabilities using the Hall and Sheather [J. R. Stat. Soc. Ser. B Stat. Methodol. 50 (1988) 381-391] method of sparsity estimation matches their one-sample rate.
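
The abstract's final point concerns coverage probabilities for conditional inference when the sparsity (reciprocal density) is estimated by the Hall and Sheather method. The snippet below is a minimal Monte Carlo sketch of that kind of coverage check, not the paper's own procedure: it assumes statsmodels' QuantReg, whose bandwidth="hsheather" option selects the Hall and Sheather bandwidth for the sparsity estimate behind the standard errors; the data-generating model, sample size, number of replications, and seed are illustrative choices.

import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(0)
n, n_reps, q = 200, 500, 0.5
beta_true = np.array([1.0, 2.0])          # intercept and slope (illustrative)
covered = 0

for _ in range(n_reps):
    x = rng.uniform(0.0, 10.0, n)
    eps = rng.standard_t(df=3, size=n)    # heavy-tailed errors with median zero
    y = beta_true[0] + beta_true[1] * x + eps
    X = sm.add_constant(x)
    # Median regression; bandwidth="hsheather" uses the Hall-Sheather rule
    # in the sparsity estimate that enters the covariance of the estimates.
    res = QuantReg(y, X).fit(q=q, vcov="robust",
                             kernel="epa", bandwidth="hsheather")
    lo, hi = res.conf_int(alpha=0.05)[1]  # 95% interval for the slope
    covered += (lo <= beta_true[1] <= hi)

print(f"empirical coverage of nominal 95% CI: {covered / n_reps:.3f}")

With iid errors the vcov="iid" option would also apply; "robust" is the statsmodels default and remains valid when the error distribution varies with the design.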
