Extensions of smoothing via taut strings

20 March 2008
L. Duembgen
A. Kovac
Abstract

Suppose that we observe independent random pairs $(X_1,Y_1)$, $(X_2,Y_2)$, \dots, $(X_n,Y_n)$. Our goal is to estimate regression functions such as the conditional mean or $\beta$-quantile of $Y$ given $X$, where $0 < \beta < 1$. In order to achieve this we minimize criteria such as, for instance, $\sum_{i=1}^n \rho(f(X_i) - Y_i) + \lambda \cdot \mathrm{TV}(f)$ among all candidate functions $f$. Here $\rho$ is some convex function depending on the particular regression function we have in mind, $\mathrm{TV}(f)$ stands for the total variation of $f$, and $\lambda > 0$ is some tuning parameter. This framework is extended further to include binary or Poisson regression, and to include localized total variation penalties. The latter are needed to construct estimators adapting to inhomogeneous smoothness of $f$. For the general framework we develop noniterative algorithms for the solution of the minimization problems which are closely related to the taut string algorithm (cf. Davies and Kovac, 2001). Further we establish a connection between the present setting and monotone regression, extending previous work by Mammen and van de Geer (1997). The algorithmic considerations and numerical examples are complemented by two consistency results.
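As a rough illustration of the criterion described in the abstract (not the paper's noniterative taut-string-type algorithm itself), the following Python sketch solves the TV-penalized $\beta$-quantile problem at the observed design points using a generic convex-programming solver. The function name tv_quantile_fit, the use of cvxpy, and the default value of the tuning parameter are illustrative assumptions.

import numpy as np
import cvxpy as cp

def tv_quantile_fit(x, y, beta=0.5, lam=1.0):
    """Sketch: TV-penalized beta-quantile regression at the design points.

    Minimizes sum_i rho_beta(Y_i - f_i) + lam * sum_i |f_{i+1} - f_i|
    over f, where rho_beta is the check (pinball) loss and the total
    variation is taken along the X-sorted order. A generic convex solver
    stands in for the paper's noniterative taut-string-type algorithm.
    """
    order = np.argsort(x)                  # TV is defined along increasing X
    y_sorted = np.asarray(y, float)[order]
    n = len(y_sorted)
    f = cp.Variable(n)                     # fitted values at the sorted design points
    r = y_sorted - f                       # residuals
    check_loss = cp.sum(cp.maximum(beta * r, (beta - 1) * r))
    tv_penalty = lam * cp.sum(cp.abs(cp.diff(f)))
    cp.Problem(cp.Minimize(check_loss + tv_penalty)).solve()
    fitted = np.empty(n)
    fitted[order] = f.value                # return in the original observation order
    return fitted

For the quadratic loss $\rho(t) = t^2/2$ (the conditional mean) the same criterion becomes the one-dimensional TV-denoising problem, which the taut string algorithm of Davies and Kovac (2001) solves exactly and noniteratively.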
