Total Variation Classes Beyond 1d: Minimax Rates, and the Limitations of Linear Smoothers

26 May 2016
Veeranjaneyulu Sadhanala
Yu-Xiang Wang
Ryan J. Tibshirani
arXiv:1605.08400
Abstract

We consider the problem of estimating a function defined over $n$ locations on a $d$-dimensional grid (having all side lengths equal to $n^{1/d}$). When the function is constrained to have discrete total variation bounded by $C_n$, we derive the minimax optimal (squared) $\ell_2$ estimation error rate, parametrized by $n$ and $C_n$. Total variation denoising, also known as the fused lasso, is seen to be rate optimal. Several simpler estimators exist, such as Laplacian smoothing and Laplacian eigenmaps. A natural question is: can these simpler estimators perform just as well? We prove that these estimators, and more broadly all estimators given by linear transformations of the input data, are suboptimal over the class of functions with bounded variation. This extends fundamental findings of Donoho and Johnstone [1998] on 1-dimensional total variation spaces to higher dimensions. The implication is that the computationally simpler methods cannot be used for such sophisticated denoising tasks without sacrificing statistical accuracy. We also derive minimax rates for discrete Sobolev spaces over $d$-dimensional grids, which are, in some sense, smaller than the total variation function spaces. Indeed, these spaces are small enough that linear estimators can be optimal, and we show that several well-known ones, such as Laplacian smoothing and Laplacian eigenmaps, are. Lastly, we investigate the adaptivity of the total variation denoiser to these smaller Sobolev function spaces.
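Both families of estimators contrasted in the abstract have simple descriptions: the TV denoiser (fused lasso), in its standard form, solves $\min_\theta \frac{1}{2}\|y-\theta\|_2^2 + \lambda \sum_{(i,j) \in E} |\theta_i - \theta_j|$ over the edge set $E$ of the grid graph, while Laplacian smoothing replaces the $\ell_1$ penalty with the quadratic $\theta^\top L \theta$ and therefore has the linear closed form $\hat\theta = (I + \lambda L)^{-1} y$. As a concrete illustration of the kind of linear smoother the paper shows to be suboptimal over TV classes, here is a minimal sketch of Laplacian smoothing on a $d$-dimensional grid. It is not taken from the paper; the helper names (grid_laplacian, laplacian_smoothing) and the tuning parameter lam are illustrative.

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def grid_laplacian(side, d):
    # Graph Laplacian of a d-dimensional grid with `side` nodes per
    # dimension (n = side**d nodes total), assembled as a Kronecker sum
    # of 1d path-graph Laplacians: L = sum_k I x ... x L1 x ... x I.
    main = np.full(side, 2.0)
    main[0] = main[-1] = 1.0          # boundary nodes have degree 1
    off = -np.ones(side - 1)
    L1 = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csr")
    I = sp.identity(side, format="csr")
    L = None
    for k in range(d):
        term = sp.identity(1, format="csr")
        for j in range(d):
            term = sp.kron(term, L1 if j == k else I, format="csr")
        L = term if L is None else L + term
    return L

def laplacian_smoothing(y, side, d, lam):
    # Linear smoother: solves min_theta ||y - theta||_2^2 + lam * theta' L theta,
    # whose closed form is theta_hat = (I + lam * L)^{-1} y.
    n = side ** d
    A = (sp.identity(n) + lam * grid_laplacian(side, d)).tocsc()
    return spla.spsolve(A, y)

# Example: denoise a piecewise-constant signal on a 32 x 32 grid (d = 2).
side, d = 32, 2
rng = np.random.default_rng(0)
theta = np.kron(np.eye(2), np.ones((16, 16))).ravel()   # two constant blocks
y = theta + 0.5 * rng.normal(size=side ** d)
theta_hat = laplacian_smoothing(y, side, d, lam=5.0)

Because $\hat\theta$ here is a fixed linear transformation of $y$, this estimator falls squarely within the class the paper proves suboptimal over bounded-variation functions, even though (as the paper also shows) it can be minimax optimal over the smaller discrete Sobolev spaces.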
