
arXiv:1301.5288

The exact relationship between regularization in RKHS and Bayesian estimation of Gaussian random fields

22 January 2013
Aleksandr Aravkin
B. Bell
J. Burke
G. Pillonetto
Abstract

Reconstruction of a function from noisy data is often formulated as a regularized optimization problem over a possibly infinite-dimensional reproducing kernel Hilbert space (RKHS). In particular, the solution must fit the observed data while keeping a small RKHS norm. When the data fit is measured using a quadratic loss, this type of estimator has a known interpretation in terms of Bayesian estimation of Gaussian random fields: it provides the minimum-variance estimate of the unknown function given the noisy measurements. In this paper, we provide the exact Bayesian connection when more general convex losses are used, such as the Vapnik or Huber loss. In particular, we show that the estimate in the RKHS contains all the possible finite-dimensional maximum a posteriori estimates of the Gaussian random field.
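The classical quadratic-loss case mentioned in the abstract can be illustrated numerically: by the representer theorem, the RKHS regularized estimate is a finite kernel expansion, and its coefficients coincide with those of the Gaussian-process posterior mean. The sketch below is not from the paper; the RBF kernel, noise level, and test function are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(a, b, scale=0.2):
    # Gaussian (RBF) kernel K(x, x') = exp(-(x - x')^2 / (2 * scale^2));
    # an arbitrary choice of reproducing kernel for this illustration.
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * scale ** 2))

# Noisy samples of an unknown function (toy data, not from the paper).
x = np.linspace(0.0, 1.0, 30)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)

sigma2 = 0.1 ** 2                      # noise variance = regularization weight
K = rbf_kernel(x, x)
A = K + sigma2 * np.eye(x.size)

# RKHS view: minimize sum_i (y_i - f(x_i))^2 + sigma2 * ||f||_H^2.
# The representer theorem gives f = sum_i c_i K(., x_i) with
# c = (K + sigma2 I)^{-1} y.
c = np.linalg.solve(A, y)
f_rkhs = K @ c

# Bayesian view: GP posterior mean at the inputs, computed via a
# Cholesky factorization of the same matrix K + sigma2 I.
L = np.linalg.cholesky(A)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
f_gp = K @ alpha

# The two estimates agree: regularization in the RKHS with quadratic loss
# reproduces the minimum-variance Gaussian-field estimate.
assert np.allclose(f_rkhs, f_gp)
```

The paper's contribution concerns what happens when the quadratic loss above is replaced by a general convex loss such as Huber or Vapnik, where this exact coincidence is no longer obvious.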
