Reconstruction of a function from noisy data is often formulated as a regularized optimization problem over a possibly infinite-dimensional reproducing kernel Hilbert space (RKHS): the solution must both fit the observed data and have a small RKHS norm. When the data fit is measured by a quadratic loss, this type of estimator has a well-known interpretation in terms of Bayesian estimation of Gaussian random fields: it is the minimum-variance estimate of the unknown function given the noisy measurements. In this paper, we establish the exact Bayesian connection for more general convex losses, such as the Vapnik and Huber losses. In particular, we show that the estimate in the RKHS contains all possible finite-dimensional maximum a posteriori estimates of the Gaussian random field.
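The quadratic-loss case can be illustrated numerically. Below is a minimal sketch (the RBF kernel, data, and noise level are illustrative assumptions, not taken from the paper) showing that the regularized RKHS estimate, obtained by directly minimizing the penalized least-squares objective over representer coefficients, coincides with the Gaussian-process posterior mean when the regularization parameter equals the noise variance.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=0.7):
    """Gaussian (RBF) reproducing kernel / GP covariance (illustrative choice)."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 8)              # measurement locations
sigma2 = 0.1                               # noise variance
y = np.sin(x) + np.sqrt(sigma2) * rng.standard_normal(x.size)

K = rbf_kernel(x, x)                       # Gram matrix at the data
x_new = np.linspace(0.0, 5.0, 50)          # prediction grid
K_star = rbf_kernel(x_new, x)

# RKHS regularization: min_f  sum_i (y_i - f(x_i))^2 + gamma * ||f||_H^2.
# By the representer theorem f(x) = sum_i alpha_i k(x, x_i), so the objective
# becomes ||K a - y||^2 + gamma * a^T K a.  Writing K = L L^T (Cholesky), this
# is a stacked least-squares problem, solved here without the closed form:
gamma = sigma2
L = np.linalg.cholesky(K)
A = np.vstack([K, np.sqrt(gamma) * L.T])
b = np.concatenate([y, np.zeros(x.size)])
alpha, *_ = np.linalg.lstsq(A, b, rcond=None)
f_rkhs = K_star @ alpha

# GP posterior mean given noisy observations: K(x*, X)(K + sigma2 I)^{-1} y.
f_gp = K_star @ np.linalg.solve(K + sigma2 * np.eye(x.size), y)

# The two estimates agree, which is the known quadratic-loss connection.
assert np.allclose(f_rkhs, f_gp, atol=1e-6)
```

The paper's contribution is extending this correspondence beyond the quadratic loss, where the closed form above no longer applies.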