Asymptotic Normality of Support Vector Machines for Classification and Regression

Abstract
In nonparametric classification and regression problems, support vector machines (SVMs) attract much attention in both theoretical and applied statistics. In an abstract sense, SVMs can be seen as regularized M-estimators for a parameter in a (typically infinite-dimensional) reproducing kernel Hilbert space. In this article, it is shown that the difference between the empirical SVM and the theoretical SVM is asymptotically normal with rate √n. That is, the standardized difference converges weakly to a Gaussian process in the reproducing kernel Hilbert space. This is done by an application of the functional delta-method and by showing that the SVM-functional is suitably Hadamard-differentiable.
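The statement above can be sketched in symbols. The notation here (an RKHS H, a convex loss L, regularization parameter λ, and the estimator names f_{P,λ}, f_{D_n,λ}) is assumed for illustration and does not appear in the abstract itself:

```latex
% Sketch under assumed notation: H an RKHS, L a convex loss, \lambda > 0 fixed.
% SVMs as regularized M-estimators: the theoretical and empirical SVMs are
f_{P,\lambda} = \operatorname*{arg\,min}_{f \in H}
    \; \mathbb{E}_P \, L\bigl(X, Y, f(X)\bigr) + \lambda \lVert f \rVert_H^2 ,
\qquad
f_{D_n,\lambda} = \operatorname*{arg\,min}_{f \in H}
    \; \frac{1}{n} \sum_{i=1}^{n} L\bigl(X_i, Y_i, f(X_i)\bigr) + \lambda \lVert f \rVert_H^2 .
% Asymptotic normality with rate \sqrt{n} then means weak convergence in H:
\sqrt{n}\,\bigl(f_{D_n,\lambda} - f_{P,\lambda}\bigr) \rightsquigarrow \mathbb{G},
% where \mathbb{G} is a Gaussian process in the reproducing kernel Hilbert space.
```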