
Fast Computation of Leave-One-Out Cross-Validation for k-NN Regression

Abstract

We describe a fast computation method for leave-one-out cross-validation (LOOCV) for k-nearest neighbours (k-NN) regression. We show that, under a tie-breaking condition for nearest neighbours, the LOOCV estimate of the mean square error for k-NN regression is identical to the mean square error of (k+1)-NN regression evaluated on the training data, multiplied by the scaling factor (k+1)^2/k^2. Therefore, to compute the LOOCV score, one only needs to fit (k+1)-NN regression once, rather than repeating training and validation of k-NN regression once per training point. Numerical experiments confirm the validity of the fast computation method.
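The identity follows from a short calculation: at a training point, the (k+1)-NN prediction averages the point's own response with its k nearest other neighbours, so its residual is k/(k+1) times the leave-one-out residual, and squaring gives the (k+1)^2/k^2 factor. The sketch below checks this numerically with a brute-force k-NN regressor; the function names and data are illustrative, not taken from the paper.

```python
import numpy as np

def knn_predict(x_train, y_train, x_query, k, exclude=None):
    # Brute-force k-NN regression prediction at a single query point.
    d = np.abs(x_train - x_query)
    if exclude is not None:
        d = d.copy()
        d[exclude] = np.inf  # leave this training point out
    idx = np.argsort(d)[:k]
    return y_train[idx].mean()

rng = np.random.default_rng(0)
n, k = 40, 3
x = rng.uniform(0, 10, n)          # distinct inputs, so no distance ties
y = np.sin(x) + rng.normal(0, 0.1, n)

# Direct LOOCV: refit n times, each time leaving out one point.
loocv = np.mean([(y[i] - knn_predict(x, y, x[i], k, exclude=i))**2
                 for i in range(n)])

# Fast route: training MSE of (k+1)-NN (each training point is its own
# nearest neighbour), scaled by (k+1)^2 / k^2.
train_mse = np.mean([(y[i] - knn_predict(x, y, x[i], k + 1))**2
                     for i in range(n)])
fast = (k + 1)**2 / k**2 * train_mse

print(np.isclose(loocv, fast))  # the two scores agree
```

Note that the agreement relies on the tie-breaking condition from the abstract: each training point must be its own unique nearest neighbour, which holds almost surely for continuous inputs.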
