Minimum discrepancy principle strategy for choosing $k$ in $k$-NN regression
This paper presents a novel data-driven strategy to choose the hyperparameter $k$ in the $k$-NN regression estimator. We treat the problem of choosing the hyperparameter as an iterative procedure (over $k$) and propose a strategy, easily implemented in practice, based on the idea of early stopping and the minimum discrepancy principle. This model selection strategy is proven to be minimax-optimal, under the fixed-design assumption on covariates, over some smoothness function classes, for instance, the class of Lipschitz functions on a bounded domain. The novel strategy then shows consistent simulation results on artificial and real-world data sets in comparison to other model selection strategies, such as the Hold-out method and generalized cross-validation. The novelty of the strategy comes from reducing the computational time of the model selection procedure while preserving the statistical (minimax) optimality of the resulting estimator. More precisely, given a sample of size $n$, if one should choose $k$ among $\{1, \ldots, n\}$, the strategy reduces the computational time of the generalized cross-validation or Akaike's AIC criteria from $\mathcal{O}(n^3)$ to $\mathcal{O}(n^2\,\widehat{k})$, where $\widehat{k}$ is the proposed (minimum discrepancy principle) value of the number of nearest neighbors.