
arXiv:2002.05304
Predictive Power of Nearest Neighbors Algorithm under Random Perturbation

13 February 2020
Yue Xing
Qifan Song
Guang Cheng
Abstract

We consider a data corruption scenario in the classical $k$ Nearest Neighbors ($k$-NN) algorithm: the testing data are randomly perturbed. Under this scenario, we carefully characterize the impact of the corruption level on the asymptotic regret. In particular, our theoretical analysis reveals a phase transition phenomenon: when the corruption level $\omega$ is below a critical order (i.e., the small-$\omega$ regime), the asymptotic regret remains the same; when it is beyond that order (i.e., the large-$\omega$ regime), the asymptotic regret deteriorates polynomially. Surprisingly, we obtain a negative result: the classical noise-injection approach does not improve testing performance in the beginning stage of the large-$\omega$ regime, even at the level of the multiplicative constant of the asymptotic regret. As a technical by-product, we prove that, under different model assumptions, the pre-processed 1-NN proposed in \cite{xue2017achieving} achieves at most a sub-optimal rate when the data dimension $d>4$, even if $k$ is chosen optimally in the pre-processing step.
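The corruption scenario the abstract describes can be illustrated with a minimal numpy-only sketch (not the paper's code): a plain $k$-NN classifier is trained on clean data, and its test error is compared with and without additive random perturbation of the test points at a level $\omega$. The synthetic data-generating model and all parameter values below are illustrative assumptions.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k):
    """Plain k-NN majority vote for binary labels in {0, 1}."""
    # Pairwise squared Euclidean distances, shape (n_test, n_train).
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    idx = np.argsort(d2, axis=1)[:, :k]  # indices of the k nearest neighbors
    return (y_train[idx].mean(axis=1) > 0.5).astype(int)

rng = np.random.default_rng(0)
n, d, k, omega = 2000, 2, 15, 0.5  # omega = corruption level (illustrative)

# Synthetic problem: label is the (noisy) sign of the first coordinate.
X = rng.normal(size=(n, d))
y = (X[:, 0] + 0.3 * rng.normal(size=n) > 0).astype(int)
X_tr, y_tr, X_te, y_te = X[:1000], y[:1000], X[1000:], y[1000:]

# Clean test error vs. error when only the *test* points are perturbed.
err_clean = np.mean(knn_predict(X_tr, y_tr, X_te, k) != y_te)
X_pert = X_te + omega * rng.normal(size=X_te.shape)  # corruption at test time
err_pert = np.mean(knn_predict(X_tr, y_tr, X_pert, k) != y_te)
print(f"clean error {err_clean:.3f}, perturbed error {err_pert:.3f}")
```

Sweeping `omega` from near zero upward in this sketch shows the qualitative behavior the paper analyzes: for small $\omega$ the error barely moves, while larger $\omega$ degrades it noticeably.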
