Under-bagging Nearest Neighbors for Imbalanced Classification

1 September 2021 · arXiv:2109.00531

H. Hang, Yuchao Cai, Hanfang Yang, Zhouchen Lin
Abstract

In this paper, we propose an ensemble learning algorithm called under-bagging k-nearest neighbors (under-bagging k-NN) for imbalanced classification problems. On the theoretical side, by developing a new learning theory analysis, we show that with properly chosen parameters, i.e., the number of nearest neighbors k, the expected sub-sample size s, and the number of bagging rounds B, optimal convergence rates for under-bagging k-NN can be achieved under mild assumptions with respect to the arithmetic mean (AM) of recalls. Moreover, we show that with a relatively small B, the expected sub-sample size s can be much smaller than the number of training data n at each bagging round, and the number of nearest neighbors k can be reduced simultaneously, especially when the data are highly imbalanced. This leads to substantially lower time complexity and roughly the same space complexity. On the practical side, we conduct numerical experiments that verify the theoretical benefits of the under-bagging technique, demonstrating the promising AM performance and efficiency of the proposed algorithm.
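As a rough illustration of the procedure described in the abstract, the sketch below implements one plausible reading of under-bagging k-NN with scikit-learn: each bagging round under-samples every class down to the minority-class size, fits a k-NN classifier on that balanced subsample, and the B probability estimates are averaged. The function name and the exact sampling scheme are our assumptions (here the subsample size is fixed rather than expected), not taken from the paper.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def under_bagging_knn(X_train, y_train, X_test, k=5, B=10, seed=0):
    """Fit B k-NN classifiers on class-balanced subsamples and soft-vote."""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(y_train, return_counts=True)
    n_min = counts.min()  # under-sample every class to the minority size
    probas = []
    for _ in range(B):
        # Draw an equal number of points from each class, so the subsample
        # size is n_min * len(classes), far below n when data are imbalanced.
        idx = np.concatenate([
            rng.choice(np.flatnonzero(y_train == c), size=n_min, replace=False)
            for c in classes
        ])
        clf = KNeighborsClassifier(n_neighbors=k)
        clf.fit(X_train[idx], y_train[idx])
        probas.append(clf.predict_proba(X_test))  # columns follow sorted classes
    # Average the B probability estimates and predict the argmax class.
    return classes[np.mean(probas, axis=0).argmax(axis=1)]
```

On a held-out test set, the AM of recalls used as the evaluation criterion above equals macro-averaged recall, which scikit-learn exposes as sklearn.metrics.balanced_accuracy_score.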
