The Resistance to Label Noise in K-NN and CNN Depends on its Concentration

We investigate the multi-class classification performance of K-Nearest Neighbors (K-NN) and Convolutional Neural Networks (CNNs) in the presence of label noise. We first show empirically that a CNN's prediction for a given test sample depends on the labels of the training samples in its local neighborhood. This motivates us to derive a realizable analytic expression that approximates the multi-class K-NN classification error in the presence of label noise, which is of independent importance. We then suggest that this expression for K-NN may serve as a first-order approximation of the CNN error. Finally, we demonstrate empirically the proximity of the developed expression to the observed performance of K-NN and CNN classifiers. Our results may explain the previously observed, surprising resistance of CNNs to some types of label noise. In particular, they characterize an important factor in this resistance by showing that the more concentrated the noise is in the data, the greater the degradation in performance.
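
The contrast between scattered and concentrated noise can be illustrated with a small experiment. The following sketch is not the authors' setup; the noise rate, the anchor-based concentration scheme, and all other parameters are illustrative assumptions. It corrupts a synthetic training set in two ways, uniformly scattered flips versus flips concentrated in one local neighborhood, and compares the resulting K-NN test errors:

    # Minimal sketch (illustrative, not the paper's experimental setup):
    # compare K-NN test error under uniformly scattered vs. locally
    # concentrated label noise on synthetic multi-class data.
    import numpy as np
    from sklearn.datasets import make_blobs
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)
    X, y = make_blobs(n_samples=4000, centers=3, cluster_std=2.0, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

    def flip_labels(y, idx, n_classes, rng):
        """Flip the labels at `idx` to a uniformly random *different* class."""
        y = y.copy()
        y[idx] = (y[idx] + rng.integers(1, n_classes, size=len(idx))) % n_classes
        return y

    noise_rate, n_classes = 0.3, 3  # assumed noise level, chosen for illustration
    n_noisy = int(noise_rate * len(y_tr))

    # Uniform noise: flipped samples are spread over the whole training set.
    uniform_idx = rng.choice(len(y_tr), size=n_noisy, replace=False)

    # Concentrated noise: flip the samples closest to a random anchor point,
    # so the corrupted labels dominate one local neighborhood.
    anchor = X_tr[rng.integers(len(X_tr))]
    concentrated_idx = np.argsort(np.linalg.norm(X_tr - anchor, axis=1))[:n_noisy]

    for name, idx in [("uniform", uniform_idx), ("concentrated", concentrated_idx)]:
        y_noisy = flip_labels(y_tr, idx, n_classes, rng)
        knn = KNeighborsClassifier(n_neighbors=11).fit(X_tr, y_noisy)
        print(f"{name:>12} noise: test error = {1 - knn.score(X_te, y_te):.3f}")

Under this toy setup, uniform noise leaves the majority vote of the 11 nearest neighbors mostly intact, while concentrated noise fully corrupts one region, so the classifier fails on test points falling there, consistent with the abstract's claim that more concentrated noise causes greater degradation.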