Robust Classification with Noisy Labels Based on Posterior Maximization

Abstract

Designing objective functions robust to label noise is crucial for real-world classification algorithms. In this paper, we investigate the robustness to label noise of an f-divergence-based class of objective functions recently proposed for supervised classification, herein referred to as f-PML. We show that, in the presence of label noise, any of the f-PML objective functions can be corrected so that training recovers the neural network that would be learned from the clean dataset. Additionally, we propose an alternative and novel correction approach that, during the test phase, refines the posterior estimated by the neural network trained in the presence of label noise. Then, we demonstrate that, even though the considered f-PML objective functions are not symmetric, they are robust to symmetric label noise for any choice of f-divergence, without the need for any correction approach. This allows us to prove that the cross-entropy, which belongs to the f-PML class, is robust to symmetric label noise. Finally, we show that this class of objective functions can be used together with refined training strategies, achieving competitive performance against state-of-the-art techniques for classification with noisy labels.
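The abstract does not spell out the paper's correction procedure. As a hypothetical illustration of the general idea of loss correction under a known noise model, the sketch below implements the standard forward correction of the cross-entropy with a symmetric noise transition matrix (a technique from the loss-correction literature; the function names and the choice of correction are assumptions, not necessarily the paper's f-PML correction):

```python
import numpy as np

def symmetric_noise_matrix(num_classes: int, noise_rate: float) -> np.ndarray:
    """Transition matrix T with T[i, j] = P(observed label j | true label i)
    under symmetric label noise: a label flips with probability `noise_rate`,
    uniformly to one of the other classes."""
    off_diag = noise_rate / (num_classes - 1)
    T = np.full((num_classes, num_classes), off_diag)
    np.fill_diagonal(T, 1.0 - noise_rate)
    return T

def forward_corrected_ce(probs: np.ndarray, noisy_labels: np.ndarray,
                         T: np.ndarray) -> float:
    """Cross-entropy on the noisy labels after pushing the model's clean-class
    posterior estimate through the noise channel: P(noisy | x) = P(clean | x) @ T.
    Minimizing this loss on noisy data targets the clean posterior."""
    noisy_probs = probs @ T
    picked = noisy_probs[np.arange(len(noisy_labels)), noisy_labels]
    return float(-np.mean(np.log(picked + 1e-12)))

# Example: 3 classes, 20% symmetric noise, two predictions.
T = symmetric_noise_matrix(3, 0.2)
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])
noisy_labels = np.array([0, 1])
loss = forward_corrected_ce(probs, noisy_labels, T)
```

Here the correction is applied at training time; the abstract's second, test-phase approach would instead invert the noise channel on the learned posterior at inference.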

@article{novello2025_2504.06805,
  title={Robust Classification with Noisy Labels Based on Posterior Maximization},
  author={Nicola Novello and Andrea M. Tonello},
  journal={arXiv preprint arXiv:2504.06805},
  year={2025}
}