Denoising Mutual Knowledge Distillation in Bi-Directional Multiple Instance Learning

Abstract

Multiple Instance Learning (MIL) is the predominant method for Whole Slide Image classification in digital pathology, enabling slide-level labels to supervise model training. Although MIL eliminates the tedious fine-grained annotation required for fully supervised learning, whether it can learn accurate bag- and instance-level classifiers remains an open question. To address this issue, prior work incorporated instance-level classifiers and instance masks to ground predictions on supporting patches. While these methods improve the performance of MIL in practice, they may introduce noisy pseudo-labels. We propose to bridge the gap between commonly used MIL and fully supervised learning by augmenting both the bag- and instance-level learning processes with pseudo-label correction capabilities elicited from weak-to-strong generalization techniques. The proposed algorithm improves the performance of dual-level MIL algorithms on both bag- and instance-level predictions. Experiments on public pathology datasets demonstrate the advantage of the proposed method.
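To make the bag/instance distinction concrete, the following is a minimal sketch of attention-based MIL pooling, the common mechanism for aggregating patch (instance) embeddings into a slide (bag) representation. The function names, dimensions, and random parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D array
    e = np.exp(x - x.max())
    return e / e.sum()

def mil_attention_pool(X, V, w):
    """Attention-based MIL pooling (illustrative sketch).

    X: (N, D) instance (patch) embeddings for one bag (slide)
    V: (D, H), w: (H,) attention parameters (learned in practice,
       random here for demonstration)
    Returns the (D,) bag embedding and the (N,) attention weights,
    which can serve as soft instance-level evidence.
    """
    scores = np.tanh(X @ V) @ w   # (N,) per-instance attention logits
    a = softmax(scores)           # attention weights, sum to 1
    bag = a @ X                   # (D,) weighted bag representation
    return bag, a

rng = np.random.default_rng(0)
X = rng.normal(size=(12, 8))      # 12 patches, 8-dim features
V = rng.normal(size=(8, 4))
w = rng.normal(size=4)
bag, a = mil_attention_pool(X, V, w)
```

A bag-level classifier operates on `bag`, while the attention weights `a` give a (noisy) instance-level signal; dual-level methods of the kind discussed above additionally attach explicit instance classifiers whose pseudo-labels then need correction.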

@article{shu2025_2505.12074,
  title={Denoising Mutual Knowledge Distillation in Bi-Directional Multiple Instance Learning},
  author={Chen Shu and Boyu Fu and Yiman Li and Ting Yin and Wenchuan Zhang and Jie Chen and Yuhao Yi and Hong Bu},
  journal={arXiv preprint arXiv:2505.12074},
  year={2025}
}