Reliable Few-shot Learning under Dual Noises

Main: 12 pages
7 figures
Bibliography: 4 pages
1 table
Appendix: 1 page
Abstract

Recent advances in model pre-training give rise to task adaptation-based few-shot learning (FSL), where the goal is to adapt a pre-trained task-agnostic model to capture task-specific knowledge with a few labeled support samples of the target task. However, existing approaches may still fail in the open world due to the inevitable in-distribution (ID) and out-of-distribution (OOD) noise from both support and query samples of the target task. With limited support samples available, i) the adverse effect of the dual noises can be severely amplified during task adaptation, and ii) the adapted model can produce unreliable predictions on query samples in the presence of the dual noises. In this work, we propose DEnoised Task Adaptation (DETA++) for reliable FSL. DETA++ uses a Contrastive Relevance Aggregation (CoRA) module to calculate image and region weights for support samples, based on which a clean prototype loss and a noise entropy maximization loss are proposed to achieve noise-robust task adaptation. Additionally, DETA++ employs a memory bank to store and refine clean regions for each inner-task class, based on which a Local Nearest Centroid Classifier (LocalNCC) is devised to yield noise-robust predictions on query samples. Moreover, DETA++ utilizes an Intra-class Region Swapping (IntraSwap) strategy to rectify ID class prototypes during task adaptation, enhancing the model's robustness to the dual noises. Extensive experiments demonstrate the effectiveness and flexibility of DETA++.
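The abstract only sketches LocalNCC, but the name suggests a nearest centroid classifier applied at the region level: clean-region features stored in the memory bank yield one centroid per class, and a query image is classified by how well its regions match those centroids. Below is a minimal, hedged NumPy sketch of that idea; the function name, feature shapes, cosine-similarity metric, and max-over-regions aggregation are all assumptions, not the paper's actual implementation.

```python
import numpy as np

def local_ncc_predict(query_regions, memory_bank):
    """Hypothetical sketch of a region-level nearest centroid classifier.

    query_regions: (R, D) array of region features for one query image.
    memory_bank: dict mapping class id -> (N_c, D) array of stored
        clean-region features for that class (as curated by CoRA).
    Returns the predicted class id.
    """
    # One centroid per class, averaged over the stored clean regions.
    classes = sorted(memory_bank)
    centroids = np.stack([memory_bank[c].mean(axis=0) for c in classes])
    # L2-normalize so dot products act as cosine similarities.
    centroids /= np.linalg.norm(centroids, axis=1, keepdims=True)
    q = query_regions / np.linalg.norm(query_regions, axis=1, keepdims=True)
    # Similarity of every query region to every class centroid: (R, C).
    sims = q @ centroids.T
    # Aggregate region-level scores into an image-level score per class
    # (here: best-matching region), then pick the highest-scoring class.
    image_scores = sims.max(axis=0)
    return classes[int(image_scores.argmax())]
```

The region-level match is what makes such a classifier robust to noise: an OOD background region in the query contributes nothing once a cleaner region matches its class centroid better.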

@article{zhang2025_2506.16330,
  title={Reliable Few-shot Learning under Dual Noises},
  author={Ji Zhang and Jingkuan Song and Lianli Gao and Nicu Sebe and Heng Tao Shen},
  journal={arXiv preprint arXiv:2506.16330},
  year={2025}
}