Asymmetric Duos: Sidekicks Improve Uncertainty

24 May 2025
Tim G. Zhou, Evan Shelhamer, Geoff Pleiss
Community: UQCV
Links: arXiv (abs) | PDF | HTML
Main: 9 pages · Figures: 14 · Bibliography: 4 pages · Appendix: 11 pages
Abstract

The go-to strategy for applying deep networks in settings where uncertainty informs decisions, ensembling multiple training runs with random initializations, is ill-suited to the extremely large-scale models and practical fine-tuning workflows of today. We introduce a new cost-effective strategy for improving the uncertainty quantification and downstream decisions of a large model (e.g., a fine-tuned ViT-B): coupling it with a less accurate but much smaller "sidekick" (e.g., a fine-tuned ResNet-34) that costs a fraction of the computation. We propose aggregating the predictions of this Asymmetric Duo by simple learned weighted averaging. Surprisingly, despite their inherent asymmetry, the sidekick model almost never harms the performance of the larger model. In fact, across five image classification benchmarks and a variety of model architectures and training schemes (including soups), Asymmetric Duos significantly improve accuracy, uncertainty quantification, and selective classification metrics with only ~10-20% more computation.
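The combination rule described in the abstract, learned weighted averaging of two models' predictions, can be sketched as follows. This is a minimal illustration, assuming the duo mixes softmax probabilities with a single sigmoid-parameterized scalar fit on held-out data; the names (AsymmetricDuo, fit_weight) and the exact parameterization are assumptions for illustration, not the paper's released code.

```python
import torch

class AsymmetricDuo(torch.nn.Module):
    """Hypothetical sketch: a large model plus a small 'sidekick',
    combined by a single learned mixing weight."""

    def __init__(self, big_model, sidekick):
        super().__init__()
        self.big = big_model
        self.sidekick = sidekick
        # Raw scalar; sigmoid keeps the mixing weight in (0, 1).
        # Initialized so the big model dominates at the start.
        self.alpha = torch.nn.Parameter(torch.tensor(2.0))

    def forward(self, x):
        w = torch.sigmoid(self.alpha)
        p_big = torch.softmax(self.big(x), dim=-1)
        p_small = torch.softmax(self.sidekick(x), dim=-1)
        # Weighted average of class probabilities.
        return w * p_big + (1.0 - w) * p_small

def fit_weight(duo, val_loader, steps=100, lr=0.1):
    """Fit only the mixing weight on held-out data; both backbones
    stay frozen because only `alpha` is passed to the optimizer."""
    duo.eval()
    opt = torch.optim.Adam([duo.alpha], lr=lr)
    for _ in range(steps):
        for x, y in val_loader:
            log_p = torch.log(duo(x) + 1e-12)
            loss = torch.nn.functional.nll_loss(log_p, y)
            opt.zero_grad()
            loss.backward()
            opt.step()
```

Fitting a single scalar on a validation split is cheap, which matches the abstract's emphasis on only ~10-20% extra computation coming from the sidekick's forward passes rather than from any heavyweight combination scheme.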

@article{zhou2025_2505.18636,
  title={Asymmetric Duos: Sidekicks Improve Uncertainty},
  author={Tim G. Zhou and Evan Shelhamer and Geoff Pleiss},
  journal={arXiv preprint arXiv:2505.18636},
  year={2025}
}