
Autoadaptive Medical Segment Anything Model

Tyler Ward
Meredith K. Owen
O'Kira Coleman
Brian Noehren
Abdullah-Al-Zubaer Imran
8 pages (main), 3 pages (bibliography), 2 figures, 3 tables
Abstract

Medical image segmentation is a key task in the imaging workflow, influencing many image-based decisions. Traditional, fully supervised segmentation models rely on large amounts of labeled training data, typically obtained through manual annotation, which can be an expensive, time-consuming, and error-prone process. This signals a need for accurate, automatic, and annotation-efficient methods of training these models. We propose ADA-SAM (automated, domain-specific, and adaptive segment anything model), a novel multitask learning framework for medical image segmentation that leverages class activation maps from an auxiliary classifier to guide the predictions of the semi-supervised segmentation branch, which is based on the Segment Anything (SAM) framework. Additionally, ADA-SAM employs a novel gradient feedback mechanism that creates a learnable connection between the segmentation and classification branches, using the segmentation gradients to guide and improve the classification predictions. We validate ADA-SAM on real-world clinical data collected during rehabilitation trials and demonstrate that our method outperforms both fully supervised and semi-supervised baselines by double digits in limited-label settings. Our code is available at: this https URL.
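The abstract describes two coupled mechanisms: classifier-derived class activation maps (CAMs) that guide the SAM-based segmentation branch, and a gradient feedback path through which segmentation gradients refine the classifier. The sketch below is a minimal, hypothetical PyTorch illustration of that coupling; TinyClassifier, TinySegmenter, the shapes, and the losses are illustrative assumptions, not the authors' actual ADA-SAM architecture.

# Hypothetical sketch: CAM-guided segmentation with a gradient feedback
# link between branches. All module names and shapes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyClassifier(nn.Module):
    def __init__(self, in_ch=1, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv2d(32, n_classes, 1)  # 1x1 conv -> per-class maps

    def forward(self, x):
        fmap = self.head(self.features(x))   # (B, C, H, W) class maps
        logits = fmap.mean(dim=(2, 3))       # global average pooling -> (B, C)
        cam = torch.relu(fmap)               # class activation maps
        return logits, cam

class TinySegmenter(nn.Module):
    """Stand-in for the SAM-based branch; consumes image plus CAM guidance."""
    def __init__(self, in_ch=1, n_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch + n_classes, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),
        )

    def forward(self, x, cam):
        cam = F.interpolate(cam, size=x.shape[-2:], mode="bilinear",
                            align_corners=False)
        return self.net(torch.cat([x, cam], dim=1))  # CAM guides the mask

clf, seg = TinyClassifier(), TinySegmenter()
opt = torch.optim.Adam(list(clf.parameters()) + list(seg.parameters()), lr=1e-3)

x = torch.randn(4, 1, 64, 64)                          # toy image batch
y_cls = torch.randint(0, 2, (4,))                      # image-level labels
y_mask = torch.randint(0, 2, (4, 1, 64, 64)).float()   # pixel-level labels

logits, cam = clf(x)
mask = seg(x, cam)
loss_cls = F.cross_entropy(logits, y_cls)
loss_seg = F.binary_cross_entropy_with_logits(mask, y_mask)

# "Gradient feedback" in this toy setup: the CAM stays in the autograd
# graph (it is not detached), so backpropagating the segmentation loss
# sends gradients through the CAM into the classifier, letting
# segmentation errors also refine the classification predictions.
(loss_cls + loss_seg).backward()
opt.step()

In this simplified form the feedback is just shared backpropagation through an undetached CAM; the paper's learnable connection between branches is presumably more elaborate.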

@article{ward2025_2507.01828,
  title={Autoadaptive Medical Segment Anything Model},
  author={Tyler Ward and Meredith K. Owen and O'Kira Coleman and Brian Noehren and Abdullah-Al-Zubaer Imran},
  journal={arXiv preprint arXiv:2507.01828},
  year={2025}
}