End-to-End Deep Learning for Structural Brain Imaging: A Unified Framework

23 February 2025
Yao Su
Keqi Han
Mingjie Zeng
Lichao Sun
Liang Zhan
Carl Yang
Lifang He
Xiangnan Kong
Abstract

Brain imaging analysis is fundamental in neuroscience, providing valuable insights into brain structure and function. Traditional workflows follow a sequential pipeline (brain extraction, registration, segmentation, parcellation, network generation, and classification), treating each step as an independent task. These methods rely heavily on task-specific training data and expert intervention to correct intermediate errors, making them particularly burdensome for high-dimensional neuroimaging data, where annotation and quality control are costly and time-consuming. We introduce UniBrain, a unified end-to-end framework that integrates all processing steps into a single optimization process, allowing tasks to interact with and refine each other. Unlike traditional approaches that require extensive task-specific annotations, UniBrain operates with minimal supervision, leveraging only low-cost labels (i.e., classification and extraction) and a single labeled atlas. By jointly optimizing extraction, registration, segmentation, parcellation, network generation, and classification, UniBrain improves both accuracy and computational efficiency while significantly reducing annotation effort. Experimental results demonstrate its superiority over existing methods across multiple tasks, offering a more scalable and reliable solution for neuroimaging analysis. Our code and data can be found at this https URL
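The key idea in the abstract, replacing a sequential pipeline with a single jointly optimized objective, can be sketched as a weighted sum of per-task losses. This is a hypothetical illustration, not the authors' implementation: the function name, loss terms, and weights below are all assumptions chosen to show how supervised terms (extraction, classification) and an unsupervised registration-similarity term could be optimized together.

```python
# Hypothetical sketch (not the UniBrain code): a joint objective in the
# spirit described in the abstract, where all stages contribute to one
# optimization target instead of being trained as independent steps.

def joint_loss(ext_loss, cls_loss, reg_sim_loss,
               w_ext=1.0, w_cls=1.0, w_reg=0.5):
    """Weighted sum of per-task losses.

    ext_loss     -- supervised brain-extraction loss (low-cost labels)
    cls_loss     -- supervised classification loss (low-cost labels)
    reg_sim_loss -- unsupervised image-similarity term for registration
                    against the single labeled atlas
    Weights are illustrative, not taken from the paper.
    """
    return w_ext * ext_loss + w_cls * cls_loss + w_reg * reg_sim_loss

# Example: three scalar task losses from one forward pass are combined
# into a single value that a shared optimizer would minimize.
total = joint_loss(ext_loss=0.2, cls_loss=0.6, reg_sim_loss=0.4)
```

Because every stage's gradient flows through the same objective, errors in one step (e.g., registration) can be corrected by signal from another (e.g., classification), which is the interaction the abstract attributes to end-to-end training.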

@article{su2025_2502.18523,
  title={End-to-End Deep Learning for Structural Brain Imaging: A Unified Framework},
  author={Yao Su and Keqi Han and Mingjie Zeng and Lichao Sun and Liang Zhan and Carl Yang and Lifang He and Xiangnan Kong},
  journal={arXiv preprint arXiv:2502.18523},
  year={2025}
}