
ChronoFormer: Time-Aware Transformer Architectures for Structured Clinical Event Modeling

Abstract

The temporal complexity of electronic health record (EHR) data presents significant challenges for predicting clinical outcomes with machine learning. This paper proposes ChronoFormer, a novel transformer-based architecture specifically designed to encode and exploit temporal dependencies in longitudinal patient data. ChronoFormer integrates temporal embeddings, hierarchical attention mechanisms, and domain-specific masking techniques. Extensive experiments on three benchmark tasks (mortality prediction, readmission prediction, and long-term comorbidity onset) demonstrate substantial improvements over current state-of-the-art methods. Furthermore, detailed analyses of attention patterns underscore ChronoFormer's capability to capture clinically meaningful long-range temporal relationships.
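The abstract does not specify how the temporal embeddings are computed. As a rough, assumption-based illustration of the general idea, a time-aware transformer input layer can combine a learned embedding of each clinical event code with a continuous-time encoding of when the event occurred. The sketch below is a minimal example of that pattern, not the authors' ChronoFormer implementation; the class name TemporalEmbedding, the sinusoidal time encoding, and all hyperparameters are hypothetical choices.

import torch
import torch.nn as nn

class TemporalEmbedding(nn.Module):
    """Hypothetical sketch: sum a learned per-event-code embedding with a
    sinusoidal encoding of the continuous time offset (e.g. days since
    admission). One common way to build time-aware transformer inputs;
    not the paper's exact layer."""

    def __init__(self, vocab_size: int, d_model: int):
        super().__init__()
        assert d_model % 2 == 0, "sinusoidal encoding needs an even d_model"
        self.event_emb = nn.Embedding(vocab_size, d_model)
        self.d_model = d_model

    def time_encoding(self, t: torch.Tensor) -> torch.Tensor:
        # t: (batch, seq_len) float time offsets.
        half = self.d_model // 2
        freqs = torch.exp(
            -torch.arange(half, device=t.device, dtype=t.dtype)
            * (torch.log(torch.tensor(10000.0)) / half)
        )
        angles = t.unsqueeze(-1) * freqs  # (batch, seq_len, half)
        return torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)

    def forward(self, codes: torch.Tensor, times: torch.Tensor) -> torch.Tensor:
        # codes: (batch, seq_len) integer event codes
        # times: (batch, seq_len) float time offsets of each event
        return self.event_emb(codes) + self.time_encoding(times)

# Toy usage: 2 patients x 5 events, fed into a standard transformer encoder.
emb = TemporalEmbedding(vocab_size=1000, d_model=64)
codes = torch.randint(0, 1000, (2, 5))
times = torch.rand(2, 5) * 30.0  # time offsets in days
x = emb(codes, times)            # (2, 5, 64)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True),
    num_layers=2,
)
print(encoder(x).shape)  # torch.Size([2, 5, 64])

Because the time encoding is added at the input, attention scores can depend on the absolute timing of events, which is one way an architecture of this kind could learn the long-range temporal relationships the abstract describes.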

@article{zhang2025_2504.07373,
  title={ChronoFormer: Time-Aware Transformer Architectures for Structured Clinical Event Modeling},
  author={Yuanyun Zhang and Shi Li},
  journal={arXiv preprint arXiv:2504.07373},
  year={2025}
}