What Time Tells Us? An Explorative Study of Time Awareness Learned from Static Images

Time becomes visible through illumination changes in the scenes we observe. Inspired by this, in this paper we explore the potential to learn time awareness from static images, aiming to answer the question: what does time tell us? To this end, we first introduce a Time-Oriented Collection (TOC) dataset, which contains 130,906 images with reliable timestamps. Leveraging this dataset, we propose a Time-Image Contrastive Learning (TICL) approach that jointly models timestamps and the corresponding visual representations through cross-modal contrastive learning. We find that TICL 1) achieves state-of-the-art performance on the timestamp estimation task across various benchmark metrics, and 2) interestingly, despite being trained only on static images, learns time-aware embeddings that transfer well to several time-aware downstream tasks such as time-based image retrieval, video scene classification, and time-aware image editing. Our findings suggest that time-related visual cues can be learned from static images and benefit a variety of vision tasks, laying a foundation for future research on understanding time-related visual context. Project page: this https URL.
@article{lin2025_2503.17899,
  title={What Time Tells Us? An Explorative Study of Time Awareness Learned from Static Images},
  author={Dongheng Lin and Han Hu and Jianbo Jiao},
  journal={arXiv preprint arXiv:2503.17899},
  year={2025}
}
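The abstract describes TICL as jointly modeling timestamps and visual representations via cross-modal contrastive learning. The sketch below illustrates one plausible form of such an objective: a CLIP-style symmetric InfoNCE loss between image embeddings and timestamp embeddings. Both the cyclic hour-of-day encoding (`timestamp_features`) and the specific loss are assumptions for illustration; the paper's actual encoders and training objective are not specified in the abstract.

```python
import numpy as np

def timestamp_features(hours):
    # Hypothetical cyclic encoding of hour-of-day, so that 23:00 and
    # 00:00 map to nearby points (the paper's timestamp encoder may differ).
    angles = 2.0 * np.pi * np.asarray(hours, dtype=float) / 24.0
    return np.stack([np.sin(angles), np.cos(angles)], axis=1)

def symmetric_infonce(img_emb, time_emb, temperature=0.07):
    # L2-normalize both modalities, then apply a symmetric cross-entropy
    # over the pairwise similarity matrix: each image should match its own
    # timestamp (row direction) and vice versa (column direction).
    img = img_emb / np.linalg.norm(img_emb, axis=1, keepdims=True)
    tim = time_emb / np.linalg.norm(time_emb, axis=1, keepdims=True)
    logits = img @ tim.T / temperature
    n = logits.shape[0]

    def xent(l):
        # Mean negative log-softmax of the diagonal (matched pairs).
        l = l - l.max(axis=1, keepdims=True)
        logp = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -np.mean(logp[np.arange(n), np.arange(n)])

    return 0.5 * (xent(logits) + xent(logits.T))
```

Under this objective, correctly paired image/time embeddings yield a lower loss than mismatched ones, which is the signal that would shape the time-aware embedding space the abstract refers to.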