ResearchTrend.AI
S2R-HDR: A Large-Scale Rendered Dataset for HDR Fusion

10 April 2025
Yujin Wang
Jiarui Wu
Yichen Bian
Fan Zhang
Tianfan Xue
Abstract

The generalization of learning-based high dynamic range (HDR) fusion is often limited by the availability of training data, as collecting large-scale HDR images of dynamic scenes is both costly and technically challenging. To address this, we propose S2R-HDR, the first large-scale, high-quality synthetic dataset for HDR fusion, comprising 24,000 HDR samples. Using Unreal Engine 5, we design a diverse set of realistic HDR scenes encompassing varied dynamic elements, motion types, high-dynamic-range conditions, and lighting, and we develop an efficient rendering pipeline to generate realistic HDR images. To further mitigate the domain gap between synthetic and real-world data, we introduce S2R-Adapter, a domain adaptation method designed to bridge this gap and enhance the generalization ability of models. Experimental results on real-world datasets demonstrate that our approach achieves state-of-the-art HDR reconstruction performance. Dataset and code will be available at this https URL.

@article{wang2025_2504.07667,
  title={S2R-HDR: A Large-Scale Rendered Dataset for HDR Fusion},
  author={Yujin Wang and Jiarui Wu and Yichen Bian and Fan Zhang and Tianfan Xue},
  journal={arXiv preprint arXiv:2504.07667},
  year={2025}
}