IRIS: Inverse Rendering of Indoor Scenes from Low Dynamic Range Images

23 January 2024 · 3DV

Zhi-Hao Lin, Jia-Bin Huang, Zhengqin Li, Zhao Dong, Christian Richardt, Tuotuo Li, Michael Zollhöfer, Johannes Kopf, Shenlong Wang, Changil Kim
Abstract

Inverse rendering seeks to recover 3D geometry, surface material, and lighting from captured images, enabling advanced applications such as novel-view synthesis, relighting, and virtual object insertion. However, most existing techniques rely on high dynamic range (HDR) images as input, limiting accessibility for general users. In response, we introduce IRIS, an inverse rendering framework that recovers the physically based material, spatially-varying HDR lighting, and camera response functions from multi-view, low dynamic range (LDR) images. By eliminating the dependence on HDR input, we make inverse rendering technology more accessible. We evaluate our approach on real-world and synthetic scenes and compare it with state-of-the-art methods. Our results show that IRIS effectively recovers HDR lighting, accurate material, and plausible camera response functions, supporting photorealistic relighting and object insertion.
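The abstract's key idea is that LDR pixel values are HDR scene radiance passed through a camera response function (CRF), so inverting an estimated CRF linearizes LDR images back toward radiance. The sketch below illustrates this relationship with a simple gamma CRF; this is a minimal assumption-laden toy, not the paper's method, which jointly estimates the CRF along with material and spatially-varying lighting.

```python
import numpy as np

def apply_crf(radiance, gamma=2.2):
    """Simulate a camera: clip HDR radiance to [0, 1] and apply a gamma CRF,
    producing an LDR value. (Gamma is an assumed toy CRF, not IRIS's model.)"""
    return np.clip(radiance, 0.0, 1.0) ** (1.0 / gamma)

def inverse_crf(ldr, gamma=2.2):
    """Linearize LDR values back to radiance. Only valid below the clipping
    point; recovering clipped highlights needs priors, as IRIS learns."""
    return ldr ** gamma

# Unclipped radiances round-trip exactly through the CRF and its inverse.
hdr = np.array([0.05, 0.25, 0.8])
ldr = apply_crf(hdr)
recovered = inverse_crf(ldr)
assert np.allclose(recovered, hdr)
```

Radiance that exceeds the clipping point is lost by `apply_crf` and cannot be recovered by inversion alone, which is why LDR-based inverse rendering must infer the missing HDR content rather than simply linearize.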

@article{lin2025_2401.12977,
  title={IRIS: Inverse Rendering of Indoor Scenes from Low Dynamic Range Images},
  author={Zhi-Hao Lin and Jia-Bin Huang and Zhengqin Li and Zhao Dong and Christian Richardt and Tuotuo Li and Michael Zollhöfer and Johannes Kopf and Shenlong Wang and Changil Kim},
  journal={arXiv preprint arXiv:2401.12977},
  year={2025}
}