TransGI: Real-Time Dynamic Global Illumination With Object-Centric Neural Transfer Model

Abstract

Neural rendering algorithms have revolutionized computer graphics, yet their impact on real-time rendering under arbitrary lighting conditions remains limited due to strict latency constraints in practical applications. The key challenge lies in formulating a compact yet expressive material representation. To address this, we propose TransGI, a novel neural rendering method for real-time, high-fidelity global illumination. It comprises an object-centric neural transfer model for material representation and a radiance-sharing lighting system for efficient illumination. Traditional BSDF representations and spatial neural material representations lack expressiveness, requiring thousands of ray evaluations to converge to noise-free colors; conversely, real-time methods trade quality for efficiency by supporting only diffuse materials. In contrast, our object-centric neural transfer model achieves both compactness and expressiveness through an MLP-based decoder and vertex-attached latent features, supporting glossy effects with low memory overhead. For dynamic, varying lighting conditions, we introduce local light probes that capture scene radiance, coupled with an across-probe radiance-sharing strategy for efficient probe generation. We implemented our method in a real-time rendering engine, combining compute shaders with CUDA-based neural networks. Experimental results demonstrate that our method renders a frame in under 10 ms while achieving significantly improved rendering quality compared to baseline methods.
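To make the object-centric transfer model concrete, here is a minimal NumPy sketch of the idea the abstract describes: each mesh vertex carries a small learned latent feature, and a single shared MLP decoder maps that latent plus a view/lighting encoding to a shaded color. All dimensions, the two-layer architecture, and the conditioning scheme are illustrative assumptions, not the paper's actual network.

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 16   # per-vertex latent size (assumed)
COND_DIM = 6      # view/light direction encoding size (assumed)
HIDDEN = 32       # hidden width of the shared decoder (assumed)

# Shared decoder weights: one small network serves the whole object,
# so per-object memory is dominated by the compact vertex latents.
W1 = rng.standard_normal((LATENT_DIM + COND_DIM, HIDDEN)) * 0.1
b1 = np.zeros(HIDDEN)
W2 = rng.standard_normal((HIDDEN, 3)) * 0.1
b2 = np.zeros(3)

def decode(latent, cond):
    """Map a vertex-attached latent + conditioning vector to RGB in [0, 1]."""
    x = np.concatenate([latent, cond])
    h = np.maximum(W1.T @ x + b1, 0.0)               # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(W2.T @ h + b2)))    # sigmoid output

# Vertex-attached latents for a toy mesh with 4 vertices.
vertex_latents = rng.standard_normal((4, LATENT_DIM))
cond = rng.standard_normal(COND_DIM)                 # shared view/light encoding

colors = np.array([decode(z, cond) for z in vertex_latents])
print(colors.shape)  # (4, 3): one RGB color per vertex
```

In a real renderer the latents would be trained jointly with the decoder and evaluated per-vertex on the GPU; this sketch only illustrates why the representation is compact (small per-vertex features, one shared decoder) yet expressive enough to condition on view and lighting.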

@article{deng2025_2506.09909,
  title={TransGI: Real-Time Dynamic Global Illumination With Object-Centric Neural Transfer Model},
  author={Yijie Deng and Lei Han and Lu Fang},
  journal={arXiv preprint arXiv:2506.09909},
  year={2025}
}
Comments: 14 pages (main text) + 2 pages (bibliography), 15 figures, 6 tables