Neural Light Spheres for Implicit Image Stitching and View Synthesis

26 September 2024
Ilya Chugunov, Amogh Joshi, Kiran Murthy, Francois Bleibel, Felix Heide
Abstract

Challenging to capture and challenging to display on a cellphone screen, the panorama paradoxically remains both a staple and an underused feature of modern mobile camera applications. In this work we address both of these challenges with a spherical neural light field model for implicit panoramic image stitching and re-rendering, able to accommodate depth parallax, view-dependent lighting, and local scene motion and color changes during capture. Fit at test time to a panoramic video captured along an arbitrary path -- vertical, horizontal, or random-walk -- these neural light spheres jointly estimate the camera path and a high-resolution scene reconstruction to produce novel wide field-of-view projections of the environment. Our single-layer model avoids expensive volumetric sampling and decomposes the scene into compact view-dependent ray offset and color components, with a total model size of 80 MB per scene and real-time (50 FPS) rendering at 1080p resolution. We demonstrate improved reconstruction quality over traditional image stitching and radiance field methods, with significantly higher tolerance to scene motion and non-ideal capture settings.
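The abstract describes a single-layer model that skips volumetric sampling and splits the scene into view-dependent ray-offset and color components queried on a sphere. The PyTorch sketch below is only a rough illustration of what such a query could look like; the class name, positional encoding, network sizes, and the simplification that rays originate at the sphere center are assumptions made for this example, not the authors' implementation.

import torch
import torch.nn as nn

class LightSphereSketch(nn.Module):
    """Illustrative single-layer light-sphere query (hypothetical, not the paper's code)."""

    def __init__(self, enc_dim: int = 6, hidden: int = 256):
        super().__init__()
        # Encoded sphere hit point + encoded view direction.
        in_dim = 2 * (2 * enc_dim + 1) * 3
        # View-dependent ray offset head.
        self.offset_net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, 3)
        )
        # View-dependent color head, conditioned on the predicted offset.
        self.color_net = nn.Sequential(
            nn.Linear(in_dim + 3, hidden), nn.ReLU(), nn.Linear(hidden, 3)
        )
        self.enc_dim = enc_dim

    def encode(self, x: torch.Tensor) -> torch.Tensor:
        # Standard sinusoidal positional encoding (an assumption here).
        feats = [x]
        for k in range(self.enc_dim):
            feats += [torch.sin((2.0 ** k) * x), torch.cos((2.0 ** k) * x)]
        return torch.cat(feats, dim=-1)

    def forward(self, dirs: torch.Tensor) -> torch.Tensor:
        # Query the two heads at the ray's intersection with a unit sphere;
        # a single surface lookup, no volumetric sampling along the ray.
        dirs = dirs / dirs.norm(dim=-1, keepdim=True)
        hit = dirs  # simplification: rays assumed to originate at the sphere center
        feats = torch.cat([self.encode(hit), self.encode(dirs)], dim=-1)
        offset = self.offset_net(feats)                        # view-dependent ray offset
        rgb = self.color_net(torch.cat([feats, offset], -1))   # color at the offset point
        return torch.sigmoid(rgb)

A forward pass over a batch of ray directions of shape (N, 3) returns per-ray RGB; actually fitting such a model to a captured panorama would additionally require the joint camera-path estimation described in the abstract.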

@article{chugunov2025_2409.17924,
  title={Neural Light Spheres for Implicit Image Stitching and View Synthesis},
  author={Ilya Chugunov and Amogh Joshi and Kiran Murthy and Francois Bleibel and Felix Heide},
  journal={arXiv preprint arXiv:2409.17924},
  year={2025}
}