Compensating Spatiotemporally Inconsistent Observations for Online Dynamic 3D Gaussian Splatting

2 May 2025
Youngsik Yun
Jeongmin Bae
Hyunseung Son
Seoha Kim
Hahyun Lee
G. Bang
Youngjung Uh
3DGS
Abstract

Online reconstruction of dynamic scenes is significant because it enables learning scenes from live-streaming video inputs, whereas existing offline dynamic reconstruction methods rely on pre-recorded video. However, previous online reconstruction approaches have focused primarily on efficiency and rendering quality while overlooking the temporal consistency of their results, which often contain noticeable artifacts in static regions. This paper identifies that errors such as noise in real-world recordings cause temporal inconsistency in online reconstruction. We propose a method that enhances temporal consistency in online reconstruction from observations whose temporal inconsistency is unavoidable with real cameras. We show that our method restores the ideal observation by subtracting the learned error. We demonstrate that applying our method to various baselines significantly enhances both temporal consistency and rendering quality across datasets. Code, video results, and checkpoints are available at this https URL.
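The core idea stated in the abstract, modeling each observation as an ideal image plus a learnable error and restoring the ideal observation by subtracting that learned error, can be sketched roughly as below. This is a minimal illustration under stated assumptions, not the authors' implementation: the names ErrorMap, render_fn, gaussians, and training_step are hypothetical stand-ins, and the paper's actual error model and training details may differ.

import torch
import torch.nn as nn

class ErrorMap(nn.Module):
    """Hypothetical learnable per-frame error map (H x W x 3)."""
    def __init__(self, height, width):
        super().__init__()
        self.error = nn.Parameter(torch.zeros(height, width, 3))

    def forward(self):
        return self.error

def training_step(render_fn, gaussians, error_map, observed_frame, optimizer):
    # Render the current dynamic 3DGS state (render_fn is a placeholder
    # for whatever Gaussian splatting renderer the baseline uses).
    rendered = render_fn(gaussians)              # estimate of the ideal image
    # Model the noisy observation as ideal + error, so the Gaussians are
    # supervised toward the ideal signal while the error term absorbs the
    # spatiotemporally inconsistent component.
    predicted_obs = rendered + error_map()
    loss = torch.nn.functional.l1_loss(predicted_obs, observed_frame)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # The "restored" observation is the raw frame minus the learned error.
    restored = observed_frame - error_map().detach()
    return loss.item(), restored

In practice the error term would need some regularization (for example sparsity or temporal smoothness) to keep the decomposition from degenerating; the paper's formulation should be consulted for the concrete design.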

@article{yun2025_2505.01235,
  title={Compensating Spatiotemporally Inconsistent Observations for Online Dynamic 3D Gaussian Splatting},
  author={Youngsik Yun and Jeongmin Bae and Hyunseung Son and Seoha Kim and Hahyun Lee and Gun Bang and Youngjung Uh},
  journal={arXiv preprint arXiv:2505.01235},
  year={2025}
}