ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

v2 (latest)

Perceptual-GS: Scene-adaptive Perceptual Densification for Gaussian Splatting

14 June 2025
Hongbi Zhou
Zhangkai Ni
    3DGS
ArXiv (abs) · PDF · HTML
Main: 9 pages · Bibliography: 3 pages · Appendix: 14 pages · 16 figures · 39 tables
Abstract

3D Gaussian Splatting (3DGS) has emerged as a powerful technique for novel view synthesis. However, existing methods struggle to adaptively optimize the distribution of Gaussian primitives based on scene characteristics, making it challenging to balance reconstruction quality and efficiency. Inspired by human perception, we propose scene-adaptive perceptual densification for Gaussian Splatting (Perceptual-GS), a novel framework that integrates perceptual sensitivity into the 3DGS training process to address this challenge. We first introduce a perception-aware representation that models human visual sensitivity while constraining the number of Gaussian primitives. Building on this foundation, we develop a perceptual sensitivity-adaptive distribution to allocate finer Gaussian granularity to visually critical regions, enhancing reconstruction quality and robustness. Extensive evaluations on multiple datasets, including BungeeNeRF for large-scale scenes, demonstrate that Perceptual-GS achieves state-of-the-art performance in reconstruction quality, efficiency, and robustness. The code is publicly available at: this https URL
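The core idea in the abstract — allocate finer Gaussian granularity to visually critical regions — can be illustrated with a minimal sketch. This is not the paper's method: it substitutes a simple gradient-magnitude proxy for the learned perception-aware representation, and the function names and the scene-adaptive threshold heuristic are hypothetical.

```python
import numpy as np

def perceptual_sensitivity_map(image):
    """Toy per-pixel sensitivity proxy: local gradient magnitude,
    normalized to [0, 1]. A stand-in for the paper's learned
    perception-aware representation (assumption, not the actual model)."""
    gy, gx = np.gradient(image.astype(np.float64))
    mag = np.sqrt(gx**2 + gy**2)
    return mag / (mag.max() + 1e-8)

def adaptive_densify_mask(sensitivity, base_threshold=0.5):
    """Mark pixels whose sensitivity exceeds a threshold scaled by
    scene statistics; Gaussians projected onto these regions would be
    densified (split/cloned) more aggressively."""
    thr = base_threshold * sensitivity.mean() / 0.5  # scene-adaptive scaling (heuristic)
    return sensitivity > thr

# Usage: a synthetic image with one sharp vertical edge. Sensitivity
# concentrates at the edge, so only those columns are flagged for
# finer Gaussian granularity; flat regions are left coarse.
img = np.zeros((8, 8))
img[:, 4:] = 1.0
sens = perceptual_sensitivity_map(img)
mask = adaptive_densify_mask(sens)
```

The design point the sketch captures is that the densification budget is driven by a sensitivity map rather than applied uniformly, which is how the paper balances reconstruction quality against the number of primitives.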

@article{zhou2025_2506.12400,
  title={Perceptual-GS: Scene-adaptive Perceptual Densification for Gaussian Splatting},
  author={Hongbi Zhou and Zhangkai Ni},
  journal={arXiv preprint arXiv:2506.12400},
  year={2025}
}