PBR-SR: Mesh PBR Texture Super Resolution from 2D Image Priors

3 June 2025
Yujin Chen
Yinyu Nie
Benjamin Ummenhofer
Reiner Birkl
Michael Paulitsch
Matthias Nießner
Abstract

We present PBR-SR, a novel method for physically based rendering (PBR) texture super-resolution (SR). It outputs high-resolution, high-quality PBR textures from low-resolution (LR) PBR input in a zero-shot manner. PBR-SR leverages an off-the-shelf super-resolution model trained on natural images and iteratively minimizes the deviations between super-resolution priors and differentiable renderings. These enhancements are then back-projected into the PBR map space in a differentiable manner to produce refined, high-resolution textures. To mitigate view inconsistencies and lighting sensitivity, which are common in view-based super-resolution, our method applies 2D prior constraints across multi-view renderings, iteratively refining the shared, upscaled textures. In parallel, we incorporate identity constraints directly in the PBR texture domain to ensure the upscaled textures remain faithful to the LR input. PBR-SR operates without any additional training or data requirements, relying entirely on pretrained image priors. We demonstrate that our approach produces high-fidelity PBR textures for both artist-designed and AI-generated meshes, outperforming both direct application of SR models and prior texture optimization methods. Our results show high-quality outputs in both PBR and rendering evaluations, supporting advanced applications such as relighting.
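The zero-shot optimization loop described in the abstract can be illustrated with a short sketch. This is only an interpretation of the abstract, not the authors' released implementation: the differentiable renderer render(), the frozen 2D super-resolution model sr_model(), the camera views, the 4x upscaling factor, and the loss weight lambda_identity are all hypothetical placeholders, and the paper's back-projection and constraint details may differ.

import torch
import torch.nn.functional as F

def pbr_texture_sr(mesh, lr_pbr_maps, sr_model, render, views,
                   num_iters=500, lambda_identity=1.0):
    # lr_pbr_maps: dict of low-resolution PBR maps (e.g. albedo, roughness),
    # each a tensor of shape (1, C, H, W). All names here are illustrative.

    # Initialize high-resolution maps by upsampling the LR input and make
    # them the optimization variables.
    hr_pbr_maps = {
        name: F.interpolate(tex, scale_factor=4, mode="bilinear")
                .clone().requires_grad_(True)
        for name, tex in lr_pbr_maps.items()
    }
    optimizer = torch.optim.Adam(hr_pbr_maps.values(), lr=1e-2)

    for _ in range(num_iters):
        optimizer.zero_grad()
        loss = 0.0

        for view in views:
            # Differentiable rendering with the current HR textures.
            hr_render = render(mesh, hr_pbr_maps, view)

            # 2D prior: super-resolve the LR rendering with the frozen,
            # pretrained image SR model (no gradients through the prior).
            with torch.no_grad():
                lr_render = render(mesh, lr_pbr_maps, view)
                sr_target = sr_model(lr_render)

            # Multi-view prior constraint: match the SR target in image
            # space; gradients flow back into the shared HR textures.
            loss = loss + F.l1_loss(hr_render, sr_target)

        # Identity constraint in the PBR texture domain: downsampled HR maps
        # should stay faithful to the LR input.
        for name, tex in hr_pbr_maps.items():
            down = F.interpolate(tex, size=lr_pbr_maps[name].shape[-2:],
                                 mode="bilinear")
            loss = loss + lambda_identity * F.l1_loss(down, lr_pbr_maps[name])

        loss.backward()
        optimizer.step()

    return {name: tex.detach() for name, tex in hr_pbr_maps.items()}

Because the image-space loss is backpropagated through the differentiable renderer into the shared texture variables, the per-view SR enhancements are effectively fused into a single consistent set of HR PBR maps, which is how the sketch approximates the back-projection step described above.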

@article{chen2025_2506.02846,
  title={PBR-SR: Mesh PBR Texture Super Resolution from 2D Image Priors},
  author={Yujin Chen and Yinyu Nie and Benjamin Ummenhofer and Reiner Birkl and Michael Paulitsch and Matthias Nie{\ss}ner},
  journal={arXiv preprint arXiv:2506.02846},
  year={2025}
}