MGHF: Multi-Granular High-Frequency Perceptual Loss for Image Super-Resolution

Main: 26 pages, 8 figures, 3 tables
Abstract

While different variants of perceptual losses have been employed in the super-resolution literature to synthesize more realistic, appealing, and detailed high-resolution images, most are based on convolutional neural networks (CNNs), which lose information during guidance and often rely on complicated architectures and training procedures. To overcome these issues, we propose a naive invertible neural network (INN)-based Multi-Granular High-Frequency (MGHF-n) perceptual loss trained on ImageNet. Furthermore, we develop a comprehensive framework (MGHF-c) with several constraints to preserve, prioritize, and regularize information across multiple perspectives: texture and style preservation, content preservation, regional detail preservation, and joint content-style regularization. Information is prioritized through adaptive entropy-based pruning and reweighting of INN features. We utilize Gram matrix loss for style preservation and mean-squared error loss for content preservation. Additionally, we propose content-style consistency through a correlation loss that regulates unnecessary texture generation while preserving content information. Since small image regions may contain intricate details, we employ modulated PatchNCE on the INN features as a local information preservation objective. Extensive experiments on various super-resolution algorithms, including GAN- and diffusion-based methods, demonstrate that our MGHF framework significantly improves performance. Our code will be released in a public repository after the review process.
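For intuition, below is a minimal sketch of how the style (Gram matrix), content (MSE), and content-style correlation terms described in the abstract could be combined over multi-level features. This is not the authors' released implementation: the feature extractor, loss weights, and helper names are hypothetical, and the correlation term is written here as a Pearson-correlation penalty on flattened features.

# Minimal sketch (hypothetical, not the authors' code) of combining the
# loss terms named in the abstract over multi-granular feature maps.
import torch
import torch.nn.functional as F

def gram_matrix(feat: torch.Tensor) -> torch.Tensor:
    """Channel-wise Gram matrix of a (B, C, H, W) feature map."""
    b, c, h, w = feat.shape
    f = feat.reshape(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def mghf_like_loss(feats_sr, feats_hr, w_style=1.0, w_content=1.0, w_corr=0.1):
    """Combine style (Gram), content (MSE), and a correlation-based
    content-style consistency term.

    feats_sr / feats_hr: lists of (B, C, H, W) feature maps from a
    feature extractor (an INN in the paper), one per granularity level.
    The weights w_* are placeholders, not the paper's values.
    """
    style = content = corr = 0.0
    for fs, fh in zip(feats_sr, feats_hr):
        # Style preservation: match second-order feature statistics.
        style = style + F.mse_loss(gram_matrix(fs), gram_matrix(fh))
        # Content preservation: match the features directly.
        content = content + F.mse_loss(fs, fh)
        # Content-style consistency: penalize decorrelation between the
        # flattened SR and HR features (cosine of centered vectors is
        # the Pearson correlation).
        fs_v = fs.flatten(1) - fs.flatten(1).mean(dim=1, keepdim=True)
        fh_v = fh.flatten(1) - fh.flatten(1).mean(dim=1, keepdim=True)
        corr = corr + (1.0 - F.cosine_similarity(fs_v, fh_v, dim=1)).mean()
    return w_style * style + w_content * content + w_corr * corr

The entropy-based pruning/reweighting and the modulated PatchNCE objective from the abstract are omitted here; the sketch only illustrates how per-level style, content, and correlation terms would be accumulated and weighted.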

@article{sami2025_2411.13548,
  title={MGHF: Multi-Granular High-Frequency Perceptual Loss for Image Super-Resolution},
  author={Shoaib Meraj Sami and Md Mahedi Hasan and Mohammad Saeed Ebrahimi Saadabadi and Jeremy Dawson and Nasser Nasrabadi and Raghuveer Rao},
  journal={arXiv preprint arXiv:2411.13548},
  year={2025}
}