Cross-Frequency Implicit Neural Representation with Self-Evolving Parameters

15 April 2025
Chang Yu
Yisi Luo
Kai Ye
Xile Zhao
Deyu Meng
Abstract

Implicit neural representation (INR) has emerged as a powerful paradigm for visual data representation. However, classical INR methods represent data in the original space, where different frequency components are mixed together, and several feature encoding parameters (e.g., the frequency parameter ω or the rank R) require manual configuration. In this work, we propose a self-evolving cross-frequency INR using the Haar wavelet transform (termed CF-INR), which decouples data into four frequency components and employs INRs in the wavelet space. CF-INR characterizes the different frequency components separately, enabling higher accuracy for data representation. To characterize cross-frequency components more precisely, we propose a cross-frequency tensor decomposition paradigm for CF-INR with self-evolving parameters, which automatically updates the rank parameter R and the frequency parameter ω for each frequency component through self-evolving optimization. This self-evolution paradigm eliminates laborious manual tuning of these parameters and learns a customized cross-frequency feature encoding configuration for each dataset. We evaluate CF-INR on a variety of visual data representation and recovery tasks, including image regression, inpainting, denoising, and cloud removal. Extensive experiments demonstrate that CF-INR outperforms state-of-the-art methods in each case.
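The two core ideas of the abstract can be illustrated with a short sketch. The PyTorch snippet below (not the authors' code) shows (1) a one-level 2D Haar transform that decouples an image into four frequency subbands, and (2) one small SIREN-style INR per subband, each with its own frequency parameter ω. All names (haar_dwt2, SubbandINR, coord_grid), the network sizes, and the choice to update ω by plain gradient descent are illustrative assumptions; the paper's cross-frequency tensor decomposition and its self-evolving optimization of the rank R and ω are not reproduced here.

import torch
import torch.nn as nn


def haar_dwt2(x):
    """One-level orthonormal 2D Haar transform.
    x: (H, W) tensor with even H and W; returns four (H/2, W/2) subbands."""
    a, b = x[0::2, 0::2], x[0::2, 1::2]   # top-left / top-right of each 2x2 block
    c, d = x[1::2, 0::2], x[1::2, 1::2]   # bottom-left / bottom-right
    ll = (a + b + c + d) / 2  # low-pass along both axes (coarse approximation)
    lh = (a - b + c - d) / 2  # high-pass across columns, low-pass across rows
    hl = (a + b - c - d) / 2  # low-pass across columns, high-pass across rows
    hh = (a - b - c + d) / 2  # high-pass along both axes (diagonal detail)
    return ll, lh, hl, hh


class SubbandINR(nn.Module):
    """A small SIREN-style MLP for one subband. Here omega is an ordinary
    trainable parameter, a simplified stand-in for the paper's
    self-evolving frequency update."""
    def __init__(self, hidden=64, omega_init=30.0):
        super().__init__()
        self.omega = nn.Parameter(torch.tensor(omega_init))
        self.fc1 = nn.Linear(2, hidden)
        self.fc2 = nn.Linear(hidden, hidden)
        self.out = nn.Linear(hidden, 1)

    def forward(self, coords):  # coords: (N, 2) in [-1, 1]
        h = torch.sin(self.omega * self.fc1(coords))
        h = torch.sin(self.omega * self.fc2(h))
        return self.out(h)


def coord_grid(h, w):
    """Normalized (N, 2) coordinate grid covering [-1, 1]^2."""
    ys = torch.linspace(-1, 1, h)
    xs = torch.linspace(-1, 1, w)
    gy, gx = torch.meshgrid(ys, xs, indexing="ij")
    return torch.stack([gy, gx], dim=-1).reshape(-1, 2)


# Smooth synthetic test image (stand-in for real visual data).
gy, gx = torch.meshgrid(torch.linspace(0, 3.0, 64),
                        torch.linspace(0, 3.0, 64), indexing="ij")
img = torch.sin(4 * gy) * torch.cos(7 * gx)

subbands = haar_dwt2(img)                      # LL, LH, HL, HH
models = [SubbandINR() for _ in subbands]      # one INR per frequency component
opt = torch.optim.Adam([p for m in models for p in m.parameters()], lr=1e-3)

coords = coord_grid(32, 32)
for step in range(500):
    opt.zero_grad()
    loss = sum(((m(coords).reshape(32, 32) - s) ** 2).mean()
               for m, s in zip(models, subbands))
    loss.backward()   # gradients flow into each omega as well as the weights
    opt.step()

Fitting the four subbands with separate networks lets each one settle on a frequency scale suited to its component (low for LL, higher for the detail bands), which is the intuition behind assigning a distinct ω to each frequency component.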

@article{yu2025_2504.10929,
  title={Cross-Frequency Implicit Neural Representation with Self-Evolving Parameters},
  author={Chang Yu and Yisi Luo and Kai Ye and Xile Zhao and Deyu Meng},
  journal={arXiv preprint arXiv:2504.10929},
  year={2025}
}