
BandRC: Band Shifted Raised Cosine Activated Implicit Neural Representations

Abstract

In recent years, implicit neural representations (INRs) have gained popularity in the computer vision community, mainly due to their strong performance on many computer vision tasks. These networks can extract a continuous signal representation from a discrete one. Previous studies have repeatedly shown that INR performance correlates strongly with the activation functions used in its multilayer perceptrons. Although numerous activation functions have been proposed that are competitive with one another, they share a common set of challenges: spectral bias (a lack of sensitivity to high-frequency content in signals), limited robustness to signal noise, difficulty in simultaneously capturing both local and global features, and the need for manual parameter tuning. To address these issues, we introduce a novel activation function, Band Shifted Raised Cosine Activated Implicit Neural Representations (BandRC), tailored to further enhance signal representation capacity. We also incorporate deep prior knowledge extracted from the signal to adjust the activation functions through a task-specific model. Through a mathematical analysis and a series of experiments, including image reconstruction (with a +8.93 dB PSNR improvement over the nearest counterpart), denoising (with a +0.46 dB increase in PSNR), super-resolution (with a +1.03 dB improvement over the nearest state-of-the-art (SOTA) method for 6X super-resolution), inpainting, and 3D shape reconstruction, we demonstrate the dominance of BandRC over existing state-of-the-art activation functions.
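To make the idea concrete, the sketch below shows a coordinate MLP (INR) whose activations are raised-cosine pulses modulated to a higher frequency band. This is only an illustrative guess at what a "band shifted raised cosine" activation could look like: the abstract does not give BandRC's formula, and the parameter names (f0, beta, T), their learnability, and the network layout are all assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a raised-cosine style activation inside an INR.
# Assumption: the activation is the classical time-domain raised-cosine pulse
# (roll-off beta, period T) multiplied by a cosine carrier at band-shift f0.
import math
import torch
import torch.nn as nn

class RaisedCosineActivation(nn.Module):
    def __init__(self, f0=5.0, beta=0.5, T=1.0):
        super().__init__()
        # Keep the band-shift frequency, roll-off, and period learnable so the
        # layer can adapt its frequency support (an assumption on our part).
        self.f0 = nn.Parameter(torch.tensor(float(f0)))
        self.beta = nn.Parameter(torch.tensor(float(beta)))
        self.T = nn.Parameter(torch.tensor(float(T)))

    def forward(self, x):
        # Raised-cosine pulse: sinc(x/T) * cos(pi*beta*x/T) / (1 - (2*beta*x/T)^2),
        # with the denominator clamped away from zero for numerical stability.
        denom = 1.0 - (2.0 * self.beta * x / self.T) ** 2
        denom = torch.where(denom.abs() < 1e-6, torch.full_like(denom, 1e-6), denom)
        pulse = torch.sinc(x / self.T) * torch.cos(math.pi * self.beta * x / self.T) / denom
        # Cosine carrier shifts the pulse's spectrum to a higher band.
        return pulse * torch.cos(2.0 * math.pi * self.f0 * x)

class INR(nn.Module):
    """Small coordinate MLP, e.g. (x, y) -> RGB, using the assumed activation."""
    def __init__(self, in_dim=2, hidden=256, out_dim=3, layers=3):
        super().__init__()
        blocks, d = [], in_dim
        for _ in range(layers):
            blocks += [nn.Linear(d, hidden), RaisedCosineActivation()]
            d = hidden
        blocks.append(nn.Linear(d, out_dim))
        self.net = nn.Sequential(*blocks)

    def forward(self, coords):
        return self.net(coords)

# Usage: a standard INR fitting setup, regressing pixel colours from coordinates.
model = INR()
coords = torch.rand(1024, 2) * 2.0 - 1.0   # coordinates in [-1, 1]^2
pred_rgb = model(coords)                   # (1024, 3) predicted colours
```

In this sketch the per-layer frequency parameters play the role that fixed frequency hyperparameters play in other INR activations; the paper's task-specific adjustment from signal priors is not reproduced here.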

@article{thennakoon2025_2505.11640,
  title={BandRC: Band Shifted Raised Cosine Activated Implicit Neural Representations},
  author={Pandula Thennakoon and Avishka Ranasinghe and Mario De Silva and Buwaneka Epakanda and Roshan Godaliyadda and Parakrama Ekanayake and Vijitha Herath},
  journal={arXiv preprint arXiv:2505.11640},
  year={2025}
}