ReCA: A Parametric ReLU Composite Activation Function

Main: 8 pages, 5 figures, 10 tables; Bibliography: 1 page; Appendix: 1 page
Abstract

Activation functions have been shown to significantly affect the performance of deep neural networks. While the Rectified Linear Unit (ReLU) remains the dominant choice in practice, the optimal activation function for deep neural networks is still an open research question. In this paper, we propose ReCA, a novel parametric activation function based on ReLU, and show that it outperforms all baselines on state-of-the-art datasets across several complex neural network architectures.
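The exact functional form of ReCA is not given in this abstract. For context, the sketch below implements the classic parametric ReLU (PReLU), the family of ReLU-based parametric activations that ReCA builds on; the function name `prelu` and the slope parameter `alpha` are illustrative, not taken from the paper:

```python
import numpy as np

def prelu(x: np.ndarray, alpha: float = 0.25) -> np.ndarray:
    """Parametric ReLU: identity for x >= 0, learnable slope alpha for x < 0.

    In training, alpha is a per-channel parameter updated by gradient
    descent; here it is a fixed scalar for illustration.
    """
    return np.where(x >= 0, x, alpha * x)

# Negative inputs are scaled by alpha instead of being zeroed as in plain ReLU.
x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(prelu(x))  # -> [-0.5   -0.125  0.     1.     3.   ]
```

A composite parametric activation in this spirit would combine several such ReLU-based terms, with the mixing coefficients learned jointly with the network weights.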

@article{chidiac2025_2504.08994,
  title={ReCA: A Parametric ReLU Composite Activation Function},
  author={John Chidiac and Danielle Azar},
  journal={arXiv preprint arXiv:2504.08994},
  year={2025}
}
