ErfAct and Pserf: Non-monotonic Smooth Trainable Activation Functions
arXiv:2109.04386 · 9 September 2021
Koushik Biswas, Sandeep Kumar, Shilpak Banerjee, A. Pandey
Papers citing "ErfAct and Pserf: Non-monotonic Smooth Trainable Activation Functions" (6 of 6 papers shown)

| Title | Authors | Date |
|---|---|---|
| Kolmogorov-Arnold Networks in Low-Data Regimes: A Comparative Study with Multilayer Perceptrons | Farhad Pourkamali-Anaraki | 16 Sep 2024 |
| Swish-T: Enhancing Swish Activation with Tanh Bias for Improved Neural Network Performance | Youngmin Seo, Jinha Kim, Unsang Park | 01 Jul 2024 |
| ErfReLU: Adaptive Activation Function for Deep Neural Network | Ashish Rajanand, Pradeep Singh | 02 Jun 2023 |
| SMU: smooth activation function for deep networks using smoothing maximum technique | Koushik Biswas, Sandeep Kumar, Shilpak Banerjee, A. Pandey | 08 Nov 2021 |
| SAU: Smooth activation function using convolution with approximate identities | Koushik Biswas, Sandeep Kumar, Shilpak Banerjee, A. Pandey | 27 Sep 2021 |
| Aggregated Residual Transformations for Deep Neural Networks | Saining Xie, Ross B. Girshick, Piotr Dollár, Z. Tu, Kaiming He | 16 Nov 2016 |