ResearchTrend.AI

ErfAct and Pserf: Non-monotonic Smooth Trainable Activation Functions

9 September 2021
Koushik Biswas, Sandeep Kumar, Shilpak Banerjee, A. Pandey
arXiv · PDF · HTML
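As an illustration of what a smooth, non-monotonic, trainable activation of this kind can look like, here is a minimal sketch. The specific forms assumed below, ErfAct(x) = x·erf(α·e^(βx)) and Pserf(x) = x·erf(γ·softplus(δx)) with trainable scalars α, β, γ, δ, should be checked against the paper itself; the default parameter values here are arbitrary placeholders, not the paper's initializations.

```python
import math

def softplus(x: float) -> float:
    """Numerically stable log(1 + e^x)."""
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))

def erfact(x: float, alpha: float = 1.0, beta: float = 1.0) -> float:
    """ErfAct-style activation: x * erf(alpha * e^(beta * x)).

    alpha and beta are trainable scalars in such activations; the
    defaults here are illustrative placeholders only.
    """
    # Clamp the exponent so math.exp cannot overflow; erf saturates at 1 anyway.
    return x * math.erf(alpha * math.exp(min(beta * x, 700.0)))

def pserf(x: float, gamma: float = 1.0, delta: float = 1.0) -> float:
    """Pserf-style activation: x * erf(gamma * softplus(delta * x))."""
    return x * math.erf(gamma * softplus(delta * x))
```

Both functions are smooth, pass through the origin, approach the identity for large positive inputs (erf saturates at 1), and take small negative values for negative inputs, which makes them non-monotonic, similar in spirit to Swish or GELU. In a deep-learning framework, the scalar parameters would be registered as learnable parameters and trained by backpropagation.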

Papers citing "ErfAct and Pserf: Non-monotonic Smooth Trainable Activation Functions"

6 / 6 papers shown

| Title | Authors | Metrics | Date |
| --- | --- | --- | --- |
| Kolmogorov-Arnold Networks in Low-Data Regimes: A Comparative Study with Multilayer Perceptrons | Farhad Pourkamali-Anaraki | 38 / 5 / 0 | 16 Sep 2024 |
| Swish-T: Enhancing Swish Activation with Tanh Bias for Improved Neural Network Performance | Youngmin Seo, Jinha Kim, Unsang Park | 28 / 0 / 0 | 01 Jul 2024 |
| ErfReLU: Adaptive Activation Function for Deep Neural Network | Ashish Rajanand, Pradeep Singh | 15 / 11 / 0 | 02 Jun 2023 |
| SMU: smooth activation function for deep networks using smoothing maximum technique | Koushik Biswas, Sandeep Kumar, Shilpak Banerjee, A. Pandey | 30 / 32 / 0 | 08 Nov 2021 |
| SAU: Smooth activation function using convolution with approximate identities | Koushik Biswas, Sandeep Kumar, Shilpak Banerjee, A. Pandey | 21 / 6 / 0 | 27 Sep 2021 |
| Aggregated Residual Transformations for Deep Neural Networks | Saining Xie, Ross B. Girshick, Piotr Dollár, Z. Tu, Kaiming He | 297 / 10,225 / 0 | 16 Nov 2016 |