E-swish: Adjusting Activations to Different Network Depths
Eric Alcaide, 22 January 2018 (arXiv:1801.07145)
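
For context, E-swish scales the Swish activation by a fixed constant beta, with the paper exploring values roughly between 1 and 2 and matching beta to network depth. A minimal NumPy sketch (beta = 1.5 is an illustrative choice, not the paper's single recommended value):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def e_swish(x, beta=1.5):
    # E-swish: beta * x * sigmoid(x). beta is a fixed hyperparameter,
    # not a trained parameter; beta = 1 recovers standard Swish-1.
    return beta * x * sigmoid(x)
```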
Papers citing "E-swish: Adjusting Activations to Different Network Depths" (7 of 7 papers shown):

1. Swish-T: Enhancing Swish Activation with Tanh Bias for Improved Neural Network Performance
   Youngmin Seo, Jinha Kim, Unsang Park. 01 Jul 2024. Citations: 0
2. Nonlinearity Enhanced Adaptive Activation Functions
   David Yevick. 29 Mar 2024. Citations: 1
3. How important are activation functions in regression and classification? A survey, performance comparison, and future directions
   Ameya Dilip Jagtap, George Karniadakis. 06 Sep 2022. Citations: 71
4. Activation Functions in Deep Learning: A Comprehensive Survey and Benchmark
   S. Dubey, S. Singh, B. B. Chaudhuri. 29 Sep 2021. Citations: 643
5. A survey on modern trainable activation functions
   Andrea Apicella, Francesco Donnarumma, Francesco Isgrò, R. Prevete. 02 May 2020. Citations: 366
6. Flatten-T Swish: a thresholded ReLU-Swish-like activation function for deep learning
   Hock Hung Chieng, Noorhaniza Wahid, P. Ong, Sai Raj Kishore Perla. 15 Dec 2018. Citations: 41
7. Adaptive Blending Units: Trainable Activation Functions for Deep Neural Networks
   L. R. Sütfeld, Flemming Brieger, Holger Finger, S. Füllhase, G. Pipa. 26 Jun 2018. Citations: 28
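
Several of the citing papers above propose related Swish-style variants. As one example of the family, here is a minimal sketch of Flatten-T Swish (Chieng et al., entry 6), assuming the form usually quoted for it: x * sigmoid(x) shifted by a threshold T on the non-negative side and flattened to the constant T elsewhere, with T = -0.20 as the commonly cited default:

```python
import numpy as np

def flatten_t_swish(x, T=-0.20):
    # Flatten-T Swish: x * sigmoid(x) + T for x >= 0, and the constant T
    # for x < 0 (a ReLU-like cutoff with a Swish-like positive branch).
    # T = -0.20 is assumed here as the paper's default threshold.
    x = np.asarray(x, dtype=float)
    out = np.full_like(x, T)
    pos = x >= 0.0
    out[pos] = x[pos] / (1.0 + np.exp(-x[pos])) + T
    return out
```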