arXiv: 1812.06247
Flatten-T Swish: a thresholded ReLU-Swish-like activation function for deep learning
15 December 2018
Hock Hung Chieng, Noorhaniza Wahid, P. Ong, Sai Raj Kishore Perla
Papers citing "Flatten-T Swish: a thresholded ReLU-Swish-like activation function for deep learning" (4 of 4 papers shown)
| Title | Authors | Counts | Date |
|---|---|---|---|
| Bayesian optimization for sparse neural networks with trainable activation functions | M. Fakhfakh, Lotfi Chaari | 17 / 2 / 0 | 10 Apr 2023 |
| Activation Functions in Deep Learning: A Comprehensive Survey and Benchmark | S. Dubey, S. Singh, B. B. Chaudhuri | 41 / 641 / 0 | 29 Sep 2021 |
| Parametric Flatten-T Swish: An Adaptive Non-linear Activation Function For Deep Learning | Hock Hung Chieng, Noorhaniza Wahid, P. Ong | 21 / 6 / 0 | 06 Nov 2020 |
| A survey on modern trainable activation functions | Andrea Apicella, Francesco Donnarumma, Francesco Isgrò, R. Prevete | 31 / 365 / 0 | 02 May 2020 |