Flatten-T Swish: a thresholded ReLU-Swish-like activation function for deep learning

arXiv:1812.06247 · 15 December 2018
Hock Hung Chieng, Noorhaniza Wahid, P. Ong, Sai Raj Kishore Perla
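Since this page carries only the paper's metadata, a minimal NumPy sketch of the Flatten-T Swish (FTS) activation may help: for non-negative inputs FTS behaves like Swish (x * sigmoid(x)) shifted by a threshold T, and for negative inputs it flattens to the constant T. The function name below and the default T = -0.20 (the value the paper reportedly found to work best) are assumptions, not an official reference implementation.

    import numpy as np

    def flatten_t_swish(x: np.ndarray, T: float = -0.20) -> np.ndarray:
        """Flatten-T Swish: x * sigmoid(x) + T for x >= 0, else T.

        T is a (typically small negative) threshold; T = 0 recovers a
        ReLU-gated Swish. The default T = -0.20 follows the value
        reported in the paper (treat it as an assumption).
        """
        swish = x / (1.0 + np.exp(-x))           # x * sigmoid(x)
        return np.where(x >= 0.0, swish + T, T)  # flatten negatives to T

    # Example: FTS(0) = T, and FTS(x) approaches x + T for large x.
    print(flatten_t_swish(np.array([-3.0, -1.0, 0.0, 1.0, 5.0])))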

Papers citing "Flatten-T Swish: a thresholded ReLU-Swish-like activation function for deep learning"

4 papers shown.
Bayesian optimization for sparse neural networks with trainable activation functions
M. Fakhfakh, Lotfi Chaari (10 Apr 2023)
Activation Functions in Deep Learning: A Comprehensive Survey and Benchmark
S. Dubey, S. Singh, B. B. Chaudhuri (29 Sep 2021)
Parametric Flatten-T Swish: An Adaptive Non-linear Activation Function For Deep Learning
Hock Hung Chieng, Noorhaniza Wahid, P. Ong (06 Nov 2020)
A survey on modern trainable activation functions
Andrea Apicella, Francesco Donnarumma, Francesco Isgrò, R. Prevete (02 May 2020)