Comparison of non-linear activation functions for deep neural networks on MNIST classification task

8 April 2018
Dabal Pedamonti
arXiv: 1804.02763

Papers citing "Comparison of non-linear activation functions for deep neural networks on MNIST classification task"

14 / 14 papers shown
Efficient Quantum Convolutional Neural Networks for Image Classification: Overcoming Hardware Constraints
Peter Röseler, Oliver Schaudt, Helmut Berg, Christian Bauckhage, Matthias Koch
09 May 2025

Semi-Supervised Confidence-Level-based Contrastive Discrimination for Class-Imbalanced Semantic Segmentation
Kangcheng Liu
28 Nov 2022

How important are activation functions in regression and classification? A survey, performance comparison, and future directions
Ameya Dilip Jagtap, George Karniadakis
AI4CE
06 Sep 2022

Simple and complex spiking neurons: perspectives and analysis in a simple STDP scenario
D. L. Manna, Alex Vicente-Sola, Paul Kirkland, Trevor Bihl, G. D. Caterina
28 Jun 2022

CGAN-EB: A Non-parametric Empirical Bayes Method for Crash Hotspot Identification Using Conditional Generative Adversarial Networks: A Simulated Crash Data Study
M. Zarei, B. Hellinga, P. Izadpanah
CML
13 Dec 2021

Beyond Periodicity: Towards a Unifying Framework for Activations in Coordinate-MLPs
Sameera Ramasinghe, Simon Lucey
30 Nov 2021

Activation Functions in Deep Learning: A Comprehensive Survey and Benchmark
S. Dubey, S. Singh, B. B. Chaudhuri
29 Sep 2021

MILR: Mathematically Induced Layer Recovery for Plaintext Space Error Correction of CNNs
Jonathan Ponader, S. Kundu, Yan Solihin
28 Oct 2020

Effects of the Nonlinearity in Activation Functions on the Performance of Deep Learning Models
N. Kulathunga, N. R. Ranasinghe, D. Vrinceanu, Zackary Kinsman, Lei Huang, Yunjiao Wang
14 Oct 2020

Review: Deep Learning in Electron Microscopy
Jeffrey M. Ede
17 Sep 2020

SPLASH: Learnable Activation Functions for Improving Accuracy and Adversarial Robustness
Mohammadamin Tavakoli, Forest Agostinelli, Pierre Baldi
AAML, FAtt
16 Jun 2020

A survey on modern trainable activation functions
Andrea Apicella, Francesco Donnarumma, Francesco Isgrò, R. Prevete
02 May 2020

On the Impact of the Activation Function on Deep Neural Networks Training
Soufiane Hayou, Arnaud Doucet, Judith Rousseau
ODL
19 Feb 2019

A Methodology for Automatic Selection of Activation Functions to Design Hybrid Deep Neural Networks
Alberto Marchisio, Muhammad Abdullah Hanif, Semeen Rehman, Maurizio Martina, Muhammad Shafique
27 Oct 2018