Improving Deep Neural Network with Multiple Parametric Exponential Linear Units
Yang Li, Chunxiao Fan, Yong Li, Qiong Wu, Yue Ming
arXiv:1606.00305, 1 June 2016
Papers citing "Improving Deep Neural Network with Multiple Parametric Exponential Linear Units" (5 papers):
How important are activation functions in regression and classification? A survey, performance comparison, and future directions
Ameya Dilip Jagtap, George Karniadakis (06 Sep 2022)

Activation Functions in Deep Learning: A Comprehensive Survey and Benchmark
S. Dubey, S. Singh, B. B. Chaudhuri (29 Sep 2021)

L*ReLU: Piece-wise Linear Activation Functions for Deep Fine-grained Visual Categorization
Mina Basirat, P. Roth (27 Oct 2019)

Natural-Logarithm-Rectified Activation Function in Convolutional Neural Networks
Yang Liu, Jianpeng Zhang, Chao Gao, Jinghua Qu, Lixin Ji (10 Aug 2019)

The History Began from AlexNet: A Comprehensive Survey on Deep Learning Approaches
Md. Zahangir Alom, T. Taha, C. Yakopcic, Stefan Westberg, P. Sidike, Mst Shamima Nasrin, B. Van Essen, A. Awwal, V. Asari (03 Mar 2018)