ALReLU: A different approach on Leaky ReLU activation function to improve Neural Networks Performance
S. Mastromichalakis
arXiv:2012.07564, 11 December 2020
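For context on the method these papers cite, below is a minimal NumPy sketch, assuming the definition implied by the title: ALReLU keeps Leaky ReLU's identity branch for non-negative inputs but takes the absolute value of the negative branch (slope alpha, conventionally 0.01). This is an illustrative sketch under that assumption, not the paper's reference implementation.

import numpy as np

def leaky_relu(x, alpha=0.01):
    # Standard Leaky ReLU: f(x) = x for x >= 0, alpha * x otherwise.
    return np.where(x >= 0, x, alpha * x)

def alrelu(x, alpha=0.01):
    # ALReLU sketch: same positive branch, but the negative branch
    # becomes |alpha * x|, so negative inputs map to small positive values.
    return np.where(x >= 0, x, np.abs(alpha * x))

x = np.array([-3.0, -0.5, 0.0, 0.5, 3.0])
print(leaky_relu(x))  # approx. [-0.03, -0.005, 0.0, 0.5, 3.0]
print(alrelu(x))      # approx. [ 0.03,  0.005, 0.0, 0.5, 3.0]

Under this reading, the sign flip keeps the negative-side gradient nonzero (like Leaky ReLU) while the activation itself stays non-negative, which appears to be the intended difference from standard Leaky ReLU.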
Papers citing "ALReLU: A different approach on Leaky ReLU activation function to improve Neural Networks Performance" (6 of 6 papers shown)
1. Pioneering Precision in Lumbar Spine MRI Segmentation with Advanced Deep Learning and Data Enhancement
   Istiak Ahmed, Md. Tanzim Hossain, Md. Zahirul Islam Nahid, Kazi Shahriar Sanjid, Md. Shakib Shahariar Junayed, M. M. Uddin, Mohammad Monirujjaman Khan
   09 Sep 2024

2. RepAct: The Re-parameterizable Adaptive Activation Function
   Xian Wu, Qingchuan Tao, Shuang Wang
   28 Jun 2024

3. Deep Multi-Task Learning for Malware Image Classification
   A. Bensaoud, Jugal Kalita
   09 May 2024

4. Homotopy Relaxation Training Algorithms for Infinite-Width Two-Layer ReLU Neural Networks
   Yahong Yang, Qipin Chen, Wenrui Hao
   26 Sep 2023

5. Estimating fire Duration using regression methods
   Hansong Xiao
   17 Aug 2023

6. Parametric Leaky Tanh: A New Hybrid Activation Function for Deep Learning
   S. Mastromichalakis
   11 Aug 2023