ALReLU: A different approach on Leaky ReLU activation function to improve Neural Networks Performance

11 December 2020
S. Mastromichalakis
arXiv:2012.07564
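
The listing does not reproduce the paper itself, but the activation it names is simple to state. Below is a minimal NumPy sketch contrasting standard Leaky ReLU with ALReLU as the title suggests (an absolute-valued variant of Leaky ReLU's negative branch); the slope value alpha=0.01 and the function names are illustrative assumptions, not taken from this page.

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """Standard Leaky ReLU: a small negative slope instead of a hard zero."""
    return np.where(x >= 0, x, alpha * x)

def alrelu(x, alpha=0.01):
    """Sketch of ALReLU (assumption based on the paper's title): take the
    absolute value of Leaky ReLU's negative branch, so negative inputs map
    to small positive outputs rather than small negative ones."""
    return np.where(x >= 0, x, np.abs(alpha * x))

# Quick comparison on a few sample inputs.
x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(leaky_relu(x))  # negative inputs give -0.02 and -0.005
print(alrelu(x))      # negative inputs give +0.02 and +0.005
```

The two functions agree everywhere except on negative inputs, where ALReLU's output is reflected above zero; positive inputs pass through unchanged in both cases.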

Papers citing "ALReLU: A different approach on Leaky ReLU activation function to improve Neural Networks Performance"

Pioneering Precision in Lumbar Spine MRI Segmentation with Advanced Deep Learning and Data Enhancement
Istiak Ahmed, Md. Tanzim Hossain, Md. Zahirul Islam Nahid, Kazi Shahriar Sanjid, Md. Shakib Shahariar Junayed, M. M. Uddin, Mohammad Monirujjaman Khan
09 Sep 2024

RepAct: The Re-parameterizable Adaptive Activation Function
Xian Wu, Qingchuan Tao, Shuang Wang
28 Jun 2024

Deep Multi-Task Learning for Malware Image Classification
A. Bensaoud, Jugal Kalita
09 May 2024

Homotopy Relaxation Training Algorithms for Infinite-Width Two-Layer ReLU Neural Networks
Yahong Yang, Qipin Chen, Wenrui Hao
26 Sep 2023

Estimating fire Duration using regression methods
Hansong Xiao
17 Aug 2023

Parametric Leaky Tanh: A New Hybrid Activation Function for Deep Learning
S. Mastromichalakis
11 Aug 2023