
A Method on Searching Better Activation Functions

19 May 2024
Haoyuan Sun, Zihao Wu, Bo Xia, Pu Chang, Zibin Dong, Yifu Yuan, Yongzhe Chang, Xueqian Wang
arXiv: 2405.12954 (PDF / HTML)

Papers citing "A Method on Searching Better Activation Functions"

3 / 3 papers shown
Zorro: A Flexible and Differentiable Parametric Family of Activation Functions That Extends ReLU and GELU
Matias Roodschild, Jorge Gotay-Sardiñas, V. Jimenez, A. Will
28 Sep 2024
Activation function optimization method: Learnable series linear units (LSLUs)
Chuan Feng, Xi Lin, Shiping Zhu, Hongkang Shi, Maojie Tang, Hua Huang
28 Aug 2024
Why Rectified Power Unit Networks Fail and How to Improve It: An Effective Theory Perspective
Taeyoung Kim, Myungjoo Kang
04 Aug 2024