ResearchTrend.AI

Learning Activation Functions: A new paradigm for understanding Neural Networks

Mohit Goyal, R. Goyal, Brejesh Lall
arXiv:1906.09529, 23 June 2019
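As an illustrative sketch of the idea behind learnable activation functions (a generic toy example, not necessarily the paper's exact formulation), an activation can be parameterized as a polynomial whose coefficients are trained by gradient descent:

```python
# Illustrative sketch only: a learnable activation parameterized as a
# trainable polynomial a0 + a1*x + a2*x^2, fit by gradient descent.
# This is a generic toy example of the concept, not the paper's method.

def poly_act(coeffs, x):
    """Evaluate the learnable activation at x."""
    return sum(c * x ** k for k, c in enumerate(coeffs))

def fit(coeffs, xs, ys, lr=0.05, steps=5000):
    """Least-squares fit of the activation's coefficients to (xs, ys)."""
    n = len(xs)
    for _ in range(steps):
        grads = [0.0] * len(coeffs)
        for x, y in zip(xs, ys):
            err = poly_act(coeffs, x) - y
            for k in range(len(coeffs)):
                grads[k] += 2.0 * err * x ** k / n
        coeffs = [c - lr * g for c, g in zip(coeffs, grads)]
    return coeffs

# Toy target: approximate ReLU on [-1, 1] with a degree-2 polynomial.
xs = [i / 10 for i in range(-10, 11)]
ys = [max(0.0, x) for x in xs]
coeffs = fit([0.0, 0.0, 0.0], xs, ys)
```

In a network, such coefficients would be trained jointly with the weights, letting each layer discover its own nonlinearity rather than fixing one in advance.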

Papers citing "Learning Activation Functions: A new paradigm for understanding Neural Networks"

9 / 9 papers shown

1. Learnable polynomial, trigonometric, and tropical activations
   Ismail Khalfaoui-Hassani, Stefan Kesselheim
   03 Feb 2025, 0 citations

2. KAN: Kolmogorov-Arnold Networks
   Ziming Liu, Yixuan Wang, Sachin Vaidya, Fabian Ruehle, James Halverson, Marin Soljacic, Thomas Y. Hou, Max Tegmark
   30 Apr 2024, 475 citations

3. Learning Specialized Activation Functions for Physics-informed Neural Networks
   Honghui Wang, Lu Lu, Shiji Song, Gao Huang (PINN, AI4CE)
   08 Aug 2023, 11 citations

4. Efficient Activation Function Optimization through Surrogate Modeling
   G. Bingham, Risto Miikkulainen
   13 Jan 2023, 2 citations

5. How important are activation functions in regression and classification? A survey, performance comparison, and future directions
   Ameya Dilip Jagtap, George Karniadakis (AI4CE)
   06 Sep 2022, 71 citations

6. Activation Functions in Deep Learning: A Comprehensive Survey and Benchmark
   S. Dubey, S. Singh, B. B. Chaudhuri
   29 Sep 2021, 641 citations

7. The Representation Theory of Neural Networks
   M. Armenta, Pierre-Marc Jodoin
   23 Jul 2020, 30 citations

8. Padé Activation Units: End-to-end Learning of Flexible Activation Functions in Deep Networks
   Alejandro Molina, P. Schramowski, Kristian Kersting (ODL)
   15 Jul 2019, 77 citations

9. The Loss Surfaces of Multilayer Networks
   A. Choromańska, Mikael Henaff, Michaël Mathieu, Gerard Ben Arous, Yann LeCun (ODL)
   30 Nov 2014, 1,185 citations