Deep Network Approximation: Beyond ReLU to Diverse Activation Functions

13 July 2023
Shijun Zhang, Jianfeng Lu, Hongkai Zhao
ArXiv · PDF · HTML

Papers citing "Deep Network Approximation: Beyond ReLU to Diverse Activation Functions"

5 / 5 papers shown
Super-fast rates of convergence for Neural Networks Classifiers under the Hard Margin Condition
Nathanael Tepakbong, Ding-Xuan Zhou, Xiang Zhou
13 May 2025

Don't Fear Peculiar Activation Functions: EUAF and Beyond
Qianchao Wang, Shijun Zhang, Dong Zeng, Zhaoheng Xie, Hengtao Guo, Feng-Lei Fan, Tieyong Zeng
12 Jul 2024

Spatially Optimized Compact Deep Metric Learning Model for Similarity Search
Md. Farhadul Islam, Md. Tanzim Reza, Meem Arafat Manab, Mohammad Rakibul Hasan Mahin, Sarah Zabeen, Jannatun Noor
09 Apr 2024

Neural Network Architecture Beyond Width and Depth
Zuowei Shen, Haizhao Yang, Shijun Zhang
19 May 2022

Optimal Approximation Rate of ReLU Networks in terms of Width and Depth
Zuowei Shen, Haizhao Yang, Shijun Zhang
28 Feb 2021