On Approximation Capabilities of ReLU Activation and Softmax Output Layer in Neural Networks

10 February 2020
Behnam Asadi, Hui Jiang
Links: ArXiv (abs) · PDF · HTML
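
For context, the two components named in the title are standard building blocks: ReLU is the elementwise activation max(0, x), and a softmax output layer normalizes a vector of logits into a probability distribution. The sketch below is a minimal illustration of a one-hidden-layer ReLU network with a softmax output, the general architecture the paper's approximation results concern; the dimensions and random weights are purely illustrative assumptions, not the authors' code.

import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    """Rectified linear unit, applied elementwise: max(0, x)."""
    return np.maximum(0.0, x)

def softmax(z: np.ndarray) -> np.ndarray:
    """Softmax output layer: maps logits to a probability distribution.
    Subtracting max(z) keeps exp() numerically stable."""
    e = np.exp(z - np.max(z))
    return e / e.sum()

# Toy one-hidden-layer ReLU network with softmax output:
# probs = softmax(W2 @ relu(W1 @ x + b1) + b2).
rng = np.random.default_rng(0)
x = rng.normal(size=4)                       # illustrative 4-dim input
W1, b1 = rng.normal(size=(8, 4)), np.zeros(8)   # hidden layer, 8 ReLU units
W2, b2 = rng.normal(size=(3, 8)), np.zeros(3)   # output layer, 3 classes
probs = softmax(W2 @ relu(W1 @ x + b1) + b2)
print(probs, probs.sum())  # non-negative entries summing to 1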

Papers citing "On Approximation Capabilities of ReLU Activation and Softmax Output Layer in Neural Networks"

1 / 1 papers shown
Title: Understanding Deep Neural Networks with Rectified Linear Units
Authors: R. Arora, A. Basu, Poorya Mianjy, Anirbit Mukherjee
Topic tag: PINN
Published: 04 Nov 2016