
Revise Saturated Activation Functions

18 February 2016
Bing Xu, Ruitong Huang, Mu Li
arXiv (abs) · PDF · HTML

Papers citing "Revise Saturated Activation Functions"

6 / 6 papers shown
MXNet: A Flexible and Efficient Machine Learning Library for Heterogeneous Distributed Systems
Tianqi Chen, Mu Li, Yutian Li, Min Lin, Naiyan Wang, Minjie Wang, Tianjun Xiao, Bing Xu, Chiyuan Zhang, Zheng Zhang
03 Dec 2015

Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)
Djork-Arné Clevert, Thomas Unterthiner, Sepp Hochreiter
23 Nov 2015

All you need is a good init
Dmytro Mishkin, Jirí Matas
19 Nov 2015

Empirical Evaluation of Rectified Activations in Convolutional Network
Bing Xu, Naiyan Wang, Tianqi Chen, Mu Li
05 May 2015

Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
Sergey Ioffe, Christian Szegedy
11 Feb 2015

Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification
Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun
06 Feb 2015