TaLU: A Hybrid Activation Function Combining Tanh and Rectified Linear Unit to Enhance Neural Networks
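As the title indicates, TaLU combines Tanh with the Rectified Linear Unit. Below is a minimal NumPy sketch of one plausible piecewise form, assuming the identity (ReLU-like) branch for non-negative inputs and tanh for negative inputs; the function name talu and the threshold parameter are illustrative assumptions, not the paper's exact formulation.

    import numpy as np

    def talu(x, threshold=0.0):
        """Hypothetical hybrid activation: identity above the threshold,
        tanh below it, giving bounded negative outputs like Tanh and
        unbounded positive outputs like ReLU."""
        x = np.asarray(x, dtype=float)
        return np.where(x >= threshold, x, np.tanh(x))

    # Usage example: negative inputs are squashed toward (-1, 0),
    # non-negative inputs pass through unchanged.
    print(talu(np.array([-3.0, -0.5, 0.0, 2.0])))
    # -> [-0.99505475 -0.46211716  0.          2.        ]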