ResearchTrend.AI

Deep ReLU Networks Have Surprisingly Few Activation Patterns
Boris Hanin, David Rolnick
arXiv:1906.00904, 3 June 2019
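The title's claim can be illustrated with a toy experiment (not the paper's method): a ReLU network with n units could in principle realize up to 2^n on/off activation patterns, but sampling inputs through a small random network turns up only a tiny fraction of them. The architecture, input range, and sample count below are arbitrary illustrative choices.

```python
import math
import random

random.seed(0)

def init_layer(n_in, n_out):
    # Random Gaussian weights and zero biases, a toy stand-in
    # for an untrained network.
    w = [[random.gauss(0, 1 / math.sqrt(n_in)) for _ in range(n_in)]
         for _ in range(n_out)]
    b = [0.0] * n_out
    return w, b

def forward_pattern(layers, x):
    """Return the on/off activation pattern of every ReLU unit for input x."""
    pattern = []
    for w, b in layers:
        pre = [sum(wi * xi for wi, xi in zip(row, x)) + bi
               for row, bi in zip(w, b)]
        pattern.extend(1 if p > 0 else 0 for p in pre)
        x = [max(0.0, p) for p in pre]  # ReLU
    return tuple(pattern)

# A 2 -> 8 -> 8 -> 8 network has 24 ReLU units, so at most 2**24
# conceivable patterns; far fewer are realized on sampled inputs.
layers = [init_layer(2, 8), init_layer(8, 8), init_layer(8, 8)]
inputs = [(random.uniform(-3, 3), random.uniform(-3, 3)) for _ in range(20000)]
patterns = {forward_pattern(layers, x) for x in inputs}
print(f"distinct patterns observed: {len(patterns)} of 2^24 possible")
```

The count observed here only lower-bounds the number of linear regions crossed by the sample, but it already shows the realized patterns falling far short of the exponential ceiling.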

Papers citing "Deep ReLU Networks Have Surprisingly Few Activation Patterns"

6 / 56 papers shown
• Large Scale Model Predictive Control with Neural Networks and Primal Active Sets
  Steven W. Chen, Tianyu Wang, Nikolay Atanasov, Vijay Kumar, M. Morari (23 Oct 2019)
• Finite Depth and Width Corrections to the Neural Tangent Kernel
  Boris Hanin, Mihai Nica (13 Sep 2019)
• Optimal Function Approximation with ReLU Neural Networks
  Bo Liu, Yi Liang (09 Sep 2019)
• Greedy Shallow Networks: An Approach for Constructing and Training Neural Networks
  Anton Dereventsov, Armenak Petrosyan, Clayton Webster (24 May 2019)
• Deep Neural Network Approximation Theory
  Dennis Elbrächter, Dmytro Perekrestenko, Philipp Grohs, Helmut Bölcskei (08 Jan 2019)
• Empirical Bounds on Linear Regions of Deep Rectifier Networks
  Thiago Serra, Srikumar Ramalingam (08 Oct 2018)