arXiv: 2105.08675
The Computational Complexity of ReLU Network Training Parameterized by Data Dimensionality
18 May 2021
Vincent Froese, Christoph Hertrich, R. Niedermeier
Papers citing "The Computational Complexity of ReLU Network Training Parameterized by Data Dimensionality" (5 papers shown):
- When Deep Learning Meets Polyhedral Theory: A Survey — Joey Huchette, Gonzalo Muñoz, Thiago Serra, Calvin Tsay (29 Apr 2023)
- Tight Hardness Results for Training Depth-2 ReLU Networks — Surbhi Goel, Adam R. Klivans, Pasin Manurangsi, Daniel Reichman (27 Nov 2020)
- Neural Networks are Convex Regularizers: Exact Polynomial-time Convex Optimization Formulations for Two-layer Networks — Mert Pilanci, Tolga Ergen (24 Feb 2020)
- Learning Two Layer Rectified Neural Networks in Polynomial Time — Ainesh Bakshi, Rajesh Jayaram, David P. Woodruff (05 Nov 2018)
- On Loss Functions for Deep Neural Networks in Classification — Katarzyna Janocha, Wojciech M. Czarnecki (18 Feb 2017)