arXiv: 2111.08117
Neural networks with linear threshold activations: structure and algorithms
15 November 2021
Sammy Khalife, Hongyu Cheng, A. Basu
Papers citing "Neural networks with linear threshold activations: structure and algorithms" (12 papers):
On the Expressiveness of Rational ReLU Neural Networks With Bounded Depth. Gennadiy Averkov, Christopher Hojny, Maximilian Merkert. 10 Feb 2025.
On Minimal Depth in Neural Networks. J. L. Valerdi. 23 Feb 2024.
Sample Complexity of Algorithm Selection Using Neural Networks and Its Applications to Branch-and-Cut. Hongyu Cheng, Sammy Khalife, Barbara Fiedorowicz, Amitabh Basu. 04 Feb 2024.
When Deep Learning Meets Polyhedral Theory: A Survey. Joey Huchette, Gonzalo Muñoz, Thiago Serra, Calvin Tsay. 29 Apr 2023.
Training Neural Networks is NP-Hard in Fixed Dimension. Vincent Froese, Christoph Hertrich. 29 Mar 2023.
Lower Bounds on the Depth of Integral ReLU Neural Networks via Lattice Polytopes. Christian Haase, Christoph Hertrich, Georg Loho. 24 Feb 2023.
Training Fully Connected Neural Networks is $\exists\mathbb{R}$-Complete. Daniel Bertschinger, Christoph Hertrich, Paul Jungeblut, Tillmann Miltzow, Simon Weber. 04 Apr 2022.
Towards Lower Bounds on the Depth of ReLU Neural Networks. Christoph Hertrich, A. Basu, M. D. Summa, M. Skutella. 31 May 2021.
ReLU Neural Networks of Polynomial Size for Exact Maximum Flow Computation. Christoph Hertrich, Leon Sering. 12 Feb 2021.
Provably Good Solutions to the Knapsack Problem via Neural Networks of Bounded Size. Christoph Hertrich, M. Skutella. 28 May 2020.
Principled Deep Neural Network Training through Linear Programming. D. Bienstock, Gonzalo Muñoz, Sebastian Pokutta. 07 Oct 2018.
Benefits of depth in neural networks. Matus Telgarsky. 14 Feb 2016.