ResearchTrend.AI

Neural networks with linear threshold activations: structure and algorithms
Sammy Khalife, Hongyu Cheng, Amitabh Basu
arXiv:2111.08117, 15 November 2021

Papers citing "Neural networks with linear threshold activations: structure and algorithms"

12 / 12 papers shown
  • On the Expressiveness of Rational ReLU Neural Networks With Bounded Depth
    Gennadiy Averkov, Christopher Hojny, Maximilian Merkert (10 Feb 2025)
  • On Minimal Depth in Neural Networks
    J. L. Valerdi (23 Feb 2024)
  • Sample Complexity of Algorithm Selection Using Neural Networks and Its Applications to Branch-and-Cut
    Hongyu Cheng, Sammy Khalife, Barbara Fiedorowicz, Amitabh Basu (04 Feb 2024)
  • When Deep Learning Meets Polyhedral Theory: A Survey
    Joey Huchette, Gonzalo Muñoz, Thiago Serra, Calvin Tsay (29 Apr 2023)
  • Training Neural Networks is NP-Hard in Fixed Dimension
    Vincent Froese, Christoph Hertrich (29 Mar 2023)
  • Lower Bounds on the Depth of Integral ReLU Neural Networks via Lattice Polytopes
    Christian Haase, Christoph Hertrich, Georg Loho (24 Feb 2023)
  • Training Fully Connected Neural Networks is $\exists\mathbb{R}$-Complete
    Daniel Bertschinger, Christoph Hertrich, Paul Jungeblut, Tillmann Miltzow, Simon Weber (04 Apr 2022)
  • Towards Lower Bounds on the Depth of ReLU Neural Networks
    Christoph Hertrich, Amitabh Basu, M. Di Summa, M. Skutella (31 May 2021)
  • ReLU Neural Networks of Polynomial Size for Exact Maximum Flow Computation
    Christoph Hertrich, Leon Sering (12 Feb 2021)
  • Provably Good Solutions to the Knapsack Problem via Neural Networks of Bounded Size
    Christoph Hertrich, M. Skutella (28 May 2020)
  • Principled Deep Neural Network Training through Linear Programming
    D. Bienstock, Gonzalo Muñoz, Sebastian Pokutta (07 Oct 2018)
  • Benefits of depth in neural networks
    Matus Telgarsky (14 Feb 2016)