The Computational Complexity of ReLU Network Training Parameterized by Data Dimensionality

18 May 2021 · arXiv:2105.08675
Vincent Froese, Christoph Hertrich, R. Niedermeier

Papers citing "The Computational Complexity of ReLU Network Training Parameterized by Data Dimensionality" (5 papers)
When Deep Learning Meets Polyhedral Theory: A Survey
Joey Huchette, Gonzalo Muñoz, Thiago Serra, Calvin Tsay · AI4CE · 29 Apr 2023
Tight Hardness Results for Training Depth-2 ReLU Networks
Surbhi Goel, Adam R. Klivans, Pasin Manurangsi, Daniel Reichman · 27 Nov 2020
Neural Networks are Convex Regularizers: Exact Polynomial-time Convex Optimization Formulations for Two-layer Networks
Mert Pilanci, Tolga Ergen · 24 Feb 2020
Learning Two Layer Rectified Neural Networks in Polynomial Time
Ainesh Bakshi, Rajesh Jayaram, David P. Woodruff · NoLa · 05 Nov 2018
On Loss Functions for Deep Neural Networks in Classification
Katarzyna Janocha, Wojciech M. Czarnecki · UQCV · 18 Feb 2017