Learning Deep ReLU Networks Is Fixed-Parameter Tractable

28 September 2020
Sitan Chen, Adam R. Klivans, Raghu Meka

Papers citing "Learning Deep ReLU Networks Is Fixed-Parameter Tractable"

7 papers

When Deep Learning Meets Polyhedral Theory: A Survey
Joey Huchette, Gonzalo Muñoz, Thiago Serra, Calvin Tsay
AI4CE
29 Apr 2023

Learning and Generalization in Overparameterized Neural Networks, Going Beyond Two Layers
Zeyuan Allen-Zhu, Yuanzhi Li, Yingyu Liang
MLT
12 Nov 2018

Learning Two-layer Neural Networks with Symmetric Inputs
Rong Ge, Rohith Kuditipudi, Zhize Li, Xiang Wang
OOD, MLT
16 Oct 2018

Learning One-hidden-layer Neural Networks with Landscape Design
Rong Ge, Jason D. Lee, Tengyu Ma
MLT
01 Nov 2017

Recovery Guarantees for One-hidden-layer Neural Networks
Kai Zhong, Zhao Song, Prateek Jain, Peter L. Bartlett, Inderjit S. Dhillon
MLT
10 Jun 2017

The generalized Lasso with non-linear observations
Y. Plan, Roman Vershynin
13 Feb 2015

A randomized algorithm for principal component analysis
V. Rokhlin, Arthur Szlam, M. Tygert
12 Sep 2008