ResearchTrend.AI

arXiv:2302.07426
Computational Complexity of Learning Neural Networks: Smoothness and Degeneracy

15 February 2023
Amit Daniely, Nathan Srebro, Gal Vardi

Papers citing "Computational Complexity of Learning Neural Networks: Smoothness and Degeneracy"

4 / 4 papers shown
Exploration is Harder than Prediction: Cryptographically Separating Reinforcement Learning from Supervised Learning
Noah Golowich, Ankur Moitra, Dhruv Rohatgi
OffRL
04 Apr 2024
Polynomial-Time Solutions for ReLU Network Training: A Complexity Classification via Max-Cut and Zonotopes
Yifei Wang, Mert Pilanci
18 Nov 2023
Most Neural Networks Are Almost Learnable
Amit Daniely, Nathan Srebro, Gal Vardi
25 May 2023
From Local Pseudorandom Generators to Hardness of Learning
Amit Daniely, Gal Vardi
20 Jan 2021