On the Power of Differentiable Learning versus PAC and SQ Learning


Emmanuel Abbe, Pritish Kamath, Eran Malach, Colin Sandon, Nathan Srebro (9 August 2021)
MLT

Papers citing "On the Power of Differentiable Learning versus PAC and SQ Learning"

6 papers shown
A Mathematical Model for Curriculum Learning for Parities
Elisabetta Cornacchia, Elchanan Mossel (31 Jan 2023)

Recurrent Convolutional Neural Networks Learn Succinct Learning Algorithms
Surbhi Goel, Sham Kakade, Adam Tauman Kalai, Cyril Zhang (01 Sep 2022)

Hidden Progress in Deep Learning: SGD Learns Parities Near the Computational Limit
Boaz Barak, Benjamin L. Edelman, Surbhi Goel, Sham Kakade, Eran Malach, Cyril Zhang (18 Jul 2022)

Learning ReLU networks to high uniform accuracy is intractable
Julius Berner, Philipp Grohs, F. Voigtlaender (26 May 2022)

An initial alignment between neural network and target is needed for gradient descent to learn
Emmanuel Abbe, Elisabetta Cornacchia, Jan Hązła, Christopher Marquis (25 Feb 2022)

Random Feature Amplification: Feature Learning and Generalization in Neural Networks
Spencer Frei, Niladri S. Chatterji, Peter L. Bartlett (15 Feb 2022)
MLT