The Polynomial Method is Universal for Distribution-Free Correlational SQ Learning
Aravind Gollakota, Sushrut Karmalkar, Adam R. Klivans
22 October 2020 (arXiv:2010.11925, v3 latest)

Papers citing "The Polynomial Method is Universal for Distribution-Free Correlational SQ Learning"

14 papers shown

1. The Optimality of Polynomial Regression for Agnostic Learning under Gaussian Marginals. Ilias Diakonikolas, D. Kane, Thanasis Pittas, Nikos Zarifis. 08 Feb 2021.
2. When Hardness of Approximation Meets Hardness of Learning. Eran Malach, Shai Shalev-Shwartz. 18 Aug 2020.
3. Near-Optimal SQ Lower Bounds for Agnostically Learning Halfspaces and ReLUs under Gaussian Marginals. Ilias Diakonikolas, D. Kane, Nikos Zarifis. 29 Jun 2020.
4. Algorithms and SQ Lower Bounds for PAC Learning One-Hidden-Layer ReLU Networks. Ilias Diakonikolas, D. Kane, Vasilis Kontonis, Nikos Zarifis. 22 Jun 2020.
5. Statistical Queries and Statistical Algorithms: Foundations and Applications. L. Reyzin. 01 Apr 2020.
6. Approximate is Good Enough: Probabilistic Variants of Dimensional and Margin Complexity. Pritish Kamath, Omar Montasser, Nathan Srebro. 09 Mar 2020.
7. Failures of Gradient-Based Deep Learning. Shai Shalev-Shwartz, Ohad Shamir, Shaked Shammah. 23 Mar 2017.
8. Distribution-Specific Hardness of Learning Neural Networks. Ohad Shamir. 05 Sep 2016.
9. A General Characterization of the Statistical Query Complexity. Vitaly Feldman. 07 Aug 2016.
10. Complexity Theoretic Limitations on Learning Halfspaces. Amit Daniely. 21 May 2015.
11. Approximate resilience, monotonicity, and the complexity of agnostic learning. Dana Dachman-Soled, Vitaly Feldman, Li-Yang Tan, Andrew Wan, K. Wimmer. 21 May 2014.
12. Complexity theoretic limitations on learning DNF's. Amit Daniely, Shai Shalev-Shwartz. 13 Apr 2014.
13. From average case complexity to improper learning complexity. Amit Daniely, N. Linial, Shai Shalev-Shwartz. 10 Nov 2013.
14. A Complete Characterization of Statistical Query Learning with Applications to Evolvability. Vitaly Feldman. 16 Feb 2010.