Efficiently Learning One-Hidden-Layer ReLU Networks via Schur Polynomials

24 July 2023
Ilias Diakonikolas, Daniel M. Kane

Papers citing "Efficiently Learning One-Hidden-Layer ReLU Networks via Schur Polynomials"

4 papers shown
Gradient dynamics for low-rank fine-tuning beyond kernels
Arif Kerem Dayi, Sitan Chen (23 Nov 2024)

Agnostically Learning Multi-index Models with Queries
Ilias Diakonikolas, Daniel M. Kane, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis (27 Dec 2023)

A faster and simpler algorithm for learning shallow networks
Sitan Chen, Shyam Narayanan (24 Jul 2023)

ReLU Regression with Massart Noise
Ilias Diakonikolas, Jongho Park, Christos Tzamos (10 Sep 2021)