A Provably Correct Algorithm for Deep Learning that Actually Works

Eran Malach, Shai Shalev-Shwartz
26 March 2018
arXiv: 1803.09522
Topics: MLT

Papers citing "A Provably Correct Algorithm for Deep Learning that Actually Works"

9 / 9 papers shown:

  • Scaling Laws and Representation Learning in Simple Hierarchical Languages: Transformers vs. Convolutional Architectures. Francesco Cagnetta, Alessandro Favero, Antonio Sclocchi, M. Wyart. 11 May 2025.
  • Learning curves theory for hierarchically compositional data with power-law distributed features. Francesco Cagnetta, Hyunmo Kang, M. Wyart. 11 May 2025.
  • Probing the Latent Hierarchical Structure of Data via Diffusion Models. Antonio Sclocchi, Alessandro Favero, Noam Itzhak Levi, M. Wyart. Topics: DiffM. 17 Oct 2024.
  • Training Deep Architectures Without End-to-End Backpropagation: A Survey on the Provably Optimal Methods. Shiyu Duan, José C. Príncipe. Topics: MQ. 09 Jan 2021.
  • Why Layer-Wise Learning is Hard to Scale-up and a Possible Solution via Accelerated Downsampling. Wenchi Ma, Miao Yu, Kaidong Li, Guanghui Wang. 15 Oct 2020.
  • Computational Separation Between Convolutional and Fully-Connected Networks. Eran Malach, Shai Shalev-Shwartz. 03 Oct 2020.
  • From Boltzmann Machines to Neural Networks and Back Again. Surbhi Goel, Adam R. Klivans, Frederic Koehler. 25 Jul 2020.
  • Training Neural Networks with Local Error Signals. Arild Nøkland, L. Eidnes. 20 Jan 2019.
  • End-to-end Learning of a Convolutional Neural Network via Deep Tensor Decomposition. Samet Oymak, Mahdi Soltanolkotabi. 16 May 2018.