ResearchTrend.AI

Landscape Connectivity and Dropout Stability of SGD Solutions for Over-parameterized Neural Networks
arXiv:1912.10095 · 20 December 2019
A. Shevchenko, Marco Mondelli

Papers citing "Landscape Connectivity and Dropout Stability of SGD Solutions for Over-parameterized Neural Networks" (10 / 10 papers shown)

1. Information-theoretic reduction of deep neural networks to linear models in the overparametrized proportional regime
   Francesco Camilli, D. Tieplova, Eleonora Bergamin, Jean Barbier (06 May 2025)

2. Mode Connectivity in Auction Design
   Christoph Hertrich, Yixin Tao, László A. Végh (18 May 2023)

3. On Quantum Speedups for Nonconvex Optimization via Quantum Tunneling Walks
   Yizhou Liu, Weijie J. Su, Tongyang Li (29 Sep 2022)

4. Sharp asymptotics on the compression of two-layer neural networks
   Mohammad Hossein Amani, Simone Bombari, Marco Mondelli, Rattana Pukdee, Stefano Rini [MLT] (17 May 2022)

5. Mode connectivity in the loss landscape of parameterized quantum circuits
   Kathleen E. Hamilton, E. Lynn, R. Pooser (09 Nov 2021)

6. Mean-field Analysis of Piecewise Linear Solutions for Wide ReLU Networks
   A. Shevchenko, Vyacheslav Kungurtsev, Marco Mondelli [MLT] (03 Nov 2021)

7. Global Convergence of Three-layer Neural Networks in the Mean Field Regime
   H. Pham, Phan-Minh Nguyen [MLT, AI4CE] (11 May 2021)

8. Analyzing Monotonic Linear Interpolation in Neural Network Loss Landscapes
   James Lucas, Juhan Bae, Michael Ruogu Zhang, Stanislav Fort, R. Zemel, Roger C. Grosse [MoMe] (22 Apr 2021)

9. On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
   N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang [ODL] (15 Sep 2016)

10. The Loss Surfaces of Multilayer Networks
    A. Choromańska, Mikael Henaff, Michaël Mathieu, Gerard Ben Arous, Yann LeCun [ODL] (30 Nov 2014)