
The Goldilocks zone: Towards better understanding of neural network loss landscapes

arXiv:1807.02581 · 6 July 2018
Stanislav Fort, Adam Scherlis

Papers citing "The Goldilocks zone: Towards better understanding of neural network loss landscapes"

3 / 3 papers shown

1. High-entropy Advantage in Neural Networks' Generalizability
   Entao Yang, Wei Wei, Yue Shang, Ge Zhang (AI4CE) · 17 Mar 2025
2. On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
   N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang (ODL) · 15 Sep 2016
3. Qualitatively characterizing neural network optimization problems
   Ian Goodfellow, Oriol Vinyals, Andrew M. Saxe (ODL) · 19 Dec 2014