

A New Perspective for Understanding Generalization Gap of Deep Neural Networks Trained with Large Batch Sizes

21 October 2022
O. Oyedotun, Konstantinos Papadopoulos, Djamila Aouada
arXiv: 2210.12184

Papers citing "A New Perspective for Understanding Generalization Gap of Deep Neural Networks Trained with Large Batch Sizes"

4 papers shown

1. Novel Deep Neural Network Classifier Characterization Metrics with Applications to Dataless Evaluation
   Nathaniel R. Dean, Dilip Sarkar
   17 Jul 2024

2. A novel multi-scale loss function for classification problems in machine learning
   L. Berlyand, Robert Creese, P. Jabin
   04 Jun 2021

3. Aggregated Residual Transformations for Deep Neural Networks
   Saining Xie, Ross B. Girshick, Piotr Dollár, Z. Tu, Kaiming He
   16 Nov 2016

4. On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
   N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
   15 Sep 2016