ResearchTrend.AI

Enabling Large Batch Size Training for DNN Models Beyond the Memory Limit While Maintaining Performance

24 October 2021
Nathanaël Fijalkow, DoangJoo Synn, Jooyoung Park, Jong-Kook Kim

Papers citing "Enabling Large Batch Size Training for DNN Models Beyond the Memory Limit While Maintaining Performance"

2 / 2 citing papers shown
CleanUMamba: A Compact Mamba Network for Speech Denoising using Channel Pruning
Sjoerd Groot, Qinyu Chen, Jan C. van Gemert, Chang Gao
Mamba · 14 Oct 2024
Bottleneck Transformers for Visual Recognition
A. Srinivas, Tsung-Yi Lin, Niki Parmar, Jonathon Shlens, Pieter Abbeel, Ashish Vaswani
SLR · 27 Jan 2021