Multilevel Minimization for Deep Residual Networks

13 April 2020
Lisa Gaedke-Merzhäuser, Alena Kopanicáková, Rolf Krause

Papers citing "Multilevel Minimization for Deep Residual Networks"

5 of 5 papers shown

1. "A multilevel approach to accelerate the training of Transformers"
   Guillaume Lauga, Maël Chaumette, Edgar Desainte-Maréville, Étienne Lasalle, Arthur Lebeurrier
   24 Apr 2025 (AI4CE)

2. "A Multi-Level Framework for Accelerating Training Transformer Models"
   Longwei Zou, Han Zhang, Yangdong Deng
   07 Apr 2024 (AI4CE)

3. "Training of deep residual networks with stochastic MG/OPT"
   Cyrill Planta, Alena Kopanicáková, Rolf Krause
   09 Aug 2021

4. "Layer-Parallel Training with GPU Concurrency of Deep Residual Neural Networks via Nonlinear Multigrid"
   Andrew Kirby, S. Samsi, Michael Jones, Albert Reuther, J. Kepner, V. Gadepally
   14 Jul 2020

5. "Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation"
   Yonghui Wu, M. Schuster, Z. Chen, Quoc V. Le, Mohammad Norouzi, ..., Alex Rudnick, Oriol Vinyals, G. Corrado, Macduff Hughes, J. Dean
   26 Sep 2016 (AIMat)