ResearchTrend.AI

Horovod: fast and easy distributed deep learning in TensorFlow
Alexander Sergeev, Mike Del Balso
15 February 2018
arXiv:1802.05799 · PDF · GitHub (14,494★)
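The core technique behind Horovod is ring-allreduce: each worker's gradient vector is split into chunks that circulate around a ring, first accumulating partial sums (scatter-reduce) and then distributing the fully reduced chunks (allgather). The sketch below is a pure-Python simulation of that collective, not Horovod's actual API; the function name and list-based "workers" are illustrative assumptions.

```python
# Pure-Python simulation of ring-allreduce, the bandwidth-optimal
# collective Horovod uses to sum gradients across workers.
# Each of n workers holds an equal-length gradient vector, split into
# n contiguous chunks. Scatter-reduce: over n-1 steps, each worker
# passes one chunk to its right neighbour, which adds it into its own
# copy. Allgather: over n-1 more steps, the fully reduced chunks
# circulate until every worker holds the complete sum.

def ring_allreduce(grads):
    """grads: one gradient vector (list of numbers) per worker.
    Returns each worker's final buffer: all equal to the elementwise sum."""
    n = len(grads)
    length = len(grads[0])
    # Chunk boundaries; chunk sizes may differ by one if n doesn't divide length.
    bounds = [(i * length // n, (i + 1) * length // n) for i in range(n)]
    buf = [list(g) for g in grads]  # each worker's local working buffer

    # Scatter-reduce: at step s, worker r sends chunk (r - s) mod n rightward.
    for s in range(n - 1):
        # Snapshot outgoing chunks first: all sends in a step are concurrent.
        sends = []
        for r in range(n):
            chunk = (r - s) % n
            lo, hi = bounds[chunk]
            sends.append((chunk, buf[r][lo:hi]))
        for r in range(n):
            chunk, data = sends[r]
            dst = (r + 1) % n
            lo, hi = bounds[chunk]
            for i, v in zip(range(lo, hi), data):
                buf[dst][i] += v
    # Now worker r holds the fully reduced chunk (r + 1) mod n.

    # Allgather: at step s, worker r forwards chunk (r + 1 - s) mod n,
    # and the receiver overwrites its copy with the reduced values.
    for s in range(n - 1):
        sends = []
        for r in range(n):
            chunk = (r + 1 - s) % n
            lo, hi = bounds[chunk]
            sends.append((chunk, buf[r][lo:hi]))
        for r in range(n):
            chunk, data = sends[r]
            dst = (r + 1) % n
            lo, hi = bounds[chunk]
            buf[dst][lo:hi] = data
    return buf


# Two workers averaging gradients: each sends (n-1)/n of its vector per
# phase, so per-worker traffic stays constant as workers are added.
result = ring_allreduce([[1, 2, 3, 4], [10, 20, 30, 40]])
# Every worker ends with the elementwise sum [11, 22, 33, 44];
# Horovod would then divide by the worker count to average.
```

Each worker transmits roughly 2·(n−1)/n of its gradient volume in total, independent of n, which is why the paper's benchmarks scale well compared with a central parameter server.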

Papers citing "Horovod: fast and easy distributed deep learning in TensorFlow"

4 / 454 papers shown
Parameter Hub: a Rack-Scale Parameter Server for Distributed Deep Neural Network Training
Liang Luo, Jacob Nelson, Luis Ceze, Amar Phanishayee, Arvind Krishnamurthy
21 May 2018
Spark-MPI: Approaching the Fifth Paradigm of Cognitive Applications
N. Malitsky, R. Castain, Matt Cowan
16 May 2018
TicTac: Accelerating Distributed Deep Learning with Communication Scheduling
Sayed Hadi Hashemi, Sangeetha Abdu Jyothi, R. Campbell
8 Mar 2018
Ray: A Distributed Framework for Emerging AI Applications
Philipp Moritz, Robert Nishihara, Stephanie Wang, Alexey Tumanov, Richard Liaw, ..., Melih Elibol, Zongheng Yang, William Paul, Michael I. Jordan, Ion Stoica
16 Dec 2017