STANNIS: Low-Power Acceleration of Deep Neural Network Training Using Computational Storage
arXiv:2002.07215 (v2, latest)

17 February 2020
Ali Heydarigorji
Mahdi Torabzadehkashi
Siavash Rezaei
Hossein Bobarshad
V. Alves
Pai H. Chou
    BDL

Papers citing "STANNIS: Low-Power Acceleration of Deep Neural Network Training Using Computational Storage"

2 citing papers shown.
Towards a Scalable and Distributed Infrastructure for Deep Learning Applications
Bita Hasheminezhad
S. Shirzad
Nanmiao Wu
Patrick Diehl
Hannes Schulz
Hartmut Kaiser
GNN, AI4CE
06 Oct 2020
HyperTune: Dynamic Hyperparameter Tuning For Efficient Distribution of DNN Training Over Heterogeneous Systems
Ali Heydarigorji
Siavash Rezaei
Mahdi Torabzadehkashi
Hossein Bobarshad
V. Alves
Pai H. Chou
16 Jul 2020