ResearchTrend.AI


STANNIS: Low-Power Acceleration of Deep Neural Network Training Using Computational Storage

17 February 2020
Ali Heydarigorji
Mahdi Torabzadehkashi
Siavash Rezaei
Hossein Bobarshad
V. Alves
Pai H. Chou
Abstract

This paper proposes a framework for distributed, in-storage training of neural networks on clusters of computational storage devices. Such devices not only contain hardware accelerators but also eliminate data movement between the host and storage, yielding both improved performance and power savings. More importantly, this in-storage style of training ensures that private data never leaves the storage device, while the sharing of public data remains fully controlled. Experimental results show up to a 2.7x speedup and a 69% reduction in energy consumption, with no significant loss in accuracy.
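The privacy property described in the abstract — private data never leaves the device, while only model state is exchanged — can be illustrated with a minimal sketch. This is not the paper's actual protocol; the federated-style parameter averaging, the linear model, and all names below are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: each computational storage device trains on its
# private data shard locally, and only model parameters (never the raw
# private data) are sent back to the host, which averages them.

rng = np.random.default_rng(0)

def local_step(weights, X, y, lr=0.1):
    """One gradient-descent step of linear regression on a device's private shard."""
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

# Private shards reside on three storage devices; the host never sees X or y.
true_w = np.array([1.0, -2.0, 0.5, 3.0])
shards = []
for _ in range(3):
    X = rng.normal(size=(32, 4))
    shards.append((X, X @ true_w))

weights = np.zeros(4)
for _ in range(200):
    # Each device updates its own copy of the model in-storage ...
    local = [local_step(weights, X, y) for X, y in shards]
    # ... and the host aggregates only the parameters.
    weights = np.mean(local, axis=0)
```

After a few hundred rounds the averaged model recovers the underlying weights, even though the host only ever handles parameter vectors. The paper's system additionally uses on-device hardware accelerators and tunes how work is split between host and devices, which this sketch does not model.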
