
arXiv:2110.06037
SoftNeuro: Fast Deep Inference using Multi-platform Optimization

12 October 2021
Masaki Hilaga
Yasuhiro Kuroda
Hitoshi Matsuo
Tatsuya Kawaguchi
Gabriel Ogawa
Hiroshi Miyake
Yusuke Kozawa
Abstract

Faster inference of deep learning models is in high demand on edge devices and even servers, for both financial and environmental reasons. To address this issue, we propose SoftNeuro, a novel, high-performance inference framework with efficient performance tuning. The key idea is to separate algorithmic routines from network layers. Our framework maximizes the inference performance by profiling various routines for each layer and selecting the fastest path. To efficiently find the best path, we propose a routine-selection algorithm based on dynamic programming. Experiments show that the proposed framework achieves both fast inference and efficient tuning.
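The abstract describes selecting, for each layer, one of several candidate routines so that the end-to-end latency is minimized via dynamic programming. The following is a minimal illustrative sketch of that idea, not SoftNeuro's actual implementation: it assumes hypothetical per-routine latencies profiled for a linear chain of layers and a simple data-format conversion cost between adjacent routines; all names and numbers are made up for illustration.

```python
# Sketch of dynamic-programming routine selection over a linear chain of layers.
# Latencies, routine names, and the conversion-cost model are illustrative
# assumptions, not SoftNeuro's API or profiling data.

# Profiled latency (ms) of each candidate routine, per layer (hypothetical).
layer_routines = [
    {"conv_naive": 4.0, "conv_winograd": 2.5, "conv_int8": 1.8},
    {"relu_float": 0.3, "relu_int8": 0.2},
    {"conv_naive": 3.5, "conv_winograd": 2.9, "conv_int8": 2.0},
]


def conversion_cost(prev_routine, next_routine):
    """Assumed cost of switching data formats (e.g. float <-> int8) between layers."""
    prev_int8 = prev_routine.endswith("int8")
    next_int8 = next_routine.endswith("int8")
    return 0.0 if prev_int8 == next_int8 else 0.5


def select_fastest_path(layers):
    """Return (total latency, routine per layer) minimizing end-to-end latency."""
    # dp maps each routine of the current layer to (best cost so far, chosen path).
    dp = {r: (lat, [r]) for r, lat in layers[0].items()}
    for layer in layers[1:]:
        new_dp = {}
        for routine, latency in layer.items():
            # Best predecessor: previous cost + conversion cost + this routine's latency.
            new_dp[routine] = min(
                (cost + conversion_cost(prev, routine) + latency, path + [routine])
                for prev, (cost, path) in dp.items()
            )
        dp = new_dp
    return min(dp.values())


if __name__ == "__main__":
    total, path = select_fastest_path(layer_routines)
    print(f"fastest path: {path} ({total:.1f} ms)")
```

Because each layer's choice only interacts with its neighbor through the conversion cost, the search is linear in the number of layers rather than exponential in the number of routine combinations, which is what makes the tuning efficient.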
