
Enhancing Split Computing and Early Exit Applications through Predefined Sparsity
16 July 2024
Luigi Capogrosso, Enrico Fraccaroli, Giulio Petrozziello, Francesco Setti, Samarjit Chakraborty, Franco Fummi, Marco Cristani

Papers citing "Enhancing Split Computing and Early Exit Applications through Predefined Sparsity" (3 papers shown)
I-SPLIT: Deep Network Interpretability for Split Computing
Federico Cunico, Luigi Capogrosso, Francesco Setti, D. Carra, Franco Fummi, Marco Cristani
23 Sep 2022

Pruning and Quantization for Deep Neural Network Acceleration: A Survey
Tailin Liang, C. Glossner, Lei Wang, Shaobo Shi, Xiaotong Zhang
24 Jan 2021

Pre-Defined Sparse Neural Networks with Hardware Acceleration
Sourya Dey, Kuan-Wen Huang, P. Beerel, K. Chugg
04 Dec 2018