
The Power of Sparsity in Convolutional Neural Networks

21 February 2017
Soravit Changpinyo, Mark Sandler, A. Zhmoginov

Papers citing "The Power of Sparsity in Convolutional Neural Networks"

23 / 23 papers shown
Poly-MgNet: Polynomial Building Blocks in Multigrid-Inspired ResNets
Antonia van Betteray, Matthias Rottmann, Karsten Kahl
13 Mar 2025
Statistical guarantees for sparse deep learning
Johannes Lederer
11 Dec 2022
MGiaD: Multigrid in all dimensions. Efficiency and robustness by coarsening in resolution and channel dimensions
Antonia van Betteray, Matthias Rottmann, Karsten Kahl
10 Nov 2022
SBPF: Sensitiveness Based Pruning Framework For Convolutional Neural Network On Image Classification
Yihe Lu, Maoguo Gong, Wei Zhao, Kaiyuan Feng, Hao Li
VLM
09 Aug 2022
Context-sensitive neocortical neurons transform the effectiveness and efficiency of neural information processing
Ahsan Adeel, Mario Franco, Mohsin Raza, K. Ahmed
15 Jul 2022
Two Sparsities Are Better Than One: Unlocking the Performance Benefits of Sparse-Sparse Networks
Kevin Lee Hunter, Lawrence Spracklen, Subutai Ahmad
27 Dec 2021
Training of deep residual networks with stochastic MG/OPT
Cyrill Planta, Alena Kopanicáková, Rolf Krause
09 Aug 2021
Layer Sparsity in Neural Networks
Mohamed Hebiri, Johannes Lederer
28 Jun 2020
AdaDeep: A Usage-Driven, Automated Deep Model Compression Framework for Enabling Ubiquitous Intelligent Mobiles
Sicong Liu, Junzhao Du, Kaiming Nan, Zimu Zhou, Zhangyang Wang, Yingyan Lin
08 Jun 2020
Deep-Aligned Convolutional Neural Network for Skeleton-based Action Recognition and Segmentation
Babak Hosseini, Romain Montagne, Barbara Hammer
12 Nov 2019
s-LWSR: Super Lightweight Super-Resolution Network
Biao Li, Jiabin Liu, Bo Wang, Zhiquan Qi, Yong Shi
SupR
24 Sep 2019
Patient Knowledge Distillation for BERT Model Compression
S. Sun, Yu Cheng, Zhe Gan, Jingjing Liu
25 Aug 2019
Implicit Deep Learning
L. Ghaoui, Fangda Gu, Bertrand Travacca, Armin Askari, Alicia Y. Tsai
AI4CE
17 Aug 2019
Compressing RNNs for IoT devices by 15-38x using Kronecker Products
Urmish Thakker, Jesse G. Beu, Dibakar Gope, Chu Zhou, Igor Fedorov, Ganesh S. Dasika, Matthew Mattina
07 Jun 2019
ANTNets: Mobile Convolutional Neural Networks for Resource Efficient Image Classification
Yunyang Xiong, Hyunwoo Kim, Varsha Hedau
07 Apr 2019
Model Slicing for Supporting Complex Analytics with Elastic Inference Cost and Resource Constraints
Shaofeng Cai, Gang Chen, Beng Chin Ooi, Jinyang Gao
03 Apr 2019
Optimally Scheduling CNN Convolutions for Efficient Memory Access
Arthur Stoutchinin, Francesco Conti, Luca Benini
04 Feb 2019
A Comprehensive guide to Bayesian Convolutional Neural Network with Variational Inference
Kumar Shridhar, F. Laumann, Marcus Liwicki
BDL
UQCV
08 Jan 2019
Implementing Push-Pull Efficiently in GraphBLAS
Carl Yang, A. Buluç, John Douglas Owens
10 Apr 2018
Recurrent Residual Module for Fast Inference in Videos
Bowen Pan, Wuwei Lin, Xiaolin Fang, Chaoqin Huang, Bolei Zhou, Cewu Lu
ObjD
27 Feb 2018
Interleaved Group Convolutions for Deep Neural Networks
Ting Zhang, Guo-Jun Qi, Bin Xiao, Jingdong Wang
10 Jul 2017
Speeding up Convolutional Neural Networks By Exploiting the Sparsity of Rectifier Units
S. Shi, Xiaowen Chu
25 Apr 2017
Improving neural networks by preventing co-adaptation of feature detectors
Geoffrey E. Hinton, Nitish Srivastava, A. Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov
VLM
03 Jul 2012