Pre-defined Sparsity for Low-Complexity Convolutional Neural Networks

29 January 2020
Souvik Kundu, M. Nazemi, Massoud Pedram, K. Chugg, P. Beerel
CVBM

Papers citing "Pre-defined Sparsity for Low-Complexity Convolutional Neural Networks"

6 of 6 citing papers shown
  1. STen: Productive and Efficient Sparsity in PyTorch
     Andrei Ivanov, Nikoli Dryden, Tal Ben-Nun, Saleh Ashkboos, Torsten Hoefler
     15 Apr 2023

  2. HIRE-SNN: Harnessing the Inherent Robustness of Energy-Efficient Deep Spiking Neural Networks by Training with Crafted Input Noise
     Souvik Kundu, Massoud Pedram, P. Beerel
     AAML
     06 Oct 2021

  3. Training Energy-Efficient Deep Spiking Neural Networks with Single-Spike Hybrid Input Encoding
     Gourav Datta, Souvik Kundu, P. Beerel
     26 Jul 2021

  4. AttentionLite: Towards Efficient Self-Attention Models for Vision
     Souvik Kundu, Sairam Sundaresan
     21 Dec 2020

  5. Pre-Defined Sparse Neural Networks with Hardware Acceleration
     Sourya Dey, Kuan-Wen Huang, P. Beerel, K. Chugg
     04 Dec 2018

  6. Incremental Network Quantization: Towards Lossless CNNs with Low-Precision Weights
     Aojun Zhou, Anbang Yao, Yiwen Guo, Lin Xu, Yurong Chen
     MQ
     10 Feb 2017