ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Enabling Sparse Winograd Convolution by Native Pruning

arXiv:1702.08597 · 28 February 2017
Sheng Li, Jongsoo Park, P. T. P. Tang
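As background for the paper's topic: the cited work prunes filter weights directly in the Winograd domain, so the transformed filters stay sparse at inference time. A minimal sketch of the dense 1-D Winograd F(2,3) transform that such pruning targets, using the standard Lavin-Gray matrices (assumed background, not taken from this page):

```python
import numpy as np

# Winograd F(2,3): 2 outputs of a 1-D convolution with a 3-tap filter,
# using 4 multiplies instead of the 6 needed by direct convolution.
# Standard transform matrices (assumed, from Lavin & Gray's fast
# convolution algorithms; illustrative only).
BT = np.array([[1,  0, -1,  0],
               [0,  1,  1,  0],
               [0, -1,  1,  0],
               [0,  1,  0, -1]], dtype=float)
G = np.array([[1.0,  0.0, 0.0],
              [0.5,  0.5, 0.5],
              [0.5, -0.5, 0.5],
              [0.0,  0.0, 1.0]])
AT = np.array([[1, 1,  1,  0],
               [0, 1, -1, -1]], dtype=float)

def winograd_f23(d, g):
    """One F(2,3) tile: d is a length-4 input slice, g a length-3 filter."""
    U = G @ g            # filter transform (precomputable; pruning U's
                         # entries is what makes Winograd convolution sparse)
    V = BT @ d           # input-tile transform
    return AT @ (U * V)  # elementwise product, then inverse transform

# Sanity check against direct (valid) convolution on one tile.
d = np.array([1.0, 2.0, 3.0, 4.0])
g = np.array([0.5, -1.0, 2.0])
direct = np.array([d[0]*g[0] + d[1]*g[1] + d[2]*g[2],
                   d[1]*g[0] + d[2]*g[1] + d[3]*g[2]])
assert np.allclose(winograd_f23(d, g), direct)
```

The point of pruning "natively" in this domain is that zeros introduced in `U` survive to inference, whereas zeros in the spatial filter `g` are generally destroyed by the `G @ g` transform.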

Papers citing "Enabling Sparse Winograd Convolution by Native Pruning" (5 of 5 papers shown)

  1. Learning Structured Sparsity in Deep Neural Networks. W. Wen, Chunpeng Wu, Yandan Wang, Yiran Chen, Hai Helen Li. 12 Aug 2016.
  2. Learning both Weights and Connections for Efficient Neural Networks. Song Han, Jeff Pool, J. Tran, W. Dally. 08 Jun 2015.
  3. Fast ConvNets Using Group-wise Brain Damage. V. Lebedev, Victor Lempitsky. 08 Jun 2015.
  4. Fast Convolutional Nets With fbfft: A GPU Performance Evaluation. Nicolas Vasilache, Jeff Johnson, Michaël Mathieu, Soumith Chintala, Serkan Piantino, Yann LeCun. 24 Dec 2014.
  5. Caffe: Convolutional Architecture for Fast Feature Embedding. Yangqing Jia, Evan Shelhamer, Jeff Donahue, Sergey Karayev, Jonathan Long, Ross B. Girshick, S. Guadarrama, Trevor Darrell. 20 Jun 2014.