ResearchTrend.AI

RED++ : Data-Free Pruning of Deep Neural Networks via Input Splitting and Output Merging
Edouard Yvinec, Arnaud Dapogny, Matthieu Cord, Kévin Bailly
arXiv:2110.01397 · 30 September 2021

Papers citing "RED++ : Data-Free Pruning of Deep Neural Networks via Input Splitting and Output Merging"

15 of 15 citing papers shown
Training-Free Restoration of Pruned Neural Networks
Keonho Lee, Minsoo Kim, Dong-Wan Choi
06 Feb 2025

Efficient Deep Learning Infrastructures for Embedded Computing Systems: A Comprehensive Survey and Future Envision
Xiangzhong Luo, Di Liu, Hao Kong, Shuo Huai, Hui Chen, Guochu Xiong, Weichen Liu
03 Nov 2024

Pruner-Zero: Evolving Symbolic Pruning Metric from scratch for Large Language Models
Peijie Dong, Lujun Li, Zhenheng Tang, Xiang Liu, Xinglin Pan, Qiang-qiang Wang, Xiaowen Chu
05 Jun 2024

STAT: Shrinking Transformers After Training
Megan Flynn, Alexander Wang, Dean Edward Alvarez, Christopher De Sa, Anil Damle
29 May 2024

Optimizing Convolutional Neural Network Architecture
Luis Balderas, Miguel Lastra, José M. Benítez
CVBM · 17 Dec 2023

Archtree: on-the-fly tree-structured exploration for latency-aware pruning of deep neural networks
Rémi Ouazan Reboul, Edouard Yvinec, Arnaud Dapogny, Kévin Bailly
17 Nov 2023

Network Memory Footprint Compression Through Jointly Learnable Codebooks and Mappings
Vittorio Giammarino, Arnaud Dapogny, Kévin Bailly
MQ · 29 Sep 2023

Approximate Computing Survey, Part II: Application-Specific & Architectural Approximation Techniques and Applications
Vasileios Leon, Muhammad Abdullah Hanif, Giorgos Armeniakos, Xun Jiao, Muhammad Shafique, K. Pekmestzi, Dimitrios Soudris
20 Jul 2023

Designing strong baselines for ternary neural network quantization through support and mass equalization
Edouard Yvinec, Arnaud Dapogny, Kévin Bailly
MQ · 30 Jun 2023

LLM-Pruner: On the Structural Pruning of Large Language Models
Xinyin Ma, Gongfan Fang, Xinchao Wang
19 May 2023

Structured Pruning for Deep Convolutional Neural Networks: A survey
Yang He, Lingao Xiao
3DPC · 01 Mar 2023

Matching DNN Compression and Cooperative Training with Resources and Data Availability
F. Malandrino, G. Giacomo, Armin Karamzade, Marco Levorato, C. Chiasserini
02 Dec 2022

SCOP: Scientific Control for Reliable Neural Network Pruning
Yehui Tang, Yunhe Wang, Yixing Xu, Dacheng Tao, Chunjing Xu, Chao Xu, Chang Xu
AAML · 21 Oct 2020

What is the State of Neural Network Pruning?
Davis W. Blalock, Jose Javier Gonzalez Ortiz, Jonathan Frankle, John Guttag
06 Mar 2020

Comparing Rewinding and Fine-tuning in Neural Network Pruning
Alex Renda, Jonathan Frankle, Michael Carbin
05 Mar 2020