ResearchTrend.AI
Learning Sparse & Ternary Neural Networks with Entropy-Constrained Trained Ternarization (EC2T)

arXiv:2004.01077 · 2 April 2020
Arturo Marbán, Daniel Becking, Simon Wiedemann, Wojciech Samek

Papers citing "Learning Sparse & Ternary Neural Networks with Entropy-Constrained Trained Ternarization (EC2T)"

3 / 3 papers shown
RED++: Data-Free Pruning of Deep Neural Networks via Input Splitting and Output Merging
Edouard Yvinec, Arnaud Dapogny, Matthieu Cord, Kévin Bailly
30 Sep 2021
FantastIC4: A Hardware-Software Co-Design Approach for Efficiently Running 4bit-Compact Multilayer Perceptrons
Simon Wiedemann, Suhas Shivapakash, P. Wiedemann, Daniel Becking, Wojciech Samek, F. Gerfers, Thomas Wiegand
17 Dec 2020
What is the State of Neural Network Pruning?
Davis W. Blalock, Jose Javier Gonzalez Ortiz, Jonathan Frankle, John Guttag
06 Mar 2020