NEPENTHE: Entropy-Based Pruning as a Neural Network Depth's Reducer

24 April 2024
Zhu Liao
Victor Quétu
Van-Tam Nguyen
Enzo Tartaglione
arXiv: 2404.16890

Papers citing "NEPENTHE: Entropy-Based Pruning as a Neural Network Depth's Reducer"

6 / 6 papers shown

Entropy-Based Block Pruning for Efficient Large Language Models
Liangwei Yang, Yuhui Xu, Juntao Tan, Doyen Sahoo, Shri Kiran Srinivasan, Caiming Xiong, Han Wang, Shelby Heinecke
04 Apr 2025

LaCoOT: Layer Collapse through Optimal Transport
Victor Quétu, Nour Hezbri, Enzo Tartaglione
13 Jun 2024

DSD$^2$: Can We Dodge Sparse Double Descent and Compress the Neural Network Worry-Free?
Victor Quétu, Enzo Tartaglione
02 Mar 2023

CMT: Convolutional Neural Networks Meet Vision Transformers
Jianyuan Guo, Kai Han, Han Wu, Yehui Tang, Chunjing Xu, Yunhe Wang, Chang Xu
13 Jul 2021

SeReNe: Sensitivity based Regularization of Neurons for Structured Sparsity in Neural Networks
Enzo Tartaglione, Andrea Bragagnolo, Francesco Odierna, Attilio Fiandrotti, Marco Grangetto
07 Feb 2021

What is the State of Neural Network Pruning?
Davis W. Blalock, Jose Javier Gonzalez Ortiz, Jonathan Frankle, John Guttag
06 Mar 2020