arXiv:2311.10549
Archtree: on-the-fly tree-structured exploration for latency-aware pruning of deep neural networks

17 November 2023
Rémi Ouazan Reboul
Edouard Yvinec
Arnaud Dapogny
Kévin Bailly

Papers citing "Archtree: on-the-fly tree-structured exploration for latency-aware pruning of deep neural networks"

3 papers shown
SInGE: Sparsity via Integrated Gradients Estimation of Neuron Relevance
Edouard Yvinec, Arnaud Dapogny, Matthieu Cord, Kévin Bailly
08 Jul 2022
Hessian-Aware Pruning and Optimal Neural Implant
Shixing Yu, Z. Yao, A. Gholami, Zhen Dong, Sehoon Kim, Michael W. Mahoney, Kurt Keutzer
22 Jan 2021
Comparing Rewinding and Fine-tuning in Neural Network Pruning
Alex Renda, Jonathan Frankle, Michael Carbin
05 Mar 2020