ResearchTrend.AI

$L_0$-ARM: Network Sparsification via Stochastic Binary Optimization (arXiv:1904.04432)

9 April 2019
Yang Li, Shihao Ji
    MQ

Papers citing "$L_0$-ARM: Network Sparsification via Stochastic Binary Optimization"

4 / 4 papers shown
Contextual Dropout: An Efficient Sample-Dependent Dropout Module
Xinjie Fan, Shujian Zhang, Korawat Tanwisuth, Xiaoning Qian, Mingyuan Zhou
OOD, BDL, UQCV
06 Mar 2021

Dirichlet Pruning for Neural Network Compression
Kamil Adamczewski, Mijung Park
10 Nov 2020

Resource-Efficient Neural Networks for Embedded Systems
Wolfgang Roth, Günther Schindler, Lukas Pfeifenberger, Robert Peharz, Sebastian Tschiatschek, Holger Fröning, Franz Pernkopf, Zoubin Ghahramani
07 Jan 2020

Learning with Multiplicative Perturbations
Xiulong Yang, Shihao Ji
AAML
04 Dec 2019