arXiv: 1811.03060
FLOPs as a Direct Optimization Objective for Learning Sparse Neural Networks
7 November 2018
Gautam Bhattacharya, Ashutosh Adhikari, Md. Jahangir Alam
Papers citing "FLOPs as a Direct Optimization Objective for Learning Sparse Neural Networks" (6 papers):
Kolmogorov-Arnold Fourier Networks
Jusheng Zhang, Yijia Fan, Kaitong Cai, Keze Wang (09 Feb 2025)

Practical Conformer: Optimizing size, speed and flops of Conformer for on-Device and cloud ASR
Rami Botros, Anmol Gulati, Tara N. Sainath, K. Choromanski, Ruoming Pang, Trevor Strohman, Weiran Wang, Jiahui Yu (31 Mar 2023)

Dirichlet Pruning for Neural Network Compression
Kamil Adamczewski, Mijung Park (10 Nov 2020)

Model Pruning Enables Efficient Federated Learning on Edge Devices
Yuang Jiang, Shiqiang Wang, Victor Valls, Bongjun Ko, Wei-Han Lee, Kin K. Leung, Leandros Tassiulas (26 Sep 2019)

Distilling Task-Specific Knowledge from BERT into Simple Neural Networks
Raphael Tang, Yao Lu, Linqing Liu, Lili Mou, Olga Vechtomova, Jimmy J. Lin (28 Mar 2019)

Fast On-the-fly Retraining-free Sparsification of Convolutional Neural Networks
Amir H. Ashouri, T. Abdelrahman, Alwyn Dos Remedios (10 Nov 2018)