Bayesian Sparsification of Recurrent Neural Networks
E. Lobacheva, Nadezhda Chirkova, Dmitry Vetrov
arXiv:1708.00077 · 31 July 2017
Communities: UQCV, BDL
Papers citing "Bayesian Sparsification of Recurrent Neural Networks" (7 papers shown)

FedBIAD: Communication-Efficient and Accuracy-Guaranteed Federated Learning with Bayesian Inference-Based Adaptive Dropout
Jingjing Xue, Min Liu, Sheng Sun, Yuwei Wang, Hui Jiang, Xue Jiang
14 Jul 2023

Machine Learning Methods for Spectral Efficiency Prediction in Massive MIMO Systems
E. Bobrov, Sergey Troshin, Nadezhda Chirkova, E. Lobacheva, Sviatoslav Panchenko, Dmitry Vetrov, Dmitry Kropotov (Lomonosov MSU)
29 Dec 2021

Spectral Pruning for Recurrent Neural Networks
Takashi Furuya, Kazuma Suetake, K. Taniguchi, Hiroyuki Kusumoto, Ryuji Saiin, Tomohiro Daimon
23 May 2021

Intrinsically Sparse Long Short-Term Memory Networks
Shiwei Liu, Decebal Constantin Mocanu, Mykola Pechenizkiy
26 Jan 2019

GroupReduce: Block-Wise Low-Rank Approximation for Neural Language Model Shrinking
Patrick H. Chen, Si Si, Yang Li, Ciprian Chelba, Cho-Jui Hsieh
18 Jun 2018

Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation
Yonghui Wu, M. Schuster, Zhifeng Chen, Quoc V. Le, Mohammad Norouzi, ..., Alex Rudnick, Oriol Vinyals, G. Corrado, Macduff Hughes, J. Dean
Communities: AIMat
26 Sep 2016

Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
Y. Gal, Zoubin Ghahramani
Communities: UQCV, BDL
06 Jun 2015