arXiv:1912.04050
PhoneBit: Efficient GPU-Accelerated Binary Neural Network Inference Engine for Mobile Phones
Gang Chen, Shengyu He, Haitao Meng, Kai-Qi Huang
5 December 2019 · MQ
Papers citing "PhoneBit: Efficient GPU-Accelerated Binary Neural Network Inference Engine for Mobile Phones" (8 of 8 papers shown):
HG-Caffe: Mobile and Embedded Neural Network GPU (OpenCL) Inference Engine with FP16 Supporting
Zhuoran Ji · BDL · 03 Jan 2019 · 14 / 5 / 0

AI Benchmark: Running Deep Neural Networks on Android Smartphones
Andrey D. Ignatov, Radu Timofte, William Chou, Ke Wang, Max Wu, Tim Hartley, Luc Van Gool · ELM · 02 Oct 2018 · 46 / 322 / 0

Learning Efficient Convolutional Networks through Network Slimming
Zhuang Liu, Jianguo Li, Zhiqiang Shen, Gao Huang, Shoumeng Yan, Changshui Zhang · 22 Aug 2017 · 87 / 2,407 / 0

BMXNet: An Open-Source Binary Neural Network Implementation Based on MXNet
Haojin Yang, Martin Fritzsche, Christian Bartz, Christoph Meinel · MQ · 27 May 2017 · 20 / 61 / 0

Efficient Processing of Deep Neural Networks: A Tutorial and Survey
Vivienne Sze, Yu-hsin Chen, Tien-Ju Yang, J. Emer · AAML, 3DV · 27 Mar 2017 · 76 / 3,002 / 0

XNOR-Net: ImageNet Classification Using Binary Convolutional Neural Networks
Mohammad Rastegari, Vicente Ordonez, Joseph Redmon, Ali Farhadi · MQ · 16 Mar 2016 · 116 / 4,342 / 0

Binarized Neural Networks
Itay Hubara, Daniel Soudry, Ran El-Yaniv · MQ · 08 Feb 2016 · 67 / 1,349 / 0

CNNdroid: GPU-Accelerated Execution of Trained Deep Convolutional Neural Networks on Android
Seyyed Salar Latifi Oskouei, Hossein Golestani, Matin Hashemi, S. Ghiasi · HAI · 23 Nov 2015 · 29 / 104 / 0