$S^3$: Sign-Sparse-Shift Reparametrization for Effective Training of Low-bit Shift Networks

arXiv:2107.03453 · 7 July 2021
Authors: Xinlin Li, Bang Liu, Yaoliang Yu, Wulong Liu, Chunjing Xu, V. Nia
Topics: MQ

Papers citing "$S^3$: Sign-Sparse-Shift Reparametrization for Effective Training of Low-bit Shift Networks"

4 of 4 papers shown:
Standard Deviation-Based Quantization for Deep Neural Networks (24 Feb 2022)
  Amir Ardakani, A. Ardakani, B. Meyer, J. Clark, W. Gross
  Topics: MQ · Metrics: 49 / 1 / 0

ShiftAddNet: A Hardware-Inspired Deep Network (24 Oct 2020)
  Haoran You, Xiaohan Chen, Yongan Zhang, Chaojian Li, Sicheng Li, Zihao Liu, Zhangyang Wang, Yingyan Lin
  Topics: OOD, MQ · Metrics: 73 / 76 / 0

Forward and Backward Information Retention for Accurate Binary Neural Networks (24 Sep 2019)
  Haotong Qin, Ruihao Gong, Xianglong Liu, Mingzhu Shen, Ziran Wei, F. Yu, Jingkuan Song
  Topics: MQ · Metrics: 131 / 324 / 0

Incremental Network Quantization: Towards Lossless CNNs with Low-Precision Weights (10 Feb 2017)
  Aojun Zhou, Anbang Yao, Yiwen Guo, Lin Xu, Yurong Chen
  Topics: MQ · Metrics: 322 / 1,049 / 0