ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

Recurrence of Optimum for Training Weight and Activation Quantized Networks
arXiv:2012.05529
10 December 2020
Ziang Long, Penghang Yin, Jack Xin
MQ

Papers citing "Recurrence of Optimum for Training Weight and Activation Quantized Networks"

4 / 4 papers shown
  1. Frame Quantization of Neural Networks
     Wojciech Czaja, Sanghoon Na
     32 · 1 · 0
     11 Apr 2024
  2. Feature Affinity Assisted Knowledge Distillation and Quantization of Deep Neural Networks on Label-Free Data
     Zhijian Li, Biao Yang, Penghang Yin, Y. Qi, Jack Xin
     MQ
     12 · 1 · 0
     10 Feb 2023
  3. An Integrated Approach to Produce Robust Models with High Efficiency
     Zhijian Li, Bao Wang, Jack Xin
     MQ, AAML
     28 · 3 · 0
     31 Aug 2020
  4. Incremental Network Quantization: Towards Lossless CNNs with Low-Precision Weights
     Aojun Zhou, Anbang Yao, Yiwen Guo, Lin Xu, Yurong Chen
     MQ
     337 · 1,049 · 0
     10 Feb 2017