ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.
HadaNets: Flexible Quantization Strategies for Neural Networks

26 May 2019
Yash Akhauri
MQ

Papers citing "HadaNets: Flexible Quantization Strategies for Neural Networks"

4 papers shown
A Hybrid Quantum-Classical Approach based on the Hadamard Transform for the Convolutional Layer
Hongyi Pan, Xin Zhu, S. Atici, Ahmet Enis Cetin
16 · 17 · 0
27 May 2023
PokeBNN: A Binary Pursuit of Lightweight Accuracy
Yichi Zhang, Zhiru Zhang, Lukasz Lew
MQ
35 · 57 · 0
30 Nov 2021
Universal Deep Neural Network Compression
Yoojin Choi, Mostafa El-Khamy, Jungwon Lee
MQ
86 · 85 · 0
07 Feb 2018
Incremental Network Quantization: Towards Lossless CNNs with Low-Precision Weights
Aojun Zhou, Anbang Yao, Yiwen Guo, Lin Xu, Yurong Chen
MQ
337 · 1,049 · 0
10 Feb 2017