ResearchTrend.AI

FactorizeNet: Progressive Depth Factorization for Efficient Network Architecture Exploration Under Quantization Constraints

30 November 2020
S. Yun, A. Wong
MQ
ArXiv (abs) · PDF · HTML

Papers citing "FactorizeNet: Progressive Depth Factorization for Efficient Network Architecture Exploration Under Quantization Constraints"

2 / 2 papers shown
1. Starting Positions Matter: A Study on Better Weight Initialization for Neural Network Quantization
   S. Yun, A. Wong · MQ · 104 · 0 · 0 · 12 Jun 2025
2. Do All MobileNets Quantize Poorly? Gaining Insights into the Effect of Quantization on Depthwise Separable Convolutional Networks Through the Eyes of Multi-scale Distributional Dynamics
   S. Yun, Alexander Wong · MQ · 84 · 27 · 0 · 24 Apr 2021