A Mean Field Theory of Quantized Deep Networks: The Quantization-Depth Trade-Off

3 June 2019
Yaniv Blumenfeld, D. Gilboa, Daniel Soudry

Papers citing "A Mean Field Theory of Quantized Deep Networks: The Quantization-Depth Trade-Off"

4 / 4 papers shown
VC dimension of partially quantized neural networks in the overparametrized regime
Yutong Wang, Clayton D. Scott
06 Oct 2021
Beyond Signal Propagation: Is Feature Diversity Necessary in Deep Neural Network Initialization?
Yaniv Blumenfeld, D. Gilboa, Daniel Soudry
02 Jul 2020
Neural Tangents: Fast and Easy Infinite Neural Networks in Python
Roman Novak, Lechao Xiao, Jiri Hron, Jaehoon Lee, Alexander A. Alemi, Jascha Narain Sohl-Dickstein, S. Schoenholz
05 Dec 2019
Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks
Lechao Xiao, Yasaman Bahri, Jascha Narain Sohl-Dickstein, S. Schoenholz, Jeffrey Pennington
14 Jun 2018