Sample Variance Decay in Randomly Initialized ReLU Networks

13 February 2019
Kyle L. Luther
H. S. Seung
ArXiv (abs) · PDF · HTML

Papers citing "Sample Variance Decay in Randomly Initialized ReLU Networks" (2 papers shown)

Fractional moment-preserving initialization schemes for training deep neural networks
Mert Gurbuzbalaban, Yuanhan Hu
25 May 2020

Neural networks are a priori biased towards Boolean functions with low entropy
Chris Mingard, Joar Skalse, Guillermo Valle Pérez, David Martínez-Rubio, Vladimir Mikulik, A. Louis
Topics: FAtt, AI4CE
25 Sep 2019