Towards quantifying information flows: relative entropy in deep neural networks and the renormalization group
J. Erdmenger, Kevin T. Grosvenor, R. Jefferson
arXiv:2107.06898, 14 July 2021

Papers citing "Towards quantifying information flows: relative entropy in deep neural networks and the renormalization group" (9 papers)

Bayesian RG Flow in Neural Network Field Theories
Jessica N. Howard, Marc S. Klinger, Anindita Maiti, A. G. Stapleton (27 May 2024)

Wilsonian Renormalization of Neural Network Gaussian Processes
Jessica N. Howard, Ro Jefferson, Anindita Maiti, Z. Ringel (09 May 2024)

Tensor networks for interpretable and efficient quantum-inspired machine learning
Shi-Ju Ran, Gang Su (19 Nov 2023)

Neural Network Field Theories: Non-Gaussianity, Actions, and Locality
M. Demirtaş, James Halverson, Anindita Maiti, M. Schwartz, Keegan Stoner (06 Jul 2023)

Bayesian Renormalization
D. Berman, Marc S. Klinger, A. G. Stapleton (17 May 2023)

Criticality versus uniformity in deep neural networks
A. Bukva, Jurriaan de Gier, Kevin T. Grosvenor, R. Jefferson, K. Schalm, Eliot Schwander (10 Apr 2023)

The edge of chaos: quantum field theory and deep neural networks
Kevin T. Grosvenor, R. Jefferson (27 Sep 2021)

Entropic alternatives to initialization
Daniele Musso (16 Jul 2021)

Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks
Lechao Xiao, Yasaman Bahri, Jascha Narain Sohl-Dickstein, S. Schoenholz, Jeffrey Pennington (14 Jun 2018)