Towards quantifying information flows: relative entropy in deep neural networks and the renormalization group

14 July 2021
J. Erdmenger, Kevin T. Grosvenor, R. Jefferson

Papers citing "Towards quantifying information flows: relative entropy in deep neural networks and the renormalization group"

9 papers shown

Bayesian RG Flow in Neural Network Field Theories
Jessica N. Howard, Marc S. Klinger, Anindita Maiti, A. G. Stapleton
27 May 2024

Wilsonian Renormalization of Neural Network Gaussian Processes
Jessica N. Howard, Ro Jefferson, Anindita Maiti, Z. Ringel
BDL
09 May 2024

Tensor networks for interpretable and efficient quantum-inspired machine learning
Shirli Ran, Gang Su
19 Nov 2023

Neural Network Field Theories: Non-Gaussianity, Actions, and Locality
M. Demirtaş, James Halverson, Anindita Maiti, M. Schwartz, Keegan Stoner
AI4CE
06 Jul 2023

Bayesian Renormalization
D. Berman, Marc S. Klinger, A. G. Stapleton
17 May 2023

Criticality versus uniformity in deep neural networks
A. Bukva, Jurriaan de Gier, Kevin T. Grosvenor, R. Jefferson, K. Schalm, Eliot Schwander
10 Apr 2023

The edge of chaos: quantum field theory and deep neural networks
Kevin T. Grosvenor, R. Jefferson
27 Sep 2021

Entropic alternatives to initialization
Daniele Musso
16 Jul 2021

Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks
Lechao Xiao, Yasaman Bahri, Jascha Narain Sohl-Dickstein, S. Schoenholz, Jeffrey Pennington
14 Jun 2018