'Place-cell' emergence and learning of invariant data with restricted Boltzmann machines: breaking and dynamical restoration of continuous symmetries in the weight space

Moshir Harsh, J. Tubiana, Simona Cocco, R. Monasson
30 December 2019

Papers citing "'Place-cell' emergence and learning of invariant data with restricted Boltzmann machines: breaking and dynamical restoration of continuous symmetries in the weight space"

11 of 11 citing papers shown.

1. Cascade of phase transitions in the training of Energy-based models
   Dimitrios Bachtis, Giulio Biroli, A. Decelle, Beatriz Seoane
   23 May 2024

2. Three Factors Influencing Minima in SGD
   Stanislaw Jastrzebski, Zachary Kenton, Devansh Arpit, Nicolas Ballas, Asja Fischer, Yoshua Bengio, Amos Storkey
   13 Nov 2017

3. A Bayesian Perspective on Generalization and Stochastic Gradient Descent
   Samuel L. Smith, Quoc V. Le
   17 Oct 2017

4. Mutual Information, Neural Networks and the Renormalization Group
   M. Koch-Janusz, Zohar Ringel
   20 Apr 2017

5. Phase transitions in Restricted Boltzmann Machines with generic priors
   Adriano Barra, G. Genovese, Peter Sollich, Daniele Tantari
   09 Dec 2016

6. Emergence of Compositional Representations in Restricted Boltzmann Machines
   J. Tubiana, R. Monasson
   21 Nov 2016

7. Entropy-SGD: Biasing Gradient Descent Into Wide Valleys
   Pratik Chaudhari, A. Choromańska, Stefano Soatto, Yann LeCun, Carlo Baldassi, C. Borgs, J. Chayes, Levent Sagun, R. Zecchina
   06 Nov 2016

8. On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
   N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
   15 Sep 2016

9. Nonlinear Hebbian learning as a unifying principle in receptive field formation
   Carlos S. N. Brito, W. Gerstner
   04 Jan 2016

10. An exact mapping between the Variational Renormalization Group and Deep Learning
    Pankaj Mehta, D. Schwab
    14 Oct 2014

11. Representation Learning: A Review and New Perspectives
    Yoshua Bengio, Aaron Courville, Pascal Vincent
    24 Jun 2012