ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Equivalence between algorithmic instability and transition to replica symmetry breaking in perceptron learning systems
v2 (latest)

26 November 2021
Yang Zhao, Junbin Qiu, Mingshan Xie, Haiping Huang
ArXiv (abs) · PDF · HTML

Papers citing "Equivalence between algorithmic instability and transition to replica symmetry breaking in perceptron learning systems"

8 / 8 papers shown
Learning through atypical "phase transitions" in overparameterized neural networks
Carlo Baldassi, Clarissa Lauditi, Enrico M. Malatesta, R. Pacelli, Gabriele Perugini, R. Zecchina
01 Oct 2021
Proof of the Contiguity Conjecture and Lognormal Limit for the Symmetric Perceptron
Emmanuel Abbe, Shuangping Li, Allan Sly
25 Feb 2021
Statistical physics of unsupervised learning with prior knowledge in neural networks
Tianqi Hou, Haiping Huang
06 Nov 2019
Minimal model of permutation symmetry in unsupervised learning
Tianqi Hou, K. Y. Michael Wong, Haiping Huang
30 Apr 2019
On the role of synaptic stochasticity in training low-precision neural networks
Carlo Baldassi, Federica Gerace, H. Kappen, Carlo Lucibello, Luca Saglietti, Enzo Tartaglione, R. Zecchina
26 Oct 2017
Unreasonable Effectiveness of Learning Neural Networks: From Accessible States and Robust Ensembles to Basic Algorithmic Schemes
Carlo Baldassi, C. Borgs, J. Chayes, Alessandro Ingrosso, Carlo Lucibello, Luca Saglietti, R. Zecchina
20 May 2016
Subdominant Dense Clusters Allow for Simple Learning and High Computational Performance in Neural Networks with Discrete Synapses
Carlo Baldassi, Alessandro Ingrosso, Carlo Lucibello, Luca Saglietti, R. Zecchina
18 Sep 2015
Origin of the computational hardness for learning with binary synapses
Haiping Huang, Y. Kabashima
08 Aug 2014