
Proof of the Contiguity Conjecture and Lognormal Limit for the Symmetric Perceptron
Emmanuel Abbe, Shuangping Li, Allan Sly
arXiv:2102.13069, 25 February 2021

Papers citing "Proof of the Contiguity Conjecture and Lognormal Limit for the Symmetric Perceptron"

9 / 9 papers shown
High-dimensional manifold of solutions in neural networks: insights from statistical physics
  Enrico M. Malatesta (20 Feb 2025)
Exact full-RSB SAT/UNSAT transition in infinitely wide two-layer neural networks
  B. Annesi, Enrico M. Malatesta, Francesco Zamponi (09 Oct 2024)
Clustering of solutions in the symmetric binary perceptron
  Carlo Baldassi, R. D. Vecchia, Carlo Lucibello, R. Zecchina (15 Nov 2019)
Unreasonable Effectiveness of Learning Neural Networks: From Accessible States and Robust Ensembles to Basic Algorithmic Schemes
  Carlo Baldassi, C. Borgs, J. Chayes, Alessandro Ingrosso, Carlo Lucibello, Luca Saglietti, R. Zecchina (20 May 2016)
Local entropy as a measure for sampling solutions in Constraint Satisfaction Problems
  Carlo Baldassi, Alessandro Ingrosso, Carlo Lucibello, Luca Saglietti, R. Zecchina (18 Nov 2015)
Subdominant Dense Clusters Allow for Simple Learning and High Computational Performance in Neural Networks with Discrete Synapses
  Carlo Baldassi, Alessandro Ingrosso, Carlo Lucibello, Luca Saglietti, R. Zecchina (18 Sep 2015)
Origin of the computational hardness for learning with binary synapses
  Haiping Huang, Y. Kabashima (08 Aug 2014)
Discrete perceptrons
  M. Stojnic (17 Jun 2013)
Entropy landscape of solutions in the binary perceptron problem
  Haiping Huang, K. Y. Michael Wong, Y. Kabashima (10 Apr 2013)