ResearchTrend.AI

arXiv:1907.07578
Properties of the geometry of solutions and capacity of multi-layer neural networks with Rectified Linear Units activations

Carlo Baldassi, Enrico M. Malatesta, R. Zecchina
17 July 2019 · MLT

Papers citing "Properties of the geometry of solutions and capacity of multi-layer neural networks with Rectified Linear Units activations"

7 papers shown
High-dimensional manifold of solutions in neural networks: insights from statistical physics
Enrico M. Malatesta
20 Feb 2025

Exact full-RSB SAT/UNSAT transition in infinitely wide two-layer neural networks
B. Annesi, Enrico M. Malatesta, Francesco Zamponi
09 Oct 2024

Activation function dependence of the storage capacity of treelike neural networks
Jacob A. Zavatone-Veth, Cengiz Pehlevan
21 Jul 2020

Shaping the learning landscape in neural networks around wide flat minima
Carlo Baldassi, Fabrizio Pittorino, R. Zecchina
20 May 2019 · MLT

Unreasonable Effectiveness of Learning Neural Networks: From Accessible States and Robust Ensembles to Basic Algorithmic Schemes
Carlo Baldassi, C. Borgs, J. Chayes, Alessandro Ingrosso, Carlo Lucibello, Luca Saglietti, R. Zecchina
20 May 2016

Subdominant Dense Clusters Allow for Simple Learning and High Computational Performance in Neural Networks with Discrete Synapses
Carlo Baldassi, Alessandro Ingrosso, Carlo Lucibello, Luca Saglietti, R. Zecchina
18 Sep 2015

Origin of the computational hardness for learning with binary synapses
Haiping Huang, Y. Kabashima
08 Aug 2014