Width is Less Important than Depth in ReLU Neural Networks
arXiv: 2202.03841
8 February 2022
Gal Vardi, Gilad Yehudai, Ohad Shamir
Papers citing "Width is Less Important than Depth in ReLU Neural Networks" (6 papers):
- The Role of Depth, Width, and Tree Size in Expressiveness of Deep Forest, by Shen-Huan Lyu, Jin-Hui Wu, Qin-Cheng Zheng, Baoliu Ye (06 Jul 2024)
- Data Topology-Dependent Upper Bounds of Neural Network Widths, by Sangmin Lee, Jong Chul Ye (25 May 2023)
- Multi-Path Transformer is Better: A Case Study on Neural Machine Translation, by Ye Lin, Shuhan Zhou, Yanyang Li, Anxiang Ma, Tong Xiao, Jingbo Zhu (10 May 2023)
- Exponential Separations in Symmetric Neural Networks, by Aaron Zweig, Joan Bruna (02 Jun 2022)
- Size and Depth Separation in Approximating Benign Functions with Neural Networks, by Gal Vardi, Daniel Reichman, T. Pitassi, Ohad Shamir (30 Jan 2021)
- Benefits of depth in neural networks, by Matus Telgarsky (14 Feb 2016)