ResearchTrend.AI

Width is Less Important than Depth in ReLU Neural Networks

Gal Vardi, Gilad Yehudai, Ohad Shamir
8 February 2022 (arXiv:2202.03841)

Papers citing "Width is Less Important than Depth in ReLU Neural Networks" (6 of 6 papers shown):
  • The Role of Depth, Width, and Tree Size in Expressiveness of Deep Forest — Shen-Huan Lyu, Jin-Hui Wu, Qin-Cheng Zheng, Baoliu Ye (6 Jul 2024)
  • Data Topology-Dependent Upper Bounds of Neural Network Widths — Sangmin Lee, Jong Chul Ye (25 May 2023)
  • Multi-Path Transformer is Better: A Case Study on Neural Machine Translation — Ye Lin, Shuhan Zhou, Yanyang Li, Anxiang Ma, Tong Xiao, Jingbo Zhu (10 May 2023)
  • Exponential Separations in Symmetric Neural Networks — Aaron Zweig, Joan Bruna (2 Jun 2022)
  • Size and Depth Separation in Approximating Benign Functions with Neural Networks — Gal Vardi, Daniel Reichman, T. Pitassi, Ohad Shamir (30 Jan 2021)
  • Benefits of depth in neural networks — Matus Telgarsky (14 Feb 2016)