Understanding Weight Normalized Deep Neural Networks with Rectified Linear Units

3 October 2018
Yixi Xu
Tianlin Li
Abstract

This paper presents a general framework for norm-based capacity control of $L_{p,q}$ weight normalized deep neural networks. We establish an upper bound on the Rademacher complexity of this family. With $L_{p,q}$ normalization, where $q \le p^*$ and $1/p + 1/p^* = 1$, we discuss properties of a width-independent capacity control, which depends on the depth only through a square-root term. We further analyze the approximation properties of $L_{p,q}$ weight normalized deep neural networks. In particular, for an $L_{1,\infty}$ weight normalized network, the approximation error can be controlled by the $L_1$ norm of the output layer, and the corresponding generalization error depends on the architecture only through the square root of the depth.
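To make the $L_{p,q}$ constraint concrete, the sketch below is a rough illustration only: the exact convention (whether the $L_p$ norm is taken over rows or columns, and the choice of normalization constant `c`) is an assumption here, not taken from the paper. It computes an $L_{p,q}$ matrix norm by taking the $L_p$ norm of each unit's incoming weights and then the $L_q$ norm across units, and rescales a weight matrix so this norm is at most `c`; with `p=1, q=inf` it corresponds to the $L_{1,\infty}$ case discussed in the abstract.

```python
# Minimal sketch (assumed convention, not the paper's exact definition):
# L_{p,q} norm = L_q norm across units of the L_p norms of each unit's
# incoming weights (rows of W, by assumption).
import numpy as np

def lpq_norm(W, p=1.0, q=np.inf):
    # L_p norm of each row of W
    row_norms = np.sum(np.abs(W) ** p, axis=1) ** (1.0 / p)
    if np.isinf(q):
        return row_norms.max()
    return np.sum(row_norms ** q) ** (1.0 / q)

def lpq_normalize(W, p=1.0, q=np.inf, c=1.0):
    # Rescale W so that its L_{p,q} norm is at most c (hypothetical constant).
    norm = lpq_norm(W, p, q)
    return W if norm <= c else W * (c / norm)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.normal(size=(64, 128))            # one hidden layer's weight matrix
    W_hat = lpq_normalize(W, p=1.0, q=np.inf)  # L_{1,inf} normalization
    print(lpq_norm(W_hat, p=1.0, q=np.inf))    # <= 1.0
```

Under such a per-layer constraint, the abstract's point is that the resulting capacity bound does not grow with layer width and grows only like the square root of the depth.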
