Most Activation Functions Can Win the Lottery Without Excessive Depth

4 May 2022
R. Burkholz
Abstract

The strong lottery ticket hypothesis has highlighted the potential for training deep neural networks by pruning, which has inspired interesting practical and theoretical insights into how neural networks can represent functions. For networks with ReLU activation functions, it has been proven that a target network with depth $L$ can be approximated by the subnetwork of a randomly initialized neural network that has double the target's depth $2L$ and is wider by a logarithmic factor. We show that a depth $L+1$ network is sufficient. This result indicates that we can expect to find lottery tickets at realistic, commonly used depths while only requiring logarithmic overparametrization. Our novel construction approach applies to a large class of activation functions and is not limited to ReLUs.
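The logarithmic overparametrization mentioned in the abstract typically arises, in this line of work, from a subset-sum argument: each target weight is approximated by pruning, i.e., selecting a subset of, sufficiently many random weights. The sketch below illustrates that reduction under our own assumptions (brute-force search, weights drawn uniformly from $[-1, 1]$, an arbitrary sample count); it is not the paper's actual construction.

```python
import numpy as np

# Subset-sum view of strong lottery tickets (illustrative sketch):
# approximate one target weight by the sum of a pruned subset of
# random weights. Results in this literature suggest O(log(1/eps))
# uniform samples suffice with high probability.

rng = np.random.default_rng(0)

def best_subset_sum(candidates, target):
    """Brute-force the subset of `candidates` whose sum is closest to `target`."""
    best_mask, best_err = 0, abs(target)  # start from the empty subset
    for mask in range(1, 1 << len(candidates)):
        s = sum(c for i, c in enumerate(candidates) if (mask >> i) & 1)
        if abs(s - target) < best_err:
            best_mask, best_err = mask, abs(s - target)
    return best_mask, best_err

target_weight = 0.73                          # a weight of the target network
random_weights = rng.uniform(-1, 1, size=12)  # overparametrized random init
mask, err = best_subset_sum(random_weights, target_weight)
print(f"kept {bin(mask).count('1')} of 12 random weights, error {err:.5f}")
```

Applying this per target weight conveys why only a logarithmic width factor is needed; reducing the required depth from $2L$ to $L+1$ across a broad class of activation functions is the paper's contribution and rests on a different construction.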
