
Piecewise Linear Functions Representable with Infinite Width Shallow ReLU Neural Networks

Sarah McCarty
25 July 2023 · arXiv:2307.14373

Papers citing "Piecewise Linear Functions Representable with Infinite Width Shallow ReLU Neural Networks"

1 / 1 papers shown

Hidden Unit Specialization in Layered Neural Networks: ReLU vs. Sigmoidal Activation
Elisa Oostwal, Michiel Straat, Michael Biehl
MLT
16 Oct 2019