ResearchTrend.AI
Neural Network Layer Algebra: A Framework to Measure Capacity and Compression in Deep Learning

2 July 2021
Alberto Badías
A. Banerjee

Papers citing "Neural Network Layer Algebra: A Framework to Measure Capacity and Compression in Deep Learning"

2 / 2 papers shown
  • "A Distance Correlation-Based Approach to Characterize the Effectiveness of Recurrent Neural Networks for Time Series Forecasting" by Christopher Salazar and A. Banerjee (AI4TS), 28 Jul 2023
  • "Size and Depth Separation in Approximating Benign Functions with Neural Networks" by Gal Vardi, Daniel Reichman, T. Pitassi, and Ohad Shamir, 30 Jan 2021