
Growing Tiny Networks: Spotting Expressivity Bottlenecks and Fixing Them Optimally

30 May 2024
Manon Verbockhaven
Sylvain Chevallier
Guillaume Charpiat
ArXiv (abs) | PDF | HTML

Papers citing "Growing Tiny Networks: Spotting Expressivity Bottlenecks and Fixing Them Optimally"

3 / 3 papers shown

1. "Growth strategies for arbitrary DAG neural architectures"
   Stella Douka, Manon Verbockhaven, Théo Rudkiewicz, Stéphane Rivaud, François P. Landes, Sylvain Chevallier, Guillaume Charpiat
   17 Feb 2025

2. "ANaGRAM: A Natural Gradient Relative to Adapted Model for efficient PINNs learning"
   Nilo Schwencke, Cyril Furtlehner
   14 Dec 2024

3. "SensLI: Sensitivity-Based Layer Insertion for Neural Networks"
   Evelyn Herberg, Roland A. Herzog, Frederik Köhne, Leonie Kreis, Anton Schiela
   27 Nov 2023