
Sensitivity-Based Layer Insertion for Residual and Feedforward Neural Networks

27 November 2023
Evelyn Herberg, Roland A. Herzog, Frederik Köhne, Leonie Kreis, Anton Schiela
Main text: 34 pages · Bibliography: 3 pages · Appendix: 1 page · 23 figures · 6 tables
Abstract

The training of neural networks requires tedious and often manual tuning of the network architecture. We propose a systematic method to insert new layers during the training process, which eliminates the need to fix the network size before training. Our method borrows ideas from constrained optimization and is based on first-order sensitivity information of the objective with respect to the virtual parameters that additional layers, if inserted, would introduce. We consider fully connected feedforward networks with selected activation functions as well as residual neural networks. In numerical experiments, the proposed sensitivity-based layer insertion technique exhibits an improved decay of the training loss compared to not inserting the layer. Furthermore, the computational effort is reduced compared to training with the layer present from the beginning. The code is available at \url{https://github.com/LeonieKreis/layer_insertion_sensitivity_based}.
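The core computation can be sketched in a few lines. Below is a minimal sketch, assuming a PyTorch setting with a single zero-initialized candidate residual block; the names (forward, candidate, sensitivity) and the toy data are illustrative and are not taken from the authors' repository linked above.

import torch
import torch.nn as nn

torch.manual_seed(0)

width = 16
x = torch.randn(32, width)   # toy input batch
y = torch.randn(32, 1)       # toy regression targets
loss_fn = nn.MSELoss()

# Existing network: two layers with a candidate insertion point between them.
layer1 = nn.Linear(width, width)
layer2 = nn.Linear(width, 1)

# Virtual residual block at the candidate position, zero-initialized so
# that inserting it leaves the network function (and the loss) unchanged.
candidate = nn.Linear(width, width)
nn.init.zeros_(candidate.weight)
nn.init.zeros_(candidate.bias)

def forward(inp):
    h = torch.relu(layer1(inp))
    h = h + candidate(h)     # inactive at initialization: candidate(h) == 0
    return layer2(h)

# First-order sensitivity: gradient of the loss with respect to the
# virtual parameters, evaluated at their neutral initialization.
loss = loss_fn(forward(x), y)
grads = torch.autograd.grad(loss, tuple(candidate.parameters()))
sensitivity = torch.sqrt(sum(g.pow(2).sum() for g in grads))
print(f"sensitivity of candidate layer: {sensitivity.item():.4f}")

A large sensitivity indicates that activating the candidate layer promises a first-order decrease of the training loss; comparing this value across candidate positions (or against a threshold) decides whether and where to insert a new layer.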

@article{kreis2025_2311.15995,
  title   = {SensLI: Sensitivity-Based Layer Insertion for Neural Networks},
  author  = {Leonie Kreis and Evelyn Herberg and Frederik Köhne and Anton Schiela and Roland Herzog},
  journal = {arXiv preprint arXiv:2311.15995},
  year    = {2025}
}