Shrinkage Initialization for Smooth Learning of Neural Networks

12 April 2025
Miao Cheng
Feiyan Zhou
Hongwei Zou
Limin Wang
Abstract

The success of intelligent systems has relied heavily on the artificial learning of information, which has led to the broad application of neural learning solutions. It is well understood that the training of neural networks can be substantially improved by suitably chosen initialization, neuron layers, and activation functions. Although sequential layer-based initialization methods are available, a generalized solution for the initial stage is still desired. In this work, an improved approach to the initialization of neural learning is presented, which adopts a shrinkage approach to initialize the transformation of each layer of the network. It can be universally adapted to networks of any structure with arbitrary layers, while stable performance is attained. Furthermore, smooth learning of networks is adopted in this work, owing to its diverse influence on neural learning. Experimental results on several artificial data sets demonstrate that the proposed method produces robust results with shrinkage initialization and is competent for smooth learning of neural networks.
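The abstract does not give the paper's exact formulation, but a shrinkage-style layer initializer is commonly expressed as a convex combination of a random draw and a structured target, in the spirit of shrinkage estimators such as Ledoit-Wolf. The following minimal Python sketch illustrates that idea only; the function name `shrinkage_init`, the coefficient `lam`, and the identity-like target are assumptions for illustration, not the authors' method.

```python
import numpy as np

def shrinkage_init(fan_in, fan_out, lam=0.3, rng=None):
    """Illustrative shrinkage-style initializer (assumed form, not the paper's).

    Shrinks a random Gaussian weight matrix toward a structured target
    via a convex combination: W = (1 - lam) * W_random + lam * target.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Random part: He-style scaled Gaussian draw.
    W = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_out, fan_in))
    # Assumed shrinkage target: a rectangular identity-like matrix,
    # which keeps the initial transformation close to a projection.
    target = np.eye(fan_out, fan_in)
    # Convex combination of the random draw and the structured target.
    return (1.0 - lam) * W + lam * target

# Usage: initialize a 256 -> 128 layer.
W0 = shrinkage_init(256, 128, lam=0.3)
print(W0.shape)  # (128, 256)
```

Larger `lam` pulls the layer closer to the deterministic target (more stable, less expressive at initialization); `lam = 0` recovers plain random initialization.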

@article{cheng2025_2504.09107,
  title={Shrinkage Initialization for Smooth Learning of Neural Networks},
  author={Miao Cheng and Feiyan Zhou and Hongwei Zou and Limin Wang},
  journal={arXiv preprint arXiv:2504.09107},
  year={2025}
}