Using UNet and PSPNet to explore the reusability principle of CNN parameters

8 August 2020
Wei Wang
    SSL, SSeg
Abstract

How to reduce the required training dataset size is a hot topic in the deep learning community. One straightforward approach is to reuse pre-trained parameters. Previous work such as deep transfer learning reuses the model parameters trained for a first task as the starting point for a second task, and semi-supervised learning trains on a combination of labeled and unlabeled data. However, the fundamental reason for the success of these methods is unclear. In this paper, the reusability of the parameters in each layer of a deep convolutional neural network is experimentally quantified by using a network to perform a segmentation task and an auto-encoder task. This paper shows that network parameters can be reused for two reasons: first, the network features are general; second, there is little difference between the pre-trained parameters and the ideal network parameters. Through parameter replacement and comparison, we demonstrate that reusability differs between BN (Batch Normalization) [7] layers and convolutional layers, and make the following observations: (1) the running mean and running variance play a more important role than the weight and bias in BN layers; (2) the weight and bias can be reused in BN layers; (3) the network is very sensitive to the weight of a convolutional layer; (4) the bias in convolutional layers is not sensitive and can be reused directly.
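As a concrete illustration of the parameter-replacement probe described in the abstract, the sketch below (not the authors' code; the helper name `replace_params` and the group labels are assumptions for illustration) copies selected parameter groups from a pre-trained "source" network into a freshly initialized "target" network of identical architecture, so that the patched network can then be evaluated on the downstream task.

```python
# Minimal sketch of a parameter-replacement probe, assuming PyTorch and
# two networks (e.g. two UNets) with identical module structure.
import copy
import torch
import torch.nn as nn

@torch.no_grad()
def replace_params(target: nn.Module, source: nn.Module, groups: set) -> nn.Module:
    """Return a copy of `target` whose selected parameter groups are
    transplanted from `source`. Supported groups:
      'bn_stats'   - BN running mean / running variance
      'bn_affine'  - BN weight and bias
      'conv_weight' - convolutional weights
      'conv_bias'   - convolutional biases
    Assumes `target` and `source` share the same module structure."""
    patched = copy.deepcopy(target)
    for (_, dst), (_, src) in zip(patched.named_modules(), source.named_modules()):
        if isinstance(dst, nn.BatchNorm2d):
            if "bn_stats" in groups:
                dst.running_mean.copy_(src.running_mean)
                dst.running_var.copy_(src.running_var)
            if "bn_affine" in groups:
                dst.weight.copy_(src.weight)
                dst.bias.copy_(src.bias)
        elif isinstance(dst, nn.Conv2d):
            if "conv_weight" in groups:
                dst.weight.copy_(src.weight)
            if "conv_bias" in groups and dst.bias is not None:
                dst.bias.copy_(src.bias)
    return patched
```

For example, `replace_params(random_unet, pretrained_unet, {"bn_stats"})` would probe observation (1) by transplanting only the BN running statistics; comparing the patched network's segmentation score against the randomly initialized baseline quantifies how much those statistics alone contribute.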
