Two-stage architectural fine-tuning with neural architecture search using early-stopping in image classification

17 February 2022
Youngkee Kim, Won Joon Yun, Youn Kyu Lee, Soyi Jung, Joongheon Kim
Abstract

In many deep neural network (DNN) applications, the difficulty of gathering high-quality data in industrial settings hinders the practical use of DNNs. Transfer learning addresses this by leveraging the pretrained knowledge of DNNs trained on large-scale datasets. Building on this, this paper proposes two-stage architectural fine-tuning, inspired by neural architecture search (NAS). One of the main ideas is mutation, which reduces the search cost by exploiting given architectural information. In addition, early-stopping is adopted to cut NAS costs by terminating the search process ahead of schedule. Experimental results verify that the proposed method reduces computational costs by 32.4% and search costs by 22.3%.
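The abstract's two core ideas, mutating a known architecture rather than sampling from scratch and stopping the search once improvement stalls, can be illustrated in a few lines. The following is a minimal Python sketch under stated assumptions, not the authors' implementation: the seed architecture, the mutation rule, the proxy evaluation function, and the patience threshold are all hypothetical placeholders.

```python
import random

# Hypothetical search space: an architecture is a list of layer widths.
# In the paper's setting, the seed would come from a pretrained backbone;
# here it is an illustrative assumption.
SEED_ARCH = [64, 128, 256]
CANDIDATE_WIDTHS = [32, 64, 128, 256, 512]


def mutate(arch):
    """Perturb one layer of a known-good architecture instead of
    sampling a fresh one, which shrinks the effective search space."""
    child = list(arch)
    i = random.randrange(len(child))
    child[i] = random.choice(CANDIDATE_WIDTHS)
    return child


def evaluate(arch):
    """Stand-in proxy score for a short fine-tune plus validation pass.
    A real run would briefly train the candidate on the target data."""
    return 1.0 - sum(abs(w - 128) for w in arch) / 1000.0 + random.uniform(-0.01, 0.01)


def search(budget=50, patience=5):
    best_arch, best_score = SEED_ARCH, evaluate(SEED_ARCH)
    stale = 0
    for step in range(budget):
        candidate = mutate(best_arch)
        score = evaluate(candidate)
        if score > best_score:
            best_arch, best_score, stale = candidate, score, 0
        else:
            stale += 1
        # Early stopping: abandon the search once progress stalls,
        # trading a small accuracy risk for a large cost reduction.
        if stale >= patience:
            print(f"early stop at step {step}")
            break
    return best_arch, best_score


if __name__ == "__main__":
    arch, score = search()
    print(arch, round(score, 3))
```

The design point this sketch captures is the cost trade-off named in the abstract: mutation reuses given architectural information so each step explores a small neighborhood of a good solution, and the patience counter terminates the loop early instead of exhausting the full search budget.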
