arXiv: 2005.00057 (latest version: v2)

CP-NAS: Child-Parent Neural Architecture Search for 1-bit CNNs

30 April 2020
Lian Zhuo
Baochang Zhang
Hanlin Chen
Linlin Yang
Chong Chen
Y. Zhu
David Doermann
Abstract

Neural architecture search (NAS) has proven to be among the best approaches for many tasks because it generates application-adaptive neural architectures, but it is still challenged by high computational cost and memory consumption. At the same time, 1-bit convolutional neural networks (CNNs) with binarized weights and activations show their potential for resource-limited embedded devices. A natural approach is therefore to use 1-bit CNNs to reduce the computation and memory cost of NAS, taking advantage of the strengths of both in a unified framework. To this end, a Child-Parent (CP) model is introduced into differentiable NAS to search for a binarized architecture (Child) under the supervision of a full-precision model (Parent). In the search stage, the Child-Parent model uses an indicator derived from the accuracies of the child and parent models to evaluate candidate operations and abandon those with less potential. In the training stage, a kernel-level CP loss is introduced to optimize the binarized network. Extensive experiments demonstrate that the proposed CP-NAS achieves accuracy comparable to traditional NAS on both the CIFAR and ImageNet datasets. With binarized weights and activations, it reaches 95.27% accuracy on CIFAR-10 and 64.3% on ImageNet, with a 30% faster search than prior art.
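
The abstract only sketches the search and training objectives, so the snippet below is a minimal, hypothetical illustration rather than the authors' implementation: an indicator that combines Child and Parent accuracies to rank candidate operations, and a kernel-level loss that pulls binarized Child kernels toward the full-precision Parent kernels. The function names, the weighting factors beta and lam, and the MSE alignment term are assumptions made for illustration only.

```python
import torch
import torch.nn.functional as F

def cp_indicator(child_acc: float, parent_acc: float, beta: float = 0.5) -> float:
    """Hypothetical Child-Parent indicator for one candidate operation.

    Rewards a high Child (1-bit) accuracy and a small gap to the
    full-precision Parent; operations with the lowest scores would be
    abandoned during the search stage.
    """
    return child_acc - beta * (parent_acc - child_acc)

def kernel_level_cp_loss(child_kernels, parent_kernels, task_loss, lam: float = 1e-4):
    """Hypothetical kernel-level CP loss for the training stage.

    Adds a penalty pulling each binarized (Child) kernel toward the
    corresponding full-precision (Parent) kernel, on top of the usual
    task loss (e.g. cross-entropy).
    """
    align = sum(F.mse_loss(c, p.detach()) for c, p in zip(child_kernels, parent_kernels))
    return task_loss + lam * align

# Example: rank two candidate operations by the indicator.
ops = {"binary_conv_3x3": cp_indicator(0.90, 0.95),
       "binary_skip":     cp_indicator(0.85, 0.95)}
worst = min(ops, key=ops.get)  # candidate to abandon first
```

The exact formulations used by CP-NAS differ from this sketch and should be taken from the original paper; the code only shows the general idea of evaluating binarized candidates under full-precision supervision.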
