Multi-trial Neural Architecture Search with Lottery Tickets

In this paper, we propose MENAS, an efficient multi-trial evolution-based NAS method that requires less human intervention. Specifically, we propose an enlarged search space (MobileNet3-MT) for ImageNet-1K and improve search efficiency in two ways. First, MENAS jointly explores architectures and their optimal pruned candidates (Lottery Tickets), gradually slimming the average model in the population: each model is trained with early stopping and replaced by its Lottery Ticket, rather than first searching for a cumbersome network and then pruning it. Second, we introduce individual weight sharing, dedicated to multi-trial NAS, which amortizes training costs by sharing weights between parent and child networks. Compared with weight sharing in a supernet, individual weight sharing attains more reliable rank consistency and is easy to implement, since it avoids sophisticated supernet training. Moreover, to keep the evolutionary process from becoming trapped in small models, we preserve a small ratio of the largest models when forming parent populations, which proves beneficial for model performance. Extensive experimental results demonstrate the superiority of MENAS: on ImageNet-1K, MENAS achieves 80.5% top-1 accuracy without knowledge distillation or larger image resolution. Code and models will be available.
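The search loop described in the abstract might be sketched roughly as follows. This is a toy illustration only: the dictionary-based "model", the magnitude-pruning stand-in for Lottery Tickets, the fitness proxy, and all parameter values are assumptions for exposition, not the paper's actual implementation.

```python
import random

random.seed(0)

def train_early_stop(model, steps=3):
    # Pretend "training with early stopping": a few cheap weight updates,
    # then a proxy fitness (mean absolute weight stands in for accuracy).
    for _ in range(steps):
        model["weights"] = [w + random.uniform(0, 0.1) for w in model["weights"]]
    model["fitness"] = sum(abs(w) for w in model["weights"]) / len(model["weights"])
    return model

def lottery_ticket(model, keep=0.8):
    # Replace the trained model by a pruned sub-network: keep only the
    # largest-magnitude weights (a crude stand-in for Lottery Ticket pruning),
    # so the average model in the population gradually slims down.
    k = max(1, int(keep * len(model["weights"])))
    kept = sorted(model["weights"], key=abs, reverse=True)[:k]
    return {"weights": kept, "fitness": model.get("fitness", 0.0)}

def mutate_with_weight_sharing(parent):
    # Individual weight sharing: the child inherits the parent's weights
    # directly; only the mutated part is re-initialized.
    child = {"weights": list(parent["weights"]), "fitness": 0.0}
    i = random.randrange(len(child["weights"]))
    child["weights"][i] = random.uniform(-1, 1)  # re-init the mutated "layer"
    return child

def evolve(pop_size=8, generations=5, large_ratio=0.25):
    population = [{"weights": [random.uniform(-1, 1) for _ in range(10)],
                   "fitness": 0.0} for _ in range(pop_size)]
    for _ in range(generations):
        # Train each candidate briefly, then swap it for its pruned ticket.
        population = [lottery_ticket(train_early_stop(m)) for m in population]
        # Parents: mostly the fittest models, plus a small ratio of the
        # largest ones so the search does not collapse into tiny models.
        by_fitness = sorted(population, key=lambda m: m["fitness"], reverse=True)
        by_size = sorted(population, key=lambda m: len(m["weights"]), reverse=True)
        n_large = max(1, int(large_ratio * pop_size // 2))
        parents = by_fitness[:pop_size // 2 - n_large] + by_size[:n_large]
        population = parents + [mutate_with_weight_sharing(random.choice(parents))
                                for _ in range(pop_size - len(parents))]
    return max(population, key=lambda m: m["fitness"])

best = evolve()
print(len(best["weights"]), round(best["fitness"], 3))
```

The design choice mirrored here is that pruning happens inside the search loop (each trained candidate is immediately replaced by its pruned sub-network) instead of as a separate post-search stage, and that children reuse parent weights rather than a shared supernet.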