Revisiting Training-free NAS Metrics: An Efficient Training-based Method
arXiv:2211.08666 · 16 November 2022
Taojiannan Yang, Linjie Yang, Xiaojie Jin, Chong Chen

Papers citing "Revisiting Training-free NAS Metrics: An Efficient Training-based Method" (5 papers shown)
  • Anytime Neural Architecture Search on Tabular Data
    Naili Xing, Shaofeng Cai, Zhaojing Luo, Bengchin Ooi, Jian Pei (15 Mar 2024)
  • Partial Fine-Tuning: A Successor to Full Fine-Tuning for Vision Transformers
    Peng Ye, Yongqi Huang, Chongjun Tu, Minglei Li, Tao Chen, Tong He, Wanli Ouyang (25 Dec 2023)
  • Efficient Architecture Search via Bi-level Data Pruning
    Chongjun Tu, Peng Ye, Weihao Lin, Hancheng Ye, Chong Yu, Tao Chen, Baopu Li, Wanli Ouyang (21 Dec 2023)
  • Connection Sensitivity Matters for Training-free DARTS: From Architecture-Level Scoring to Operation-Level Sensitivity Analysis
    Miao Zhang, Wei Huang, Li Wang (22 Jun 2021)
  • Neural Architecture Search with Reinforcement Learning
    Barret Zoph, Quoc V. Le (05 Nov 2016)