ResearchTrend.AI

arXiv:2104.11522
Conditional super-network weights

23 April 2021
Kevin Laube
A. Zell
Abstract

Modern Neural Architecture Search (NAS) methods have repeatedly broken state-of-the-art results in several disciplines. The super-network, a central component of many such methods, enables quick estimates of accuracy or loss statistics for any architecture in the search space. It incorporates the network weights of all candidate architectures and can thus approximate any specific one by applying the respective operations. However, this design ignores potential dependencies between consecutive operations. We extend super-networks with conditional weights that depend on combinations of choices and analyze their effect. Experiments in NAS-Bench-201 and NAS-Bench-Macro-based search spaces show improvements in architecture selection, and that the resource overhead is nearly negligible for sequential network designs.
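To make the core idea concrete, the sketch below contrasts a standard super-network layer, which stores one weight tensor per candidate operation, with a conditional variant that stores a weight tensor per (previous choice, current choice) pair, so each layer can specialize to the operation chosen before it. This is a minimal illustrative sketch, not the authors' implementation: the operation names, the toy linear+ReLU "operations", and the 4-dimensional feature size are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
ops = ["conv3", "conv5", "skip"]  # hypothetical candidate operations

# Standard super-network: one weight tensor per candidate op in a layer.
standard_weights = {op: rng.standard_normal((4, 4)) for op in ops}

# Conditional super-network: one weight tensor per combination of
# (previous op, current op), capturing dependencies between
# consecutive operations that the standard design ignores.
conditional_weights = {
    (prev, cur): rng.standard_normal((4, 4)) for prev in ops for cur in ops
}

def forward(x, path):
    """Evaluate one candidate architecture (a sequence of op choices),
    indexing the conditional weights with consecutive choice pairs."""
    prev = None
    for cur in path:
        # The first layer has no predecessor, so it uses standard weights.
        w = conditional_weights[(prev, cur)] if prev else standard_weights[cur]
        x = np.maximum(w @ x, 0.0)  # hypothetical op: linear + ReLU
        prev = cur
    return x

y = forward(rng.standard_normal(4), ["conv3", "skip"])
```

For a sequential design with k candidate ops per layer, the conditional variant stores k^2 weight tensors per layer instead of k, which is why the overhead stays modest when k is small.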

View on arXiv