Modern Neural Architecture Search methods have repeatedly advanced the state of the art across several disciplines. The super-network, a central component of many such methods, enables quick estimates of accuracy or loss statistics for any architecture in the search space. It incorporates the network weights of all candidate architectures and can thus approximate any specific one by applying the respective operations. However, this design ignores potential dependencies between consecutive operations. We extend super-networks with conditional weights that depend on combinations of choices and analyze their effect. Experiments in NAS-Bench-201 and NAS-Bench-Macro-based search spaces show improvements in architecture selection, while the resource overhead is nearly negligible for sequential network designs.
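To illustrate the conditional-weight idea described above, the following is a minimal PyTorch sketch (not the paper's implementation; the class name, operation set, and key format are illustrative assumptions). Instead of keeping one weight set per candidate operation, the layer keeps a separate weight set for every combination of the previous layer's choice and the current choice, so that a sampled architecture selects weights conditioned on both.

```python
import torch
import torch.nn as nn

OPS = ["conv3x3", "conv1x1"]  # illustrative candidate operations

class ConditionalChoiceLayer(nn.Module):
    """Toy super-network layer whose candidate weights are keyed by the
    combination (previous choice, current choice) rather than by the
    current choice alone. Hypothetical sketch, not the paper's code."""

    def __init__(self, channels: int):
        super().__init__()
        kernel = {"conv3x3": 3, "conv1x1": 1}
        # One weight set per combination of consecutive choices.
        self.branches = nn.ModuleDict({
            f"{prev}|{cur}": nn.Conv2d(channels, channels,
                                       kernel[cur], padding=kernel[cur] // 2)
            for prev in OPS for cur in OPS
        })

    def forward(self, x, prev_choice: str, choice: str):
        # The sampled architecture picks the weights conditioned on both choices.
        return self.branches[f"{prev_choice}|{choice}"](x)

# Evaluate one candidate path through the layer.
layer = ConditionalChoiceLayer(channels=8)
x = torch.randn(1, 8, 16, 16)
y = layer(x, prev_choice="conv1x1", choice="conv3x3")
print(y.shape)  # torch.Size([1, 8, 16, 16])
```

For a sequential (chain-structured) search space, each layer has exactly one predecessor, which is why the additional parameter cost of such conditioning stays small, as the abstract notes.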