Transferrable Surrogates in Expressive Neural Architecture Search Spaces

¹University of Edinburgh, ²Charles University,
³The Czech Academy of Sciences Institute of Computer Science, ⁴University of Siegen

*Indicates Equal Contribution

Abstract

Neural architecture search (NAS) faces a challenge in balancing the exploration of expressive, broad search spaces that enable architectural innovation with the need for efficient evaluation of architectures in order to search such spaces effectively. We investigate surrogate model training for improving search in highly expressive NAS search spaces based on context-free grammars. We show that i) surrogate models, trained either on zero-cost-proxy metrics and neural graph features (GRAF) or by fine-tuning an off-the-shelf LM, have high predictive power for the performance of architectures both within and across datasets; ii) these surrogates can be used to filter out bad architectures when searching on novel datasets, thereby significantly speeding up search and achieving better final performance; and iii) the surrogates can be further used directly as the search objective for huge speed-ups.

TL;DR: We accelerate NAS in expressive search spaces by using transferrable performance prediction methods.
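
To make the filtering idea concrete, below is a minimal sketch of a surrogate-filtered search loop, under stated assumptions rather than the paper's exact procedure: sample_architecture, surrogate_score and train_and_evaluate are hypothetical placeholders standing in for the grammar-based architecture sampler, a trained performance predictor (e.g. a GRAF-based model or a fine-tuned BERT), and full training on the target dataset.

def surrogate_filtered_search(
    sample_architecture,   # hypothetical: draws an architecture from the grammar-based space
    surrogate_score,       # hypothetical: cheap predicted performance from a trained surrogate
    train_and_evaluate,    # hypothetical: expensive full training, returns validation accuracy
    iterations=100,
    candidates_per_iter=50,
    evaluate_top_k=5,
):
    """Spend the expensive training budget only on candidates the surrogate ranks highly."""
    best_arch, best_acc = None, float("-inf")
    for _ in range(iterations):
        # Sample a pool of candidates and rank them by the cheap surrogate prediction.
        pool = [sample_architecture() for _ in range(candidates_per_iter)]
        ranked = sorted(pool, key=surrogate_score, reverse=True)
        # Only the top-ranked candidates are fully trained and evaluated.
        for arch in ranked[:evaluate_top_k]:
            acc = train_and_evaluate(arch)
            if acc > best_acc:
                best_arch, best_acc = arch, acc
    return best_arch, best_acc

In this sketch, ranking candidates purely by surrogate_score throughout the search and training only the final selection corresponds to using the surrogate directly as the search objective, the variant that gives the largest speed-ups.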

Figure: Search results using different surrogates. We plot the validation accuracy of the best model found at each search iteration, with the mean and standard error across three seeds. The BERT-based surrogate (red) performs best, showing consistent improvements across most datasets.

Cite us!

@misc{qin2025transferrablesurrogatesexpressiveneural,
      title={Transferrable Surrogates in Expressive Neural Architecture Search Spaces},
      author={Shiwen Qin and Gabriela Kadlecová and Martin Pilát and Shay B. Cohen and Roman Neruda and Elliot J. Crowley and Jovita Lukasik and Linus Ericsson},
      year={2025},
      eprint={2504.12971},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2504.12971},
}