
Performing Neural Architecture Search Without Gradients

Pavel Rumiantsev (McGill University); Mark Coates (McGill University)

06 Jun 2023

One of the most crucial concerns in Neural Architecture Search (NAS) is the amount of resources and time required for the search. In this paper, we address the problem of training-free (or zero-shot) search. We propose a generalized architecture search framework called enhanced training-free neural architecture search (ETE-NAS). It ranks architectures according to a user-defined combination of zero-shot ranking functions. We instantiate it with metrics based on the neural network Gaussian process (NNGP) kernel and ReLU region division. This allows a much faster search on a single GPU than other training-free approaches, without sacrificing accuracy or inducing instability. Code is available at https://github.com/Rufaim/ETENAS
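To make the ranking idea concrete, below is a minimal sketch (not the authors' implementation, which is in the linked repository) of scoring candidate architectures with training-free metrics and mixing the per-metric ranks with user-defined weights. The metric shown, a count of distinct ReLU activation patterns over a random input batch, is only a simplified stand-in for the paper's ReLU-region and NNGP-based scores; the function names, the toy MLP candidates, and the weights are illustrative assumptions.

import numpy as np

def relu_pattern_count(layers, x):
    # Count distinct ReLU activation patterns produced by a batch of inputs.
    # A rough stand-in for a linear-region style metric; higher is treated as better.
    patterns = set()
    for sample in x:
        h = sample
        bits = []
        for W, b in layers:
            pre = h @ W + b
            bits.append(tuple((pre > 0).astype(np.int8)))
            h = np.maximum(pre, 0.0)
        patterns.add(tuple(bits))
    return len(patterns)

def combined_rank(scores, metric_weights):
    # Turn per-metric scores into ranks and mix them with user-defined weights.
    # scores: dict metric_name -> array with one score per candidate architecture.
    n = len(next(iter(scores.values())))
    total = np.zeros(n)
    for name, w in metric_weights.items():
        # argsort of argsort yields each candidate's rank under this metric
        ranks = np.argsort(np.argsort(scores[name]))
        total += w * ranks
    return total  # the candidate with the highest combined rank is selected

# Toy usage: two random two-layer MLP "architectures" scored on the same batch.
rng = np.random.default_rng(0)
x = rng.normal(size=(64, 8))
candidates = []
for width in (16, 64):
    W1 = rng.normal(size=(8, width)); b1 = np.zeros(width)
    W2 = rng.normal(size=(width, 4)); b2 = np.zeros(4)
    candidates.append([(W1, b1), (W2, b2)])

scores = {"relu_regions": np.array([relu_pattern_count(c, x) for c in candidates])}
print(combined_rank(scores, {"relu_regions": 1.0}))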
