Summary
NAS is a method that generalizes to any NN type, automating the creation of new and advanced NNs without the need for manual architecture design. As you may have guessed, most NAS work targets image-based NNs; the EfficientNet model family exemplifies the impact NAS has had on this field. This is because the wide variety of available CNN components makes a CNN more complicated to design than a simple MLP. For sequential or time-series data, there are not many variations of RNN cells, so the bulk of NAS work for RNNs focuses on designing a custom recurrent cell. More work could be done to accommodate transformers, as they are the current state of the art and can be adapted to a variety of data modalities.
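To make the core idea concrete, the following is a minimal sketch (not from this chapter) of the simplest NAS strategy: random search over a small, hand-defined CNN search space. The layer options in SEARCH_SPACE and the evaluate stub are illustrative assumptions; in a real system, evaluate would build, train, and validate the candidate model:

```python
import random

# Toy CNN search space: each candidate architecture is a dict of choices.
# The specific options here are illustrative, not from any particular paper.
SEARCH_SPACE = {
    "num_blocks": [2, 3, 4],
    "filters": [16, 32, 64],
    "kernel_size": [3, 5],
    "activation": ["relu", "swish"],
}


def sample_architecture(rng: random.Random) -> dict:
    """Randomly sample one candidate architecture from the search space."""
    return {name: rng.choice(options) for name, options in SEARCH_SPACE.items()}


def evaluate(arch: dict, rng: random.Random) -> float:
    """Stand-in for the expensive step: build the model described by
    `arch`, train it, and return validation accuracy.
    Here a random score is returned purely so the sketch runs."""
    return rng.random()


rng = random.Random(42)
best_arch, best_score = None, float("-inf")
for trial in range(20):  # search budget: 20 candidate architectures
    arch = sample_architecture(rng)
    score = evaluate(arch, rng)
    if score > best_score:
        best_arch, best_score = arch, score

print(f"Best architecture after 20 trials: {best_arch} (score={best_score:.3f})")
```

Real NAS systems replace the random sampler with a smarter search strategy (reinforcement learning, evolutionary algorithms, or gradient-based methods) and replace the stub with actual training, but the loop structure, namely sample, evaluate, and keep the best, is the same.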
NAS is mainly adopted by researchers and practitioners at larger institutions. One of the key traits practitioners want when trying to train better models for their use...