Understanding general hyperparameter search-based NAS
In ML, parameters typically refer to the weights and biases that a model learns during training, while hyperparameters are values set before training begins that influence how the model learns. Examples of hyperparameters include the learning rate and the batch size. General hyperparameter search optimization algorithms are a family of NAS methods that automatically search for the best hyperparameters with which to construct a given NN architecture.
Let's go through a few possible hyperparameters. In a multi-layer perceptron (MLP), the hyperparameters could be the number of layers, which controls the depth of the MLP; the width of each layer; and the type of activation applied after each intermediate layer. In a CNN, the hyperparameters could be the filter size of each convolutional layer, the stride of each layer, and the type of activation applied after each convolutional layer.
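To make this concrete, here is a minimal sketch of random search over an MLP's hyperparameter space, one of the simplest general hyperparameter search methods. The search space values, the sample_config helper, and the evaluate stub are illustrative assumptions, not taken from the original text; a real run would train each candidate MLP and score it on validation data.

```python
import random

# Assumed search space for an MLP: depth, width, and intermediate activation.
SEARCH_SPACE = {
    "num_layers": [2, 3, 4, 5],             # depth of the MLP
    "layer_width": [64, 128, 256, 512],     # width of each layer
    "activation": ["relu", "tanh", "gelu"], # intermediate activation
}

def sample_config(space):
    """Draw one hyperparameter configuration uniformly at random."""
    return {name: random.choice(choices) for name, choices in space.items()}

def evaluate(config):
    """Placeholder: in practice, build and train an MLP from config and
    return its validation accuracy. A random score is used here so the
    sketch runs on its own."""
    return random.random()

# Random search: sample configurations under a fixed trial budget and
# keep the best-scoring one.
best_config, best_score = None, float("-inf")
for _ in range(20):
    config = sample_config(SEARCH_SPACE)
    score = evaluate(config)
    if score > best_score:
        best_config, best_score = config, score

print("Best config:", best_config, "score:", round(best_score, 3))
```

The same pattern extends to a CNN by swapping in a search space over filter sizes, strides, and activations; only the contents of SEARCH_SPACE and the model-building code inside evaluate need to change.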
For NN architectures, the...