What is NAS and how is it different from HPT?
Artificial Neural Networks (ANNs) are widely used today for solving complex ML problems. Most of the time, these network architectures are hand-designed by ML experts, and hand-designed architectures are not always optimal. Neural Architecture Search (NAS) is a technique that automates the process of designing neural network architectures, often producing networks that outperform hand-designed ones.
Although HPT and NAS are both model optimization techniques, they differ in what they optimize. HPT assumes a fixed architecture and searches for the hyperparameters that yield the best model, tuning settings such as the learning rate, optimizer, batch size, and activation function. NAS, on the other hand, optimizes architecture-specific parameters, effectively automating the design of the network itself. NAS optimizes parameters such as the number of layers, number of units, types...
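The distinction above can be made concrete by comparing the two kinds of search spaces. The sketch below is a minimal illustration, not a real tuner: the space names, value choices, and the `sample` helper are all assumptions made for the example. HPT draws training settings for a fixed architecture, while NAS draws the parameters that define the architecture itself.

```python
import random

# Hypothetical search spaces for illustration only; the specific
# names and candidate values are assumptions, not a real library API.

# HPT: architecture is fixed; search over training hyperparameters.
hpt_space = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "batch_size": [32, 64, 128],
    "activation": ["relu", "tanh"],
}

# NAS: search over parameters that define the architecture itself.
nas_space = {
    "num_layers": [2, 3, 4],
    "units_per_layer": [64, 128, 256],
    "layer_type": ["dense", "conv"],
}

def sample(space, rng=random):
    """Draw one random candidate configuration from a search space."""
    return {name: rng.choice(choices) for name, choices in space.items()}

hpt_candidate = sample(hpt_space)
nas_candidate = sample(nas_space)
print("HPT candidate:", hpt_candidate)
print("NAS candidate:", nas_candidate)
```

In a full search loop, each sampled candidate would be trained and evaluated, and the search strategy (random, Bayesian, evolutionary, or reinforcement-learning based) would decide which candidate to try next.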