Neural architecture search
Selecting models can be challenging. In the case of regression, that is, predicting a numerical value, you have a choice of linear regression, decision trees, random forests, lasso and ridge regression, elastic net, gradient boosting methods (including XGBoost), and SVMs, among many others.
For classification, that is, separating things into classes, you have logistic regression, random forests, AdaBoost, gradient boosting, and SVM-based classifiers at your disposal.
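To make the model-selection problem concrete, the following sketch (my own illustration, not code from this section; the dataset and candidate models are assumptions) compares a few classifiers by cross-validated accuracy:

```python
# Illustrative sketch: compare several candidate classifiers with
# cross-validation. Dataset and model choices are assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=5000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "adaboost": AdaBoostClassifier(random_state=0),
    "svm": SVC(kernel="rbf"),
}

# Estimate each candidate's accuracy with 5-fold cross-validation.
for name, model in candidates.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```

Even this small comparison shows how quickly the number of candidates grows, which is the motivation for automating the search.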
Neural architecture search has the notion of a search space, which defines which architectures can be considered in principle. Then, a search strategy must be defined that outlines how to explore that space, balancing the exploration-exploitation trade-off. Finally, there has to be a performance estimation strategy, which estimates a candidate architecture's performance; this includes training and validating the architecture.
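The three components can be seen in miniature in the following sketch (my own toy example, not the section's code): a small search space of hidden-layer configurations, random sampling as the search strategy, and cross-validation as the performance estimation strategy.

```python
# Toy NAS-style loop: search space + random search strategy +
# cross-validation as the performance estimation strategy.
import random

from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)

# Search space: which architectures are allowed in principle.
search_space = {
    "hidden_layer_sizes": [(32,), (64,), (128,), (64, 32), (128, 64)],
    "activation": ["relu", "tanh"],
}

random.seed(0)
best_score, best_config = -1.0, None

# Search strategy: plain random sampling of candidate architectures.
for _ in range(5):
    config = {k: random.choice(v) for k, v in search_space.items()}

    # Performance estimation strategy: train and validate each
    # candidate with 3-fold cross-validation.
    model = MLPClassifier(max_iter=500, random_state=0, **config)
    score = cross_val_score(model, X, y, cv=3).mean()
    if score > best_score:
        best_score, best_config = score, config

print("best architecture:", best_config, "accuracy:", round(best_score, 3))
```

Real NAS systems replace the random sampling with smarter strategies and the full cross-validation with cheaper performance estimates, but the division of labor is the same.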
There are several techniques for exploring the search...