The NN models we've discussed so far were designed by their authors. But what if we could make the computer design the NN itself? Enter neural architecture search (NAS)—a technique that automates the design of NNs.
Before we continue, let's see what the network architecture consists of:
- The graph of operations, which represents the network. As we discussed in Chapter 1, The Nuts and Bolts of Neural Networks, the operations include (but are not limited to) convolutions, activation functions, fully connected layers, and normalization.
- The parameters of each operation. For example, the parameters of a convolution include its type (cross-channel, depthwise, and so on), its input dimensions, the number of input and output slices, the stride, and the padding (see the sketch after this list).
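
To make these two components concrete, here is a minimal sketch of how a candidate architecture could be encoded as a graph of operations plus per-operation parameters. The class names (`Operation`, `Architecture`) and the parameter keys are illustrative assumptions for this example only, not part of any specific NAS library:

```python
# A minimal, hypothetical encoding of an architecture as a graph of
# operations with per-operation parameters. All names here are illustrative.
from dataclasses import dataclass, field


@dataclass
class Operation:
    """One node of the graph of operations."""
    op_type: str                                  # e.g., 'conv', 'relu', 'batch_norm', 'fc'
    params: dict = field(default_factory=dict)    # operation-specific parameters
    inputs: list = field(default_factory=list)    # indices of predecessor nodes


@dataclass
class Architecture:
    """The full graph of operations that defines the network."""
    operations: list


# A toy candidate architecture: conv -> batch_norm -> relu -> fc
arch = Architecture(operations=[
    Operation('conv', {'type': 'cross-channel', 'in_slices': 3,
                       'out_slices': 16, 'kernel_size': 3,
                       'stride': 1, 'padding': 1}),
    Operation('batch_norm', {'num_features': 16}, inputs=[0]),
    Operation('relu', inputs=[1]),
    Operation('fc', {'in_features': 16 * 32 * 32, 'out_features': 10}, inputs=[2]),
])

# A NAS algorithm searches over both the graph structure (which operations
# exist and how they are connected) and the parameters of each operation.
for i, op in enumerate(arch.operations):
    print(i, op.op_type, op.params)
```

With an encoding like this, the search space is the set of all valid combinations of graph structures and operation parameters, and NAS is the process of exploring that space automatically instead of designing the network by hand.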