In this chapter, we learned about Caffe2 operators and how they differ from the layers used in older deep learning frameworks. We built a simple computation graph by composing several operators. We then tackled the MNIST machine learning problem, building an MLP network using Brew helper functions, loading pretrained weights into it, and using it for inference on a batch of input images. Along the way, we introduced several common operators, such as matrix multiplication, fully connected, Sigmoid, Softmax, and ReLU.
This chapter focused on performing inference with our networks. In the next chapter, we will learn about training and how to train a network to solve the MNIST problem.