Summary
In this chapter, we learned about computational graphs, which represent calculation processes visually. We used a computational graph to describe backpropagation in a neural network, and we implemented the components of a neural network as layers, including the ReLU layer, Affine layer, Softmax layer, and Softmax-with-Loss layer. These layers have forward and backward methods, and by propagating data forward and backward through them, they can compute the gradients of the weight parameters efficiently. Because layers act as modules, you can combine them freely to build the desired network easily (a minimal sketch of this interface follows the list below). The following points were covered in this chapter:
- We can use computational graphs to show calculation processes visually.
- A node in a computational graph performs a local calculation; these local calculations combine to make up the whole calculation.
- Performing forward propagation in a computational graph yields the result of an ordinary calculation, while performing backward propagation yields the derivative at each node.
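
To make these points concrete, here is a minimal sketch of a single computational-graph node. The class name `MulLayer` and the apples-and-tax numbers are illustrative assumptions, not quoted from the chapter's code; the sketch only demonstrates the forward/backward pattern summarized above:

```python
class MulLayer:
    """Multiplication node in a computational graph (illustrative sketch)."""
    def __init__(self):
        self.x = None
        self.y = None

    def forward(self, x, y):
        # Local calculation: the node only needs its own inputs.
        self.x = x
        self.y = y
        return x * y

    def backward(self, dout):
        # Local derivatives: d(x*y)/dx = y and d(x*y)/dy = x,
        # each scaled by the upstream gradient dout.
        return dout * self.y, dout * self.x

# Forward propagation gives the ordinary result:
# e.g., 2 apples at 100 each, with 10% tax (hypothetical values).
apple_mul = MulLayer()
tax_mul = MulLayer()
price = apple_mul.forward(100, 2)    # 200
total = tax_mul.forward(price, 1.1)  # 220.0

# Backward propagation gives the derivative at each node.
dprice, dtax = tax_mul.backward(1.0)       # 1.1, 200
dapple, dnum = apple_mul.backward(dprice)  # 2.2, 110.0
```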
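
The same forward/backward interface is what makes layers composable modules. The sketch below, in which the `Relu` class and the two-layer stack are illustrative assumptions rather than the chapter's exact code, chains layers by calling `forward` in order and `backward` in reverse:

```python
import numpy as np

class Relu:
    """ReLU as a layer: forward stores a mask, backward reuses it."""
    def __init__(self):
        self.mask = None

    def forward(self, x):
        self.mask = (x <= 0)  # remember which inputs were clipped
        out = x.copy()
        out[self.mask] = 0
        return out

    def backward(self, dout):
        # Gradient passes through only where the input was positive.
        dout = dout.copy()
        dout[self.mask] = 0
        return dout

# A stand-in stack of layers; a real network would mix Affine, ReLU, etc.
layers = [Relu(), Relu()]
x = np.array([[1.0, -2.0], [3.0, -0.5]])

# Forward pass: call each layer's forward in order.
out = x
for layer in layers:
    out = layer.forward(out)

# Backward pass: call each layer's backward in reverse order.
dout = np.ones_like(out)
for layer in reversed(layers):
    dout = layer.backward(dout)
```

Because every layer exposes the same pair of methods, swapping or reordering layers requires no change to the propagation loops, which is what lets you build the desired network by freely combining modules.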