Summary
We began this chapter by understanding TensorFlow and how it uses computational graphs. We learned that computation in TensorFlow is represented as a computational graph, which consists of several nodes and edges, where nodes are mathematical operations, such as addition and multiplication, and edges are tensors.
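As a quick reminder of how this looks in code, here is a minimal sketch (assuming the TensorFlow 1.x-style graph API used in this chapter; the node names are purely illustrative) that builds a small graph of addition and multiplication nodes and executes it in a session:

```python
import tensorflow as tf

# Build a small computational graph: nodes are operations, edges carry tensors
a = tf.constant(2, name='a')
b = tf.constant(3, name='b')
c = tf.add(a, b, name='add')        # addition node
d = tf.multiply(c, a, name='mul')   # multiplication node

# Execute the graph inside a session
with tf.Session() as sess:
    print(sess.run(d))  # prints 10
```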
Next, we learned that variables are containers used to store values and that they serve as input to several other operations in the computational graph. We also learned that placeholders are like variables, except that we only define their type and dimension without assigning values; the values for placeholders are fed at runtime. Moving forward, we learned about TensorBoard, TensorFlow's visualization tool, which can be used to visualize a computational graph. We also explored eager execution, which is more Pythonic and allows rapid prototyping.
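The short sketch below recaps these ideas (again assuming the TensorFlow 1.x-style API; the variable, the placeholder, and the ./logs directory are just illustrative choices). It defines a variable and a placeholder, feeds the placeholder's value at runtime, and writes the graph so TensorBoard can visualize it:

```python
import tensorflow as tf

# A variable stores a value that other operations can use
w = tf.Variable(3.0, name='weight')

# A placeholder only declares type and shape; its value is fed at runtime
x = tf.placeholder(tf.float32, shape=None, name='x')
y = tf.multiply(w, x, name='output')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Write the graph so TensorBoard can visualize it
    writer = tf.summary.FileWriter('./logs', sess.graph)
    print(sess.run(y, feed_dict={x: 2.0}))  # prints 6.0
    writer.close()
```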
We understood that, unlike graph mode, where we need to construct a graph every time to perform any kind of operation, eager execution follows an imperative style: operations are evaluated immediately and return concrete values, without the need to build a graph and run it in a session.
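A minimal sketch of this behavior (assuming TensorFlow 1.x, where eager execution must be enabled at program startup) shows operations being evaluated immediately, with no session required:

```python
import tensorflow as tf

tf.enable_eager_execution()  # must be called before any graph operations are defined

# Operations run immediately and return concrete values -- no session needed
a = tf.constant(2)
b = tf.constant(3)
print(tf.add(a, b))  # tf.Tensor(5, shape=(), dtype=int32)
```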