Chapter 11: Attention Visualization and Experiment Tracking
In this chapter, we will cover two distinct technical topics, attention visualization and experiment tracking, and we will practice them with sophisticated tools such as exBERT and BertViz. These tools provide important functionality for interpretability and explainability. First, we will discuss how to visualize the inner workings of attention using these tools. Interpreting the learned representations and understanding the information encoded by the self-attention heads of a Transformer is important, and we will see that certain heads correspond to particular aspects of syntax or semantics. Second, we will learn how to track experiments by logging and monitoring them with TensorBoard and Weights & Biases (W&B). These tools let us efficiently host and track experimental results, such as loss and other metrics, which helps us optimize model training. You will learn how to use exBERT and BertViz to see the...
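As a preview of the kind of code the chapter works with, the following is a minimal sketch of inspecting BERT's attention heads with BertViz's head_view function. The checkpoint name, the example sentence pair, and the assumption that the cell runs inside a Jupyter notebook are illustrative choices, not taken from the chapter itself:

```python
# A minimal sketch, assuming bertviz and transformers are installed
# and that this cell runs in a Jupyter notebook (BertViz renders HTML/JS).
from transformers import BertTokenizer, BertModel
from bertviz import head_view

model_name = "bert-base-uncased"  # illustrative checkpoint
tokenizer = BertTokenizer.from_pretrained(model_name)
model = BertModel.from_pretrained(model_name, output_attentions=True)

# Encode a sentence pair so cross-sentence attention can also be inspected.
sentence_a = "The cat sat on the mat."
sentence_b = "The cat lay on the rug."
inputs = tokenizer.encode_plus(sentence_a, sentence_b, return_tensors="pt")

# With output_attentions=True, the model returns one attention tensor per layer.
outputs = model(**inputs)
attention = outputs.attentions  # tuple of (batch, heads, seq_len, seq_len) tensors
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])

# Render an interactive view of every layer's and head's attention weights.
head_view(attention, tokens)
```

The resulting widget draws a line between each pair of tokens, with the line's intensity reflecting the attention weight of the selected layer and head, which is how we can spot heads that track particular syntactic or semantic relations.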
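On the experiment-tracking side, here is a similarly hedged sketch of logging a training metric to both TensorBoard and W&B. The log directory, project name, run name, and the stand-in loss values are placeholders for whatever your actual training loop produces:

```python
# A minimal sketch, assuming the tensorboard and wandb packages are installed
# and that you have authenticated with `wandb login`. All names are placeholders.
from torch.utils.tensorboard import SummaryWriter
import wandb

writer = SummaryWriter(log_dir="runs/demo-experiment")   # local TensorBoard event files
wandb.init(project="demo-experiments", name="demo-run")  # hosted W&B run

for step in range(100):
    loss = 1.0 / (step + 1)  # stand-in for a real training loss
    writer.add_scalar("train/loss", loss, global_step=step)  # log to TensorBoard
    wandb.log({"train/loss": loss}, step=step)                # log to W&B

writer.close()
wandb.finish()
```

The curves can then be inspected locally with `tensorboard --logdir runs` or remotely in the W&B web dashboard, which is the workflow the rest of the chapter builds on.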