Method 2: SRL first
The transformer could not determine who was driving to Las Vegas and thought it was Nat King Cole instead of Jo and Maria.
What went wrong? Can we see what the transformer is thinking and obtain an explanation? To find out, let's go back to semantic role labeling. If necessary, take a few minutes to review Chapter 9, Semantic Role Labeling with BERT-Based Transformers.
Let's run the same sequence on AllenNLP, https://demo.allennlp.org/reading-comprehension, in the Semantic Role Labeling section to obtain a visual representation of the verb "drove" in our sequence:
Figure 10.2: EER Semantic Role Labeling (SRL)
We can see the problem. The argument of the verb "driving" is "they." No relationship is established between "they" and "Jo and Maria," although it seems that the inference could have been made.
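To see concretely what the SRL output fails to capture, here is a minimal sketch that parses AllenNLP-style BIO tags for the frame of "driving." The sentence and its tags are illustrative assumptions, not the exact demo output; the point is that the ARG0 span contains only the pronoun "they," with no link back to "Jo" and "Maria."

```python
# Hypothetical sentence and BIO tags in AllenNLP's SRL output format.
# These are illustrative assumptions, not the actual demo output.
words = ["Jo", "and", "Maria", "listened", "to", "Nat", "King", "Cole",
         "while", "they", "were", "driving", "to", "Las", "Vegas", "."]
tags = ["O", "O", "O", "O", "O", "O", "O", "O",
        "O", "B-ARG0", "O", "B-V", "B-ARG4", "I-ARG4", "I-ARG4", "O"]

def spans(words, tags):
    """Group BIO tags into labeled argument spans."""
    grouped = {}
    for word, tag in zip(words, tags):
        if tag == "O":
            continue
        label = tag[2:]
        if tag.startswith("B-"):          # B- opens a new span for this label
            grouped.setdefault(label, []).append([word])
        else:                             # I- continues the last span
            grouped[label][-1].append(word)
    return {label: [" ".join(s) for s in group]
            for label, group in grouped.items()}

args = spans(words, tags)
print(args["ARG0"])  # the driver according to SRL: ['they']
# Nothing in the frame ties "they" back to "Jo" and "Maria":
print(any(name in args["ARG0"] for name in ("Jo", "Maria")))  # False
```

The frame for "driving" stops at the pronoun: resolving "they" to "Jo and Maria" is a coreference inference that the SRL output alone does not provide.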
Transformer models keep evolving. The output might vary; however, the...