Getting started with SRL
SRL is as difficult for humans as it is for machines. However, once again, transformers have taken a step closer to human baselines.
In this section, we will first define SRL and visualize an example. We will then run a pretrained BERT-based model.
Let’s begin by defining the challenging task of SRL.
Defining semantic role labeling
Shi and Lin (2019) advanced and demonstrated the idea that we can find who did what, and where, without depending on lexical or syntactic features. This chapter is based on Peng Shi and Jimmy Lin’s research at the University of Waterloo. They showed how transformers learn language structures better with attention layers.
SRL labels the semantic role of a word or group of words, that is, the role it plays in a sentence and the relationship it establishes with the predicate.
A semantic role is the role a noun or noun phrase plays in relation to the main verb in a sentence. For example, in the sentence Marvin...
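Semantic roles are often written as PropBank-style labels attached to spans of a sentence: ARG0 for the agent, V for the predicate, and ARGM-LOC for a location modifier, encoded with BIO tags. The following is a minimal sketch of this labeling scheme; the sentence and its tags are illustrative assumptions, not output from the chapter's model:

```python
# A minimal, self-contained sketch of PropBank-style semantic role labels.
# The example sentence and its tags are assumptions for illustration.
sentence = ["Marvin", "walked", "in", "the", "park"]

# BIO-encoded roles relative to the predicate "walked":
# ARG0 = the agent (who), V = the predicate, ARGM-LOC = the location (where).
tags = ["B-ARG0", "B-V", "B-ARGM-LOC", "I-ARGM-LOC", "I-ARGM-LOC"]

def group_roles(words, bio_tags):
    """Group BIO tags into (role, phrase) pairs."""
    roles = []
    for word, tag in zip(words, bio_tags):
        prefix, role = tag.split("-", 1)
        if prefix == "B":          # "B" begins a new labeled span
            roles.append((role, [word]))
        else:                      # "I" continues the current span
            roles[-1][1].append(word)
    return [(role, " ".join(span)) for role, span in roles]

print(group_roles(sentence, tags))
# → [('ARG0', 'Marvin'), ('V', 'walked'), ('ARGM-LOC', 'in the park')]
```

A pretrained SRL model such as the BERT-based one we run later in this section produces exactly this kind of per-word tag sequence, one sequence per predicate in the sentence.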