Entering the syntax-free world of AI
SRL is as difficult for humans as it is for machines. However, transformers have brought machine performance to the boundary of our human baselines.
A syntax-free approach to SRL is quite innovative. Classical NLP methods rely on Part-of-Speech (POS) tagging, dependency parsing, and phrase-structure analysis. The classical approach trains the model to represent a sentence's grammatical structure and syntax explicitly.
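To make the contrast concrete, here is a minimal sketch of such an explicit pipeline using spaCy (my choice for illustration; the source does not name a specific library), printing the POS tag and dependency arc that a classical system would feed into an SRL model:

```python
# A classical, syntax-explicit pipeline: POS tagging and dependency parsing.
# Assumes spaCy and its small English model are installed:
#   pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Marvin went to the store to buy some groceries.")

# Each token carries an explicit POS tag and a dependency arc to its head.
for token in doc:
    print(f"{token.text:<10} POS={token.pos_:<6} dep={token.dep_:<10} head={token.head.text}")
```

Every token is assigned a grammatical role before any semantic labeling takes place; this is precisely the machinery that a syntax-free model dispenses with.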
The model designed by Shi and Lin (2019) does not apply an explicit syntax analysis step. The BERT model relies on its unique ability to learn the structure of a sentence: it implicitly captures syntactic features from the vast amount of data it learns to represent. OpenAI took this approach further, letting its GPT models learn syntax without going through the tedious process of training a model for specific tasks such as SRL. As such, there are no syntax trees in a GPT model; it is syntax-free.
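For contrast, a BERT-based SRL model in the spirit of Shi and Lin (2019) can be queried with raw text and no syntactic preprocessing at all. The sketch below assumes the AllenNLP library and its publicly released BERT SRL model archive (the URL is AllenNLP's published one and may change over time):

```python
# Syntax-free SRL: a BERT-based predictor labels arguments directly,
# with no POS tags or parse trees supplied as input.
# Assumes: pip install allennlp allennlp-models
from allennlp.predictors.predictor import Predictor

predictor = Predictor.from_path(
    "https://storage.googleapis.com/allennlp-public-models/"
    "structured-prediction-srl-bert.2020.12.15.tar.gz"
)

result = predictor.predict(
    sentence="Marvin went to the store to buy some groceries."
)

# Each detected verb gets its own frame with BIO-tagged arguments,
# e.g. [ARG0: Marvin] [V: went] ...
for frame in result["verbs"]:
    print(frame["description"])
```

The model receives nothing but the sentence; whatever syntactic knowledge it uses is encoded implicitly in its learned representations.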
In this...