Downstream NLP Tasks with Transformers
Transformers reveal their full potential when we unleash pretrained models and watch them perform downstream Natural Language Understanding (NLU) tasks. It takes a great deal of time and effort to pretrain and fine-tune a transformer model, but the effort is worthwhile when we see a 355-million-parameter transformer model in action on a range of NLU tasks.
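To make this concrete, here is a minimal sketch, assuming the Hugging Face transformers library is installed; roberta-large is one such 355-million-parameter pretrained model, shown here performing a masked-word prediction task:

from transformers import pipeline

# A minimal sketch, assuming the Hugging Face transformers library.
# roberta-large is a 355-million-parameter pretrained model.
fill_mask = pipeline("fill-mask", model="roberta-large")

# The pipeline predicts the masked token from pretrained knowledge alone.
for prediction in fill_mask("The goal is to outperform the human <mask>."):
    print(prediction["token_str"], round(prediction["score"], 4))

The point of the sketch is that no task-specific training is required to run the pipeline; the pretrained weights alone drive the predictions.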
We will begin this chapter with the quest to outperform the human baseline. The human baseline represents the performance of humans on an NLU task. Humans learn transduction at an early age and quickly develop inductive thinking. We humans perceive the world directly with our senses; machine intelligence, by contrast, relies entirely on our perceptions transcribed into words to make sense of our language.
We will then see how to measure the performance of transformers. Measuring NLP tasks remains straightforward: it relies on accuracy scores, in various forms, based on true and false results. These...
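As a minimal sketch of such accuracy scores, assuming scikit-learn is available (the labels below are hypothetical):

from sklearn.metrics import accuracy_score, f1_score

# Hypothetical reference labels and model predictions.
y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 0, 1, 0, 1]

# Accuracy: the fraction of predictions that match the references.
print("Accuracy:", accuracy_score(y_true, y_pred))

# F1: the harmonic mean of precision and recall, computed from
# true/false positives and negatives.
print("F1:", f1_score(y_true, y_pred))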