Appendix III — Generic Text Completion with GPT-2
This appendix provides a detailed explanation of the Generic text completion with GPT-2 section in Chapter 7, The Rise of Suprahuman Transformers with GPT-3 Engines. This section describes how to implement a GPT-2 transformer model for generic text completion.
You can read about the usage of this notebook directly in Chapter 7, or build and run the program in this appendix to gain a deeper understanding of how a GPT model works.
We will clone the OpenAI_GPT_2 repository, download the 345M-parameter GPT-2 transformer model, and interact with it. We will enter context sentences and analyze the text generated by the transformer. The goal is to see how it creates new content.
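Before working through the notebook, it may help to see the mechanism the appendix sets out to demonstrate: GPT-2 completes text autoregressively, scoring every vocabulary token given the context so far, sampling one token (with top-k filtering), appending it, and repeating. The sketch below illustrates that loop with a tiny hypothetical vocabulary and a random stand-in for the transformer forward pass; `toy_logits`, the six-word `vocab`, and all parameter values are illustrative assumptions, not part of the GPT-2 code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy vocabulary; GPT-2 uses ~50k byte-pair tokens.
vocab = ["the", "cat", "sat", "on", "mat", "."]

def toy_logits(context):
    # Stand-in for the transformer forward pass: in the real model,
    # these scores depend on the context; here they are random.
    return rng.normal(size=len(vocab))

def generate(context, steps=5, top_k=3):
    out = list(context)
    for _ in range(steps):
        logits = toy_logits(out)
        # Top-k sampling: keep only the k highest-scoring tokens,
        # renormalize, and sample the next token from them.
        top = np.argsort(logits)[-top_k:]
        probs = np.exp(logits[top] - logits[top].max())
        probs /= probs.sum()
        out.append(vocab[rng.choice(top, p=probs)])
    return out

print(generate(["the", "cat"]))
```

Each pass through the loop extends the sequence by one token, which is why the context sentences you enter in the notebook steer everything the model generates afterward.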
This section is divided into nine steps. Open OpenAI_GPT_2.ipynb in Google Colaboratory. The notebook is in the AppendixIII directory of the GitHub repository of this book. You will notice that the notebook is also divided into the same nine steps and cells...