References
- OpenAI and GPT-3 engines: https://beta.openai.com/docs/engines/engines
- BertViz GitHub Repository by Jesse Vig: https://github.com/jessevig/bertviz
- OpenAI's supercomputer: https://blogs.microsoft.com/ai/openai-azure-supercomputer/
- Alec Radford, Karthik Narasimhan, Tim Salimans, Ilya Sutskever, 2018, Improving Language Understanding by Generative Pre-Training: https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf
- Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, Ilya Sutskever, 2019, Language Models are Unsupervised Multitask Learners: https://cdn.openai.com/better-language-models/language_models_are_unsupervised_multitask_learners.pdf
- Common Crawl data: https://commoncrawl.org/big-picture/
- OpenAI, 2023, GPT-4 Technical Report: https://arxiv.org/pdf/2303.08774.pdf