Transformers for Natural Language Processing

Product type: Book
Published: January 2021
Publisher: Packt
ISBN-13: 9781800565791
Pages: 384
Edition: 1st
Author: Denis Rothman

Table of Contents (16 chapters)

Preface
1. Getting Started with the Model Architecture of the Transformer
2. Fine-Tuning BERT Models
3. Pretraining a RoBERTa Model from Scratch
4. Downstream NLP Tasks with Transformers
5. Machine Translation with the Transformer
6. Text Generation with OpenAI GPT-2 and GPT-3 Models
7. Applying Transformers to Legal and Financial Documents for AI Text Summarization
8. Matching Tokenizers and Datasets
9. Semantic Role Labeling with BERT-Based Transformers
10. Let Your Data Do the Talking: Story, Questions, and Answers
11. Detecting Customer Emotions to Make Predictions
12. Analyzing Fake News with Transformers
13. Other Books You May Enjoy
14. Index
Appendix: Answers to the Questions

Method 0: Trial and error

Question-answering seems very easy. Is that true? Let's find out.

Open QA.ipynb, the Google Colab notebook we will be using in this chapter. We will run the notebook cell by cell.

Run the first cell to install Hugging Face's transformers, the framework we will be using in this chapter:

!pip install -q transformers==4.0.0

We will now import Hugging Face's pipeline, which gives access to a vast range of ready-to-use transformer resources. Pipelines are high-level abstractions over the Hugging Face library, allowing us to perform a wide range of NLP tasks through a simple API.

The pipeline is imported with one line of code:

from transformers import pipeline

Once that is done, we have one-line options to instantiate transformer models and tasks:

  1. Perform an NLP task with the default model and default tokenizer:
    pipeline("<task-name>")
    
    ...
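To make the one-line option concrete, here is a minimal sketch (not taken from the book's text) that instantiates the default question-answering pipeline, the task this chapter focuses on. The question and context strings are illustrative placeholders:

from transformers import pipeline

# Instantiate the default question-answering pipeline
# (default model and default tokenizer)
nlp_qa = pipeline("question-answering")

# Illustrative placeholder question and context
result = nlp_qa(
    question="Who wrote the book?",
    context="Transformers for Natural Language Processing was written by Denis Rothman.",
)
print(result)

The pipeline returns a dictionary containing the extracted answer span along with its confidence score and its start and end character positions in the context.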