Transformers for Natural Language Processing

You're reading from Transformers for Natural Language Processing: Build innovative deep neural network architectures for NLP with Python, PyTorch, TensorFlow, BERT, RoBERTa, and more.

Product type: Paperback
Published: January 2021
Publisher: Packt
ISBN-13: 9781800565791
Length: 384 pages
Edition: 1st
Arrow right icon
Author: Denis Rothman
Table of Contents (16 chapters)

Preface
1. Getting Started with the Model Architecture of the Transformer
2. Fine-Tuning BERT Models
3. Pretraining a RoBERTa Model from Scratch
4. Downstream NLP Tasks with Transformers
5. Machine Translation with the Transformer
6. Text Generation with OpenAI GPT-2 and GPT-3 Models
7. Applying Transformers to Legal and Financial Documents for AI Text Summarization
8. Matching Tokenizers and Datasets
9. Semantic Role Labeling with BERT-Based Transformers
10. Let Your Data Do the Talking: Story, Questions, and Answers
11. Detecting Customer Emotions to Make Predictions
12. Analyzing Fake News with Transformers
Appendix: Answers to the Questions
Other Books You May Enjoy
Index

Index

A

Adversarial Generations, Situations With (SWAG) 52

AllenNLP

Amazon Web Services (AWS) 169

associative neural networks 3

B

benchmark tasks, SuperGLUE

BoolQ 114

Commitment Bank (CB) 114

defining 113

Multi-Sentence Reading Comprehension (MultiRC) 115, 116

Reading Comprehension with Commonsense Reasoning Dataset (ReCoRD) 116, 117

Recognizing Textual Entailment (RTE) 118

Winograd Schema Challenge (WSC) 118, 119

Words in Context (WiC) 118

BERT-based model

used, for performing SRL experiments 249

BERT-base multilingual model 307, 308

BERT model

fine-tuning 50, 52

pretraining 50, 52

BERT model, fine-tuning 53

attention masks, creating 59

batch size, selecting 60, 61

BERT tokenizer, activating 57

BERT tokens, creating 57

configuration, initializing 61, 62

CUDA, specifying 55

data, converting into torch tensors 60

data, processing 58

dataset, loading 55, 56, 57

...
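The fine-tuning sub-entries above outline the setup sequence the chapter walks through: load the dataset, activate the tokenizer, create tokens and attention masks, convert everything to torch tensors, pick a batch size, specify CUDA, and initialize the model configuration. As a rough illustration of how those steps fit together, here is a minimal sketch using the Hugging Face transformers library; the sample sentences, labels, checkpoint name, and hyperparameter values are assumptions for this sketch, not the book's notebook code.

import torch
from torch.utils.data import DataLoader, TensorDataset
from transformers import BertForSequenceClassification, BertTokenizer

# Specify CUDA when a GPU is available
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Load a dataset (two made-up labeled sentences stand in for the chapter's data)
sentences = ["The book explains transformers.", "Explains book the transformers."]
labels = torch.tensor([1, 0])

# Activate the BERT tokenizer; create BERT tokens and attention masks,
# returned directly as torch tensors
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
encoded = tokenizer(sentences, padding=True, truncation=True,
                    max_length=128, return_tensors="pt")

# Convert the data into a tensor dataset and select a batch size
dataset = TensorDataset(encoded["input_ids"], encoded["attention_mask"], labels)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# Initialize the model configuration with a two-label classification head
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2).to(device)

Iterating over loader then yields (input_ids, attention_mask, labels) batches ready to feed to model on device, which is where the chapter's training loop picks up.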