Hands-On Natural Language Processing with PyTorch 1.x
Build smart, AI-driven linguistic applications using deep learning and NLP techniques

Product type: Paperback
Published: July 2020
Publisher: Packt
ISBN-13: 9781789802740
Length: 276 pages
Edition: 1st
Author: Thomas Dop

Table of Contents (14)

Preface
1. Section 1: Essentials of PyTorch 1.x for NLP
2. Chapter 1: Fundamentals of Machine Learning and Deep Learning
3. Chapter 2: Getting Started with PyTorch 1.x for NLP
4. Section 2: Fundamentals of Natural Language Processing
5. Chapter 3: NLP and Text Embeddings
6. Chapter 4: Text Preprocessing, Stemming, and Lemmatization
7. Section 3: Real-World NLP Applications Using PyTorch 1.x
8. Chapter 5: Recurrent Neural Networks and Sentiment Analysis
9. Chapter 6: Convolutional Neural Networks for Text Classification
10. Chapter 7: Text Translation Using Sequence-to-Sequence Neural Networks
11. Chapter 8: Building a Chatbot Using Attention-Based Neural Networks
12. Chapter 9: The Road Ahead
13. Other Books You May Enjoy

Next steps

While we have shown that our sequence-to-sequence model is effective at performing language translation, the model we trained from scratch is by no means a perfect translator. This is due, in part, to the relatively small size of our training data: we trained our model on a set of 30,000 English/German sentence pairs. While this may sound like a lot, training a near-perfect model would require a training set several orders of magnitude larger.

In theory, our model would need to see several examples of every word in the English and German languages in order to truly understand each word's context and meaning. For context, the 30,000 English sentences in our training set contained just 6,000 unique words. The average vocabulary of an English speaker is said to be between 20,000 and 30,000 words, which gives us an idea of just how many example sentences we would need in order to train a model that performs perfectly. This is probably why the most accurate translation...
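To make the scale argument concrete, it is easy to measure these corpus statistics yourself. The following is a minimal sketch (not taken from the book's code) that counts sentence pairs and the number of unique tokens on each side of a parallel corpus; the sample pairs and the whitespace tokenization are illustrative assumptions:

```python
from collections import Counter

def corpus_stats(pairs):
    """Return (num_pairs, unique_en_tokens, unique_de_tokens)
    for a list of (English, German) sentence pairs."""
    en_vocab, de_vocab = Counter(), Counter()
    for en, de in pairs:
        # Naive whitespace tokenization after lowercasing;
        # a real pipeline would use a proper tokenizer.
        en_vocab.update(en.lower().split())
        de_vocab.update(de.lower().split())
    return len(pairs), len(en_vocab), len(de_vocab)

# Illustrative toy corpus of two sentence pairs.
pairs = [
    ("two dogs run on the beach", "zwei hunde laufen am strand"),
    ("a man rides a bike", "ein mann faehrt fahrrad"),
]

n_pairs, en_size, de_size = corpus_stats(pairs)
```

Running the same count over the full 30,000-pair training set is what yields the roughly 6,000 unique English words quoted above, well short of a typical speaker's 20,000–30,000-word vocabulary.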
