Natural Language Processing with TensorFlow
Teach language to machines using Python's deep learning library

Product type: Paperback
Published: May 2018
Publisher: Packt
ISBN-13: 9781788478311
Length: 472 pages
Edition: 1st
Authors (2): Thushan Ganegedara, Motaz Saad
Table of Contents (14)

Preface
1. Introduction to Natural Language Processing
2. Understanding TensorFlow
3. Word2vec – Learning Word Embeddings
4. Advanced Word2vec
5. Sentence Classification with Convolutional Neural Networks
6. Recurrent Neural Networks
7. Long Short-Term Memory Networks
8. Applications of LSTM – Generating Text
9. Applications of LSTM – Image Caption Generation
10. Sequence-to-Sequence Learning – Neural Machine Translation
11. Current Trends and the Future of Natural Language Processing
A. Mathematical Foundations and Advanced TensorFlow
Index

Improving NMTs


As you can see from the preceding results, our translation model is not behaving ideally. These results were obtained after running the optimization for more than 12 hours on a single NVIDIA 1080 Ti GPU. Note also that we did not even use the full dataset; we trained on only 250,000 sentence pairs. However, if you type something into Google Translate, which uses the Google Neural Machine Translation (GNMT) system, the translation almost always looks very realistic, with only minor mistakes. It is therefore important to know how we can improve the model so that it produces better results. In this section, we will discuss several ways of improving NMTs, such as teacher forcing, deep LSTMs, and the attention mechanism.

Teacher forcing

As we discussed in the Training the NMT section, we do the following to train the NMT:

  • First, we feed the full source sentence to the encoder to obtain the encoder's final state outputs

  • We then set the final states of the encoder to be the initial state of the decoder...
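
To make the idea concrete, here is a minimal, framework-agnostic sketch of teacher forcing (this is not the book's TensorFlow implementation; decoder_step, W_in, and W_out are toy stand-ins for a trained decoder). The key point is the last line of the loop: at each training step, the decoder's next input is the ground-truth token, not the decoder's own prediction.

import numpy as np

rng = np.random.default_rng(42)
vocab_size, state_dim = 10, 8

# Toy stand-ins for trained decoder parameters (hypothetical)
W_in = rng.normal(size=(vocab_size, state_dim)) * 0.1   # token -> state update
W_out = rng.normal(size=(state_dim, vocab_size)) * 0.1  # state -> vocabulary logits

def decoder_step(token_id, state):
    # One decoder step: fold the previous token into the state,
    # then project the new state to logits over the vocabulary
    state = np.tanh(state + W_in[token_id])
    return state @ W_out, state

target = [3, 7, 2, 9]        # ground-truth target sentence (token IDs)
state = np.zeros(state_dim)  # would be the encoder's final state
prev = 0                     # token ID of the <s> start symbol

for y in target:
    logits, state = decoder_step(prev, state)
    # The per-step loss would compare logits against y (cross-entropy).
    # Teacher forcing: feed the true token y as the next input,
    # rather than the model's own prediction np.argmax(logits).
    prev = y

Feeding the ground-truth token stabilizes training: early in training, the decoder's own predictions are mostly wrong, and conditioning on them would compound errors across the rest of the sequence.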
