Advanced Natural Language Processing with TensorFlow 2

Build effective real-world NLP applications using NER, RNNs, seq2seq models, Transformers, and more

Product type: Paperback
Published: February 2021
Publisher: Packt
ISBN-13: 9781800200937
Length: 380 pages
Edition: 1st Edition
Authors (2): Tony Mullen, Ashish Bansal
Table of Contents (13)

Preface
1. Essentials of NLP
2. Understanding Sentiment in Natural Language with BiLSTMs
3. Named Entity Recognition (NER) with BiLSTMs, CRFs, and Viterbi Decoding
4. Transfer Learning with BERT
5. Generating Text with RNNs and GPT-2
6. Text Summarization with Seq2seq Attention and Transformer Networks
7. Multi-Modal Networks and Image Captioning with ResNets and Transformer Networks
8. Weakly Supervised Learning for Classification with Snorkel
9. Building Conversational AI Applications with Deep Learning
10. Installation and Setup Instructions for Code
11. Other Books You May Enjoy
12. Index

Summary

In this chapter, we worked through the basics of NLP, including collecting and labeling training data, tokenization, stop word removal, case normalization, POS tagging, stemming, and lemmatization. We also covered some of the quirks of these steps in languages such as Japanese and Russian. Using a variety of features derived from these approaches, we trained a model to classify spam messages containing a combination of English and Bahasa Indonesia words, reaching 94% accuracy.
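To make these preprocessing steps concrete, here is a minimal sketch using NLTK; the chapter's own pipeline may use different libraries, and the sample sentence is a hypothetical example.

```python
import nltk
from nltk.corpus import stopwords
from nltk.stem import PorterStemmer, WordNetLemmatizer

# One-time downloads of the required NLTK resources
for resource in ("punkt", "stopwords", "averaged_perceptron_tagger", "wordnet"):
    nltk.download(resource)

text = "The cats were chasing mice across the gardens"

tokens = nltk.word_tokenize(text)              # tokenization
tokens = [t.lower() for t in tokens]           # case normalization
stop = set(stopwords.words("english"))
tokens = [t for t in tokens if t not in stop]  # stop word removal

print(nltk.pos_tag(tokens))                    # POS tagging

stemmer = PorterStemmer()
print([stemmer.stem(t) for t in tokens])       # stemming

lemmatizer = WordNetLemmatizer()
print([lemmatizer.lemmatize(t) for t in tokens])  # lemmatization
```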

However, the major challenge in using the content of the messages was finding a way to represent words as vectors so that computations could be performed on them. We started with a simple count-based vectorization scheme and then graduated to a more sophisticated TF-IDF approach, both of which produce sparse vectors. The TF-IDF features yielded a model with over 98% accuracy on the spam detection task.
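For illustration, both sparse-vectorization schemes can be sketched with scikit-learn's CountVectorizer and TfidfVectorizer; the toy messages below are hypothetical stand-ins for the SMS data.

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

messages = [
    "win a free prize now",        # spam-like
    "are we still meeting today",  # ham-like
]

# Count-based vectorization: each column holds a raw term count
count_vec = CountVectorizer()
X_counts = count_vec.fit_transform(messages)

# TF-IDF: term counts reweighted by inverse document frequency
tfidf_vec = TfidfVectorizer()
X_tfidf = tfidf_vec.fit_transform(messages)

print(X_counts.shape, X_tfidf.shape)  # both are sparse matrices
```

Either matrix can then be fed directly to a classifier.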

Finally, we saw Word2Vec, a method for generating dense word embeddings. Though a few years old, it is still very relevant in many production applications. Once the word embeddings are generated, they can be cached for inference, which lets an ML model that uses them run with relatively low latency.
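As a rough sketch of training and caching such embeddings, the following uses gensim's Word2Vec implementation (gensim 4.x API); the toy corpus and file name are illustrative assumptions.

```python
from gensim.models import Word2Vec, KeyedVectors

# Hypothetical tokenized corpus; in practice this would be the real messages
sentences = [
    ["free", "prize", "claim", "now"],
    ["meeting", "moved", "to", "tomorrow"],
]

# Train dense 100-dimensional skip-gram embeddings (sg=1)
model = Word2Vec(sentences, vector_size=100, window=5, min_count=1, sg=1)

# Cache the learned vectors so inference-time lookups are cheap
model.wv.save("word2vec.kv")

wv = KeyedVectors.load("word2vec.kv")
print(wv["free"].shape)  # (100,)
```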

We used a very basic deep learning model for solving the SMS spam classification task. Just as Convolutional Neural Networks (CNNs) are the predominant architecture in computer vision, Recurrent Neural Networks (RNNs), especially those based on Long Short-Term Memory (LSTM) cells and Bi-directional LSTMs (BiLSTMs), are the most common choice for building NLP models. In the next chapter, we cover the structure of LSTMs and build a sentiment analysis model using BiLSTMs. These models will be used extensively in creative ways to solve different NLP problems in later chapters.
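For reference, a basic classifier of this kind can be written in a few lines of TensorFlow 2; the feature dimension and layer sizes below are illustrative assumptions rather than the book's exact configuration.

```python
import tensorflow as tf

num_features = 1000  # e.g. the size of a TF-IDF vocabulary (assumed)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(num_features,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # spam vs. ham
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(X_train, y_train, epochs=5) on the vectorized messages
```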
