Hands-On Natural Language Processing with Python

You're reading from Hands-On Natural Language Processing with Python: A practical guide to applying deep learning architectures to your NLP applications

Product type: Paperback
Published: Jul 2018
Publisher: Packt
ISBN-13: 9781789139495
Length: 312 pages
Edition: 1st Edition
Authors (5): Rajalingappaa Shanmugamani, Chaitanya Joshi, Auguste Byiringiro, Rajesh Arumugam, Karthik Muthuswamy
Table of Contents (15 chapters)

Preface
1. Getting Started (free chapter)
2. Text Classification and POS Tagging Using NLTK
3. Deep Learning and TensorFlow
4. Semantic Embedding Using Shallow Models
5. Text Classification Using LSTM
6. Searching and DeDuplicating Using CNNs
7. Named Entity Recognition Using Character LSTM
8. Text Generation and Summarization Using GRUs
9. Question-Answering and Chatbots Using Memory Networks
10. Machine Translation Using the Attention-Based Model
11. Speech Recognition Using DeepSpeech
12. Text-to-Speech Using Tacotron
13. Deploying Trained Models
14. Other Books You May Enjoy

Semantic Embedding Using Shallow Models

In this chapter, we will discuss the motivation for understanding semantic relationships between words and approaches for identifying such relationships. In the process, we will obtain vector representations for words, which will let us build vector representations at the document level.

We will cover the following topics in this chapter:

  • Word embeddings, which represent words as vectors trained by a simple shallow neural network
  • Continuous Bag of Words (CBOW) embeddings, which predict a target word from its surrounding context words, using a similar neural network
  • Sentence embeddings, obtained by averaging Word2vec word vectors (sketched in code after this list)
  • Document embeddings, obtained by averaging word vectors across the entire document
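
To make the averaging idea concrete, here is a minimal sketch, assuming the gensim library and a toy corpus of our own invention; it is not the book's code. It trains CBOW word vectors and averages them into a sentence embedding. The sentence_embedding helper and all parameter values are illustrative choices:

# Minimal sketch (assumed setup, not the book's code): train CBOW word
# vectors with gensim, then average them into a sentence embedding.
import numpy as np
from gensim.models import Word2Vec

# Toy corpus; each sentence is a list of tokens.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "lay", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "pets"],
]

# sg=0 selects the CBOW architecture: the model learns to predict a
# target word from the words in its surrounding window.
model = Word2Vec(sentences=corpus, vector_size=50, window=2, min_count=1, sg=0)

def sentence_embedding(tokens, model):
    # Average the vectors of in-vocabulary tokens; zero vector if none.
    vectors = [model.wv[t] for t in tokens if t in model.wv]
    if not vectors:
        return np.zeros(model.vector_size)
    return np.mean(vectors, axis=0)

print(sentence_embedding(["the", "cat", "sat"], model).shape)  # (50,)

The same averaging extends to a document embedding by pooling over all of the document's tokens, as the chapter discusses.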