Text data can be viewed as a sequence of characters, words, sentences, or paragraphs. Recurrent neural networks (RNNs) have proven to be a highly useful architecture for modelling such sequences. For the purpose of applying neural network models to Natural Language Processing (NLP) tasks, the text is viewed as a sequence of words. This approach has proven highly successful for NLP tasks such as:
- Question answering
- Conversational agents or chatbots
- Document classification
- Sentiment analysis
- Image caption or description generation
- Named entity recognition
- Speech recognition and tagging
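
To make the idea of treating text as a sequence of words concrete, here is a minimal sketch of a toy sentiment classifier in Keras: each text is converted into a sequence of word indices, an Embedding layer maps those indices to vectors, and an LSTM reads the sequence. The sentences, labels, and hyperparameters below are invented purely for illustration and are not taken from this chapter's examples.

```python
# Minimal sketch: text as a sequence of word indices fed to an RNN.
# The sentences and labels are invented toy data for illustration only.
import numpy as np
import tensorflow as tf
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

texts = [
    "the movie was wonderful and moving",
    "a dull and boring film",
    "absolutely loved the performances",
    "the plot was a complete mess",
]
labels = np.array([1, 0, 1, 0])  # 1 = positive, 0 = negative

# Convert each text into a fixed-length sequence of word indices
tokenizer = Tokenizer(num_words=1000, oov_token="<unk>")
tokenizer.fit_on_texts(texts)
x = pad_sequences(tokenizer.texts_to_sequences(texts), maxlen=8)

# Embedding + LSTM + sigmoid output for binary sentiment classification
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=1000, output_dim=16),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, labels, epochs=10, verbose=0)

# Predict the sentiment of an unseen sentence
sample = pad_sequences(tokenizer.texts_to_sequences(["what a wonderful film"]), maxlen=8)
print(model.predict(sample))
```

The same sequence-of-words pipeline (tokenize, index, pad, embed, run through a recurrent layer) underlies most of the tasks listed above; only the output layer and the training data change.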
Applying TensorFlow deep learning techniques to NLP is a vast area that is difficult to cover in a single chapter. Hence, we have attempted to equip you with the most prevalent and important examples in this area using TensorFlow and Keras. Do not forget to explore and experiment...