In this chapter, we learned about word embeddings, a way to build denser and more informative representations of textual data. As neural networks and deep learning models ingest ever larger amounts of text, one-hot encoding and other sparse word representations become inefficient. We also learned how to visualize word embeddings using t-SNE plots, and we built a simple LSTM model to generate text in TensorFlow and Keras. The same concepts can be applied to various other tasks, such as sentiment analysis, question answering, and neural machine translation.
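As a quick recap of these ideas, the following is a minimal sketch of an embedding-based LSTM language model in tf.keras. The vocabulary size, embedding size, sequence length, and random seed sequence are hypothetical placeholders rather than the chapter's exact values, and the greedy generation loop is one simple decoding strategy among several.

```python
import numpy as np
import tensorflow as tf

# Hypothetical hyperparameters -- adjust to your corpus.
vocab_size = 5000   # number of distinct tokens
embed_dim = 100     # dense embedding width (vs. a 5000-wide one-hot vector)
seq_len = 20        # tokens of context fed to the LSTM

# Embedding + LSTM language model: the Embedding layer maps integer token
# ids to dense vectors; the LSTM reads the sequence and predicts the next token.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embed_dim),
    tf.keras.layers.LSTM(128),
    tf.keras.layers.Dense(vocab_size, activation='softmax'),
])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')
# ... train with model.fit() on (context, next-token) pairs from your corpus ...

# Greedy generation: repeatedly predict the next token id and slide the window.
seed = np.random.randint(0, vocab_size, size=(1, seq_len))  # hypothetical seed
for _ in range(10):
    next_id = int(np.argmax(model.predict(seed, verbose=0), axis=-1)[0])
    seed = np.concatenate([seed[:, 1:], [[next_id]]], axis=1)

# After training, the learned embedding matrix can be projected to 2-D with
# t-SNE for visualization, as in the chapter's plots.
from sklearn.manifold import TSNE
weights = model.layers[0].get_weights()[0]                   # (vocab_size, embed_dim)
coords = TSNE(n_components=2).fit_transform(weights[:500])   # first 500 tokens
```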
Before we dive deeper into advanced TensorFlow features such as Transfer Learning, Reinforcement Learning, Generative Networks, and Distributed TensorFlow, in the next chapter we shall see how to take TensorFlow models into production.