In this chapter, we covered the idea of representation learning and why it is essential for applying deep learning, or machine learning in general, to inputs that are not already in a real-valued form. We then looked at Word2Vec, a widely adopted technique for converting words into real-valued vectors whose learned embeddings exhibit interesting semantic properties. Finally, we implemented the Word2Vec model using the skip-gram architecture.
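As a refresher, the core of the skip-gram approach can be sketched in a few lines of NumPy: a center word predicts its surrounding context words, and both embedding matrices are updated by gradient descent on a softmax cross-entropy loss. This is a minimal toy sketch on a hypothetical one-sentence corpus, not the chapter's full implementation (which would use a larger corpus and tricks such as negative sampling):

```python
import numpy as np

# Hypothetical toy corpus; real training uses far more text.
corpus = ["the quick brown fox jumps over the lazy dog".split()]
vocab = sorted({w for sent in corpus for w in sent})
w2i = {w: i for i, w in enumerate(vocab)}
V, D, window = len(vocab), 8, 2  # vocab size, embedding dim, context window

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # center-word embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # context-word embeddings

# Collect (center, context) training pairs from the window around each word.
pairs = []
for sent in corpus:
    for i, center in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if j != i:
                pairs.append((w2i[center], w2i[sent[j]]))

lr = 0.1
for epoch in range(200):
    for c, o in pairs:
        v = W_in[c]                      # center-word vector
        scores = W_out @ v               # logits over the vocabulary
        p = np.exp(scores - scores.max())
        p /= p.sum()                     # softmax probabilities
        p[o] -= 1.0                      # d(loss)/d(scores) for cross-entropy
        W_in[c] -= lr * (W_out.T @ p)    # update the center-word vector
        W_out -= lr * np.outer(p, v)     # update the context matrix

# After training, each row of W_in is a learned word vector.
print(W_in[w2i["fox"]].shape)  # (8,)
```

In practice, computing the full softmax over the vocabulary is too expensive for realistic vocabularies, which is why approximations such as negative sampling are used instead.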
In the next chapter, you will see a practical use of these learned representations in a sentiment analysis example, where the input text must be converted to real-valued vectors.