We have already seen the power of neural networks first hand when we used word embeddings. That is one aspect of neural networks – using parts of the architecture itself to extract useful information – but neural networks are far from limited to this. When we start using deeper networks, it is not prudent to use the weights to extract useful information; in these cases, we are more interested in the natural output of the neural network. We can train neural networks to perform multiple text analysis tasks – indeed, for some of these tasks, the introduction of neural networks has completely changed how we approach them.
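A minimal sketch can make this distinction concrete. The toy example below (not from the original text, and using a hypothetical vocabulary size and random data purely for illustration) assumes Keras/TensorFlow: the embedding layer's learned weights play the role of word vectors, while the trained model's predictions are the "natural output" we care about in deeper, task-oriented networks.

```python
# Illustrative sketch: weights-as-embeddings vs. the network's own output.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, GlobalAveragePooling1D, Dense

vocab_size, embed_dim, seq_len = 1000, 32, 20  # hypothetical sizes

model = Sequential([
    Embedding(vocab_size, embed_dim),
    GlobalAveragePooling1D(),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Toy data, only so the model can be trained end to end.
X = np.random.randint(0, vocab_size, size=(64, seq_len))
y = np.random.randint(0, 2, size=(64,))
model.fit(X, y, epochs=1, verbose=0)

# 1. Using part of the architecture itself: the learned embedding weights
#    act as word vectors, much like the word embeddings seen earlier.
word_vectors = model.layers[0].get_weights()[0]  # shape: (vocab_size, embed_dim)

# 2. Using the natural output of the network: the prediction is what matters
#    when the network is trained for a task such as classification.
predictions = model.predict(X[:5])
```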
A popular example here is language translation, and in particular, Google's Neural Translation model. Up until September 2016, Google used statistical and rule-based methods...