Seq2seq models
In 2016, Google announced that it had replaced the entire Google Translate algorithm with a single neural network. The special thing about the Google Neural Machine Translation system is that it translates multiple languages "end-to-end" using only a single model. It works by encoding the semantics of a sentence into a vector and then decoding that representation into the desired output language.
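To make the encode-then-decode idea concrete, here is a minimal sketch of a seq2seq model in PyTorch. It is not Google's system (GNMT uses deep LSTMs with attention); the GRU layers, vocabulary sizes, and dimensions below are hypothetical choices for illustration only.

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    """Minimal encoder-decoder: the encoder compresses the source
    sentence into a fixed-size state, and the decoder unrolls that
    state into the target sequence."""
    def __init__(self, src_vocab, tgt_vocab, emb_dim=64, hid_dim=128):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        # Encode: the final hidden state acts as the "semantic" vector.
        _, state = self.encoder(self.src_emb(src_ids))
        # Decode: condition the target sequence on that vector.
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), state)
        return self.out(dec_out)  # logits over the target vocabulary

# Toy run with made-up vocabulary sizes and random token ids.
model = Seq2Seq(src_vocab=1000, tgt_vocab=1200)
src = torch.randint(0, 1000, (2, 7))  # batch of 2 source sentences
tgt = torch.randint(0, 1200, (2, 5))  # teacher-forced target prefix
logits = model(src, tgt)
print(logits.shape)  # torch.Size([2, 5, 1200])
```

Note that nothing in the architecture is specific to any language pair: the intermediate `state` is just a vector, which is what allows a single model to serve multiple languages.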
The fact that such a system is possible at all baffled many linguists and other researchers, because it demonstrates that machine learning can produce systems that accurately capture high-level semantics without being given any explicit rules.
These semantics are represented as an encoding vector, and although we don't yet fully know how to interpret such vectors, they have many useful applications. Translating from one language to another is one popular application, but a similar approach can be used to "translate" a report into a summary. Text summarization has made great strides...