Skip-thoughts is a popular unsupervised learning algorithm for learning sentence embeddings. We can think of skip-thoughts as analogous to the skip-gram model: in skip-gram, we try to predict the context words given a target word, whereas in skip-thoughts, we try to predict the context sentences given a target sentence. In other words, skip-gram learns word-level vectors, while skip-thoughts learns sentence-level vectors.
The skip-thoughts algorithm is simple: it consists of an encoder-decoder architecture. The encoder maps the input sentence to a vector, and the decoder generates the surrounding sentences, that is, the previous and next sentences of the given input sentence. As shown in the following diagram, the skip-thoughts vector...
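To make the encoder-decoder structure concrete, here is a minimal NumPy sketch of a forward pass. It is only an illustration under toy assumptions: the original skip-thoughts model uses GRUs and large vocabularies, whereas this sketch uses a plain tanh RNN, tiny made-up dimensions, and random untrained weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes; a real skip-thoughts model is far larger.
VOCAB, EMBED, HIDDEN = 20, 8, 10

# Word embeddings, encoder weights, and one set of weights per decoder
# (one decoder for the previous sentence, one for the next sentence).
E = rng.normal(0, 0.1, (VOCAB, EMBED))
W_enc = rng.normal(0, 0.1, (EMBED + HIDDEN, HIDDEN))
W_prev = rng.normal(0, 0.1, (EMBED + 2 * HIDDEN, HIDDEN))
W_next = rng.normal(0, 0.1, (EMBED + 2 * HIDDEN, HIDDEN))
W_out = rng.normal(0, 0.1, (HIDDEN, VOCAB))

def encode(sentence):
    """Map a sentence (a list of token ids) to a fixed-size vector."""
    h = np.zeros(HIDDEN)
    for t in sentence:
        h = np.tanh(np.concatenate([E[t], h]) @ W_enc)
    return h  # the skip-thought vector for this sentence

def decode(thought, target, W):
    """Score a surrounding sentence, conditioning each step on `thought`."""
    h = np.zeros(HIDDEN)
    log_prob = 0.0
    for t in target:
        logits = h @ W_out
        p = np.exp(logits - logits.max())
        p /= p.sum()
        log_prob += np.log(p[t])  # log-likelihood of the observed token
        h = np.tanh(np.concatenate([E[t], thought, h]) @ W)
    return log_prob

# Toy triple of (previous, current, next) sentences as token ids.
prev_s, cur_s, next_s = [1, 4, 2], [3, 5, 7, 2], [6, 8, 2]
v = encode(cur_s)
# Training would minimize this loss: the negative log-likelihood of the
# previous and next sentences given the encoding of the current one.
loss = -(decode(v, prev_s, W_prev) + decode(v, next_s, W_next))
```

After training on a large corpus, only the encoder is kept: `encode` then produces the sentence embedding used for downstream tasks.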