Quick-thoughts is another interesting algorithm for learning sentence embeddings. In skip-thoughts, we saw how the encoder-decoder architecture is used to learn sentence embeddings. In quick-thoughts, instead of reconstructing neighboring sentences, we try to learn whether a given sentence is related to a candidate sentence. So, instead of using a decoder, we use a classifier to learn whether a given input sentence is related to the candidate sentence.
Let s be the input sentence and S_cand be the set of candidate sentences, containing both valid context and invalid context sentences related to the given input sentence s. Let s_cand be any candidate sentence from the set S_cand.
We use two encoding functions, f and g. The role of these two functions, f and g, is to learn the embeddings, that is, to learn the vector representations of the given sentence s and the candidate sentence s_cand, respectively.
Once these two functions generate the embeddings, we use a classifier that scores how related the candidate sentence s_cand is to the input sentence s, and the scores over all candidates are normalized into a probability distribution.
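The scoring step above can be sketched as follows. This is a minimal NumPy illustration, not the actual Quick-Thoughts implementation: the real encoders f and g are RNNs trained end to end, whereas here they are stand-in bag-of-words encoders over two hypothetical embedding tables, and the classifier is the usual inner product followed by a softmax over the candidate set.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8
# hypothetical word-vector table used for illustration only
vocab = {w: rng.normal(size=dim)
         for w in "the cat sat on mat dogs bark loudly".split()}

def encode(sentence, table):
    # stand-in encoder: average the word vectors of the sentence
    # (Quick-Thoughts uses RNN encoders here)
    return np.mean([table[w] for w in sentence.split()], axis=0)

# simulate two separate encoders f and g with two embedding tables
table_f = vocab
table_g = {w: v + rng.normal(scale=0.1, size=dim) for w, v in vocab.items()}

def quick_thoughts_scores(s, candidates):
    u = encode(s, table_f)                                    # f(s)
    vs = np.stack([encode(c, table_g) for c in candidates])   # g(s_cand)
    logits = vs @ u               # classifier: inner product of the embeddings
    probs = np.exp(logits - logits.max())
    return probs / probs.sum()    # softmax over the candidate set

s = "the cat sat"
cands = ["on the mat", "dogs bark loudly"]
probs = quick_thoughts_scores(s, cands)
```

During training, the valid context sentence should receive high probability and the invalid ones low probability, which is what drives f and g to produce useful sentence embeddings.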