Sentence-BERT was introduced by the Ubiquitous Knowledge Processing Lab (UKP-TUDA). As the name suggests, Sentence-BERT is used for obtaining fixed-length sentence representations. Sentence-BERT extends the pre-trained BERT model (or its variants) to obtain sentence representations. Wait! Why do we need Sentence-BERT at all? We can directly use vanilla BERT or its variants to obtain sentence representations, right? Yes, we can!
But one of the challenges with the vanilla BERT model is its high inference time. Say we have a dataset with n sentences; then, to find the sentence pair with the highest similarity, we have to feed every possible pair through the model, which takes about n(n-1)/2 computations.
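To get a feel for how quickly this blows up, here is a quick back-of-the-envelope calculation in plain Python (no BERT involved; the function name `pair_count` is just for illustration):

```python
def pair_count(n: int) -> int:
    """Number of unordered sentence pairs among n sentences: n(n-1)/2."""
    return n * (n - 1) // 2

# The pair count grows quadratically with the dataset size.
for n in (10, 1_000, 10_000):
    print(f"{n:>6} sentences -> {pair_count(n):>11,} pairwise comparisons")
```

With 10,000 sentences, that is already about 50 million forward passes through the model if each pair is scored separately.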
To combat this high inference time, we use Sentence-BERT. Sentence-BERT drastically reduces the inference time of BERT: each sentence is encoded once into a fixed-length vector, and sentence pairs are then compared cheaply in vector space. Sentence-BERT is popularly used in tasks such as sentence pair classification, computing the similarity between two sentences, and so on. Before...
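The key idea can be sketched with toy vectors standing in for Sentence-BERT embeddings (the real model would produce these vectors, one per sentence; here we use random NumPy arrays purely for illustration). Once the n embeddings are computed, all pairwise cosine similarities come from a single matrix product, with no further model inference:

```python
import numpy as np

def cosine_similarity_matrix(embeddings: np.ndarray) -> np.ndarray:
    """Pairwise cosine similarities from pre-computed sentence embeddings."""
    # Normalize each row to unit length, then a matrix product gives
    # the cosine similarity between every pair of sentences at once.
    unit = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    return unit @ unit.T

# Toy stand-ins for SBERT embeddings: 4 "sentences", 8-dimensional vectors.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(4, 8))

sims = cosine_similarity_matrix(embeddings)
print(sims.shape)  # one similarity score per sentence pair
```

So instead of n(n-1)/2 full passes through BERT, we pay for n encoding passes and then only cheap vector operations for every comparison.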