An embedding is a mathematical structure contained within another structure. When we embed an object X in an object Y, the structure of X is preserved inside Y.
Word embedding is a technique for mapping words to vectors, creating a multidimensional space in which similar words receive similar representations. Each word is represented by a single dense vector, often with tens or hundreds of dimensions, in contrast to sparse representations such as one-hot encoding, which can have thousands or even millions of dimensions.
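As a minimal sketch of this idea, the table below maps words to hypothetical, hand-made 3-dimensional vectors; real embeddings are learned from data and typically have far more dimensions:

```python
# Hypothetical hand-made embedding table; real embeddings are learned
# from large text corpora and usually have 50-300 dimensions.
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.1, 0.8],
    "man":   [0.5, 0.9, 0.0],
    "woman": [0.5, 0.1, 0.9],
}

# Each word maps to one short dense vector, whereas a one-hot vector
# would need one dimension per word in the whole vocabulary.
vec = embeddings["king"]
print(len(vec))
```

The lookup itself is just a dictionary access; the value of an embedding comes from how the vectors are positioned relative to one another.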
Once words are represented as vectors, we can apply the same mathematical techniques we would apply to ordinary numbers: we can measure distances and angles between words, and add and subtract them.
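One common such technique is cosine similarity, which measures the angle between two word vectors. A small sketch, using hypothetical hand-made 3-dimensional vectors:

```python
import math

def cosine_similarity(u, v):
    # Cosine of the angle between two vectors: values near 1.0 mean
    # the vectors point in nearly the same direction.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical vectors: "cat" and "dog" point in similar directions,
# "car" points elsewhere.
cat = [0.8, 0.6, 0.1]
dog = [0.7, 0.6, 0.2]
car = [0.1, 0.2, 0.9]

print(cosine_similarity(cat, dog))  # close to 1
print(cosine_similarity(cat, car))  # much smaller
```

In a trained embedding space, semantically related words tend to score high on this measure, which is why it is the standard way to query for "similar words".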
It also means that we can perform operations such as:

king − man + woman ≈ queen
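This famous analogy can be sketched with vector arithmetic. The vectors below are hypothetical and hand-made so that the relationship holds exactly; in a real learned embedding the result is only approximately closest to "queen":

```python
# Hypothetical hand-made vectors; real embeddings learn such
# regularities (e.g. a "gender direction") from text.
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "man":   [0.5, 0.9, 0.0],
    "woman": [0.5, 0.1, 0.9],
    "queen": [0.9, 0.0, 1.0],
}

def sub(u, v):
    return [a - b for a, b in zip(u, v)]

def add(u, v):
    return [a + b for a, b in zip(u, v)]

def distance(u, v):
    # Euclidean distance between two vectors.
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

# Compute the vector for king - man + woman ...
target = add(sub(embeddings["king"], embeddings["man"]), embeddings["woman"])

# ... and find the vocabulary word whose vector is closest to it.
nearest = min(embeddings, key=lambda w: distance(embeddings[w], target))
print(nearest)  # queen
```

Note that practical systems usually exclude the query words (king, man, woman) from the candidate set before picking the nearest neighbor; with these toy vectors "queen" wins either way.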