There is a wealth of word embeddings to choose from for our vectorization tasks. The original implementations of these methods are scattered across different languages, hosting websites, binaries, and repositories – but luckily for us, Gensim comes to the rescue again, with implementations or well-documented wrappers for most (if not all) of the other word embeddings.
Gensim has wrappers for WordRank, VarEmbed, and FastText, as well as native implementations of FastText and Poincaré embeddings. Gensim also provides a neat script for converting GloVe embeddings into a format it can load, which comes in handy when comparing different kinds of embeddings.
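To see what the GloVe conversion script actually does, note that the GloVe text format has no header line, while the word2vec text format begins with a line holding the vocabulary size and vector size. The sketch below reproduces that transformation in plain Python (the function name and the tiny two-word vocabulary are made up for illustration; in practice you would use Gensim's own `gensim.scripts.glove2word2vec` script):

```python
def glove_to_word2vec_lines(glove_lines):
    """Prepend a word2vec-style header to lines in the GloVe text format.

    This mimics what Gensim's glove2word2vec conversion does: the word2vec
    text format starts with a "<vocab_size> <vector_size>" header line.
    """
    glove_lines = list(glove_lines)
    vocab_size = len(glove_lines)
    # Vector size = tokens per line minus one (the word itself)
    vector_size = len(glove_lines[0].split()) - 1
    return ["%d %d" % (vocab_size, vector_size)] + glove_lines

# A toy GloVe-format vocabulary of two words with 3-dimensional vectors
glove = ["the 0.1 0.2 0.3", "cat 0.4 0.5 0.6"]
converted = glove_to_word2vec_lines(glove)
print(converted[0])  # prints the header line: 2 3
```

Once converted, the file can be loaded like any word2vec-format file, so GloVe vectors slot into the same workflow as the other embeddings.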
Gensim's KeyedVectors class gives us a common base class for working with all our word embeddings. The documentation page [21] covers most of the information you need to know (though we have already used these...
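At its core, the similarity queries that KeyedVectors answers (such as `similarity(w1, w2)`) come down to cosine similarity between stored word vectors. As a self-contained sketch of that idea, using a made-up three-word vocabulary rather than real trained embeddings:

```python
import math

# Toy word vectors; real KeyedVectors would hold trained embeddings
vectors = {
    "king":  [0.5, 0.7, 0.1],
    "queen": [0.4, 0.8, 0.05],
    "apple": [0.9, 0.1, 0.6],
}

def similarity(w1, w2):
    """Cosine similarity between two word vectors, as KeyedVectors computes it."""
    v1, v2 = vectors[w1], vectors[w2]
    dot = sum(a * b for a, b in zip(v1, v2))
    norm1 = math.sqrt(sum(a * a for a in v1))
    norm2 = math.sqrt(sum(a * a for a in v2))
    return dot / (norm1 * norm2)

# Semantically "close" toy vectors score higher than unrelated ones
print(similarity("king", "queen") > similarity("king", "apple"))  # True
```

Because every embedding Gensim supports ends up behind this same interface, you can swap Word2Vec, FastText, or converted GloVe vectors in and out without changing your downstream code.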