Chapter 9: Cross-Lingual and Multilingual Language Modeling
Up to this point, you have learned a lot about transformer-based architectures, from encoder-only to decoder-only models, and from efficient transformers to long-context transformers. You have also learned about semantic text representation based on Siamese networks. However, we discussed all of these models in terms of monolingual problems, assuming that each model understands only a single language and is not capable of a general understanding of text regardless of the language itself. In fact, some of these models have multilingual variants: Multilingual Bidirectional Encoder Representations from Transformers (mBERT), Multilingual Text-to-Text Transfer Transformer (mT5), and Multilingual Bidirectional and Auto-Regressive Transformer (mBART), to name but a few. On the other hand, some models are specifically designed for multilingual purposes and are trained with cross-lingual objectives. For example, Cross-lingual Language...
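As a quick illustration of what a "multilingual variant" means in practice, the following minimal sketch (assuming the Hugging Face transformers library and the publicly released bert-base-multilingual-cased checkpoint) shows a single mBERT model encoding sentences in two different languages with one shared vocabulary and one set of weights:

```python
from transformers import AutoTokenizer, AutoModel
import torch

# Load the multilingual BERT checkpoint: one tokenizer and one set of
# weights shared across roughly one hundred languages.
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")

sentences = [
    "This is a sentence in English.",
    "Dies ist ein Satz auf Deutsch.",  # the same idea, in German
]

# Both sentences pass through the same model; no language flag is needed.
inputs = tokenizer(sentences, padding=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One contextual embedding per token, for both languages, from one model.
print(outputs.last_hidden_state.shape)  # (2, sequence_length, 768)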