To assess the knowledge you have acquired in this chapter, try answering the following questions:
- What is M-BERT?
- How is M-BERT pre-trained?
- What effect does word order have on M-BERT?
- Define code switching and transliteration.
- How is the XLM model pre-trained?
- How does TLM differ from other pre-training strategies?
- Define FLUE.