Chapter 15, From NLP to Task-Agnostic Transformer Models
- Reformer transformer models don’t contain encoders. (True/False)
False. Reformer transformer models contain encoders.
- Reformer transformer models don’t contain decoders. (True/False)
False. Reformer transformer models contain encoders and decoders.
- The inputs are stored layer by layer in Reformer models. (True/False)
False. The inputs are not stored; they are recomputed layer by layer during backpropagation, thus saving memory.
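The recomputation trick relies on reversible residual layers: a block's inputs can be reconstructed exactly from its outputs, so activations need not be stored. The sketch below is a minimal illustration of that idea with two arbitrary sub-functions standing in for attention and feed-forward; it is not the Reformer implementation itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary stand-ins for the attention and feed-forward sub-layers.
W_f = rng.standard_normal((4, 4))
W_g = rng.standard_normal((4, 4))

def F(x):
    return np.tanh(x @ W_f)

def G(x):
    return np.tanh(x @ W_g)

def forward(x1, x2):
    # Reversible residual block: the outputs alone determine the inputs.
    y1 = x1 + F(x2)
    y2 = x2 + G(y1)
    return y1, y2

def invert(y1, y2):
    # Recompute the inputs from the outputs instead of storing them.
    x2 = y2 - G(y1)
    x1 = y1 - F(x2)
    return x1, x2

x1, x2 = rng.standard_normal((2, 4)), rng.standard_normal((2, 4))
y1, y2 = forward(x1, x2)
r1, r2 = invert(y1, y2)
print(np.allclose(x1, r1) and np.allclose(x2, r2))  # True
```

Because `invert` recovers `x1` and `x2` exactly, a reversible network can discard intermediate activations and rebuild them on the backward pass, trading compute for memory.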
- DeBERTa transformer models disentangle content and positions. (True/False)
True.
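Disentangling means each token is represented by two separate vectors, one for its content and one for its position, and the attention score mixes them as content-to-content, content-to-position, and position-to-content terms. The following is a heavily simplified numpy sketch of that score decomposition, not DeBERTa's actual relative-position indexing:

```python
import numpy as np

rng = np.random.default_rng(1)
seq, d = 3, 4

# Separate vectors: what a token says (content) vs. where it sits (position).
Hc = rng.standard_normal((seq, d))   # content embeddings
P  = rng.standard_normal((seq, d))   # position embeddings (simplified, absolute here)

Wq = rng.standard_normal((d, d))
Wk = rng.standard_normal((d, d))

Qc, Kc = Hc @ Wq, Hc @ Wk            # content projections
Qr, Kr = P  @ Wq, P  @ Wk            # position projections

# Disentangled score: content-to-content + content-to-position + position-to-content.
scores = (Qc @ Kc.T + Qc @ Kr.T + Kc @ Qr.T) / np.sqrt(3 * d)
weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)
print(weights.shape)  # (3, 3)
```

The point of the decomposition is that content and position contribute independent terms to attention instead of being summed into one input vector first.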
- It is necessary to test the hundreds of pretrained transformer models before choosing one for a project. (True/False)
True and false. Testing hundreds of models is rarely practical: you can evaluate a handful of candidates, or choose a reliable, well-established model and adapt it to fit your needs.
- The latest transformer model is always the best. (True/False)
True and false. A lot of research is in progress, and the latest model is not necessarily the most stable or the best fit for a given project; an established, well-tested model may serve your needs better.