Chapter 2, Getting Started with the Architecture of the Transformer Model
- NLP transduction can encode and decode text representations. (True/False)
True. Transduction in NLP converts written or spoken sequences into numerical representations, processes them, and decodes the results back into text.
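As a minimal sketch of this encode-process-decode idea (the toy vocabulary and the token-to-ID mapping below are purely illustrative assumptions, not the book's method):

```python
# Minimal sketch of NLP transduction: text -> numbers -> text.
# The vocabulary is a hypothetical toy example.
vocab = {"the": 0, "cat": 1, "sat": 2, "down": 3}
inv_vocab = {i: w for w, i in vocab.items()}

def encode(text: str) -> list[int]:
    """Convert a whitespace-tokenized sentence into token IDs."""
    return [vocab[token] for token in text.lower().split()]

def decode(ids: list[int]) -> str:
    """Convert token IDs back into text."""
    return " ".join(inv_vocab[i] for i in ids)

ids = encode("The cat sat down")   # [0, 1, 2, 3]
print(ids)
print(decode(ids))                 # "the cat sat down"
```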
- Natural Language Understanding (NLU) is a subset of Natural Language Processing (NLP). (True/False)
True.
- Language modeling algorithms generate probable sequences of words based on input sequences. (True/False)
True.
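For illustration only, the hedged sketch below estimates next-word probabilities from bigram counts over a tiny, made-up corpus; real language models learn such distributions over far larger vocabularies and much longer contexts:

```python
from collections import Counter, defaultdict

# Toy bigram language model: estimates P(next_word | current_word)
# from a tiny, made-up corpus. Purely illustrative.
corpus = "the cat sat on the mat the cat slept on the mat".split()

bigram_counts = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    bigram_counts[current][nxt] += 1

def next_word_probs(word: str) -> dict[str, float]:
    """Return the estimated probability of each possible next word."""
    counts = bigram_counts[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(next_word_probs("the"))   # {'cat': 0.5, 'mat': 0.5}
```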
- A transformer is a customized LSTM with a CNN layer. (True/False)
False. A transformer contains neither LSTM nor CNN layers; it relies on attention mechanisms instead.
- A transformer does not contain LSTM or CNN layers. (True/False)
True.
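As a quick, hedged check (assuming PyTorch is installed), inspecting the submodules of a standard encoder layer shows attention, linear, normalization, and dropout components, but no recurrent or convolutional layers:

```python
import torch.nn as nn

# A standard Transformer encoder layer combines multi-head self-attention,
# a position-wise feed-forward network, layer normalization, and dropout.
layer = nn.TransformerEncoderLayer(d_model=512, nhead=8)

# Collect the distinct submodule class names; neither LSTM nor Conv appears.
module_types = sorted({type(m).__name__ for m in layer.modules()})
print(module_types)
assert not any("LSTM" in name or "Conv" in name for name in module_types)
```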
- Attention examines all the tokens in a sequence, not just the last one. (True/False)
True.
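A minimal NumPy sketch of scaled dot-product attention (the sequence length, model size, and random values are illustrative assumptions) makes this visible: every row of the attention-weight matrix distributes probability over all token positions, not only the last one:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each query attends to every key position in the sequence."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (seq_len, seq_len) scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over all tokens
    return weights @ V, weights

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8                             # toy sizes, illustrative only
Q = rng.normal(size=(seq_len, d_model))
K = rng.normal(size=(seq_len, d_model))
V = rng.normal(size=(seq_len, d_model))

output, weights = scaled_dot_product_attention(Q, K, V)
print(weights.shape)         # (4, 4): every token attends to all 4 tokens
print(weights.sum(axis=-1))  # each row sums to 1.0
```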
- A transformer does not use positional encoding...