Chapter 1, Getting Started with the Model Architecture of the Transformer
- NLP transduction can encode and decode text representations. (True/False)
True. NLP transduction converts sequences (written or spoken) into numerical representations, processes them, and decodes the results back into text.
- Natural Language Understanding (NLU) is a subset of Natural Language Processing (NLP). (True/False)
True.
- Language modeling algorithms generate probable sequences of words based on input sequences. (True/False)
True.
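As a toy illustration of the idea above (an assumption for illustration only: a simple bigram count model stands in for a full language model), the probability of the next word can be estimated from how often it follows the current word in a corpus:

```python
# Toy bigram language model: counts which word follows which,
# then turns counts into next-word probabilities.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ran".split()

# Count each adjacent word pair (bigram) in the corpus.
bigrams = defaultdict(Counter)
for w1, w2 in zip(corpus, corpus[1:]):
    bigrams[w1][w2] += 1

def next_word_probs(word):
    """Return P(next word | word) from bigram counts."""
    counts = bigrams[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

probs = next_word_probs("the")
print(probs)  # "cat" is more probable than "mat" after "the"
```

A transformer replaces these raw counts with learned representations, but the output is the same kind of object: a probability distribution over possible next tokens.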
- A transformer is a customized LSTM with a CNN layer. (True/False)
False. A transformer contains neither an LSTM nor a CNN; it relies entirely on attention mechanisms.
- A transformer does not contain LSTM or CNN layers. (True/False)
True.
- Attention examines all of the tokens in a sequence, not just the last one. (True/False)
True.
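The point above can be made concrete with a minimal sketch of scaled dot-product attention (using NumPy and toy dimensions, an assumption for illustration): the attention weight matrix has one row per token and one column per token, so every token attends to the whole sequence rather than only the last position.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal scaled dot-product attention for a single head."""
    d_k = K.shape[-1]
    # Scores compare every query token with every key token:
    # the result is a (seq_len, seq_len) matrix of all pairs.
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over each row: weights across ALL tokens sum to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
x = rng.normal(size=(seq_len, d_model))

# Self-attention: queries, keys, and values all come from the sequence.
out, w = scaled_dot_product_attention(x, x, x)
print(w.shape)  # (4, 4): each of the 4 tokens weighs all 4 tokens
```

Each row of `w` is a probability distribution over the entire sequence, which is exactly what "examines all of the tokens, not just the last one" means.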
- A transformer uses a positional vector, not positional encoding. (True/False)
False. A transformer uses positional encoding, not a separate positional vector: position information is added directly to each word embedding so that every token representation carries its position in the sequence.
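The sinusoidal positional encoding can be sketched as follows (a minimal NumPy version, assuming the standard sine/cosine formulation in which even dimensions use sine and odd dimensions use cosine):

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding.

    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    """
    pos = np.arange(seq_len)[:, None]          # (seq_len, 1)
    i = np.arange(0, d_model, 2)[None, :]      # even dimension indices
    angles = pos / np.power(10000.0, i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)               # even dims: sine
    pe[:, 1::2] = np.cos(angles)               # odd dims:  cosine
    return pe

pe = positional_encoding(10, 16)
print(pe.shape)  # (10, 16): one encoding row per position
# pe would then be added element-wise to the word embeddings.
```

Because the encoding is added to (not concatenated with) the embeddings, no extra positional vector is carried through the model.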