Chapter 7, Applying Transformers to Legal and Financial Documents for AI Text Summarization
- T5 models only have encoder stacks like BERT models. (True/False)
False. Unlike BERT models, which are encoder-only, T5 models have both an encoder stack and a decoder stack.
- T5 models have both encoder and decoder stacks. (True/False)
True.
- T5 models use relative positional encoding, not absolute positional encoding. (True/False)
True. Instead of encoding absolute positions, T5 adds learned relative-position biases to the attention scores.
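For reference, a sketch of the mechanism behind this answer, shown in the standard scaled dot-product form with T5's bias term added (T5's implementation folds the scaling into its initialization):

$$
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}} + B\right)V, \qquad B_{ij} = b_{\,\mathrm{bucket}(j-i)}
$$

where each attention head learns one scalar bias $b$ per relative-distance bucket.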
- Text-to-text models are only designed for summarization. (True/False)
False. The text-to-text format covers many NLP tasks, including translation, classification, question answering, and summarization.
- Text-to-text models apply a prefix to the input sequence that determines the NLP task. (True/False)
True. A prefix such as summarize: or translate English to German: tells the model which task to perform.
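A minimal sketch of the prefix mechanism, assuming the Hugging Face transformers and sentencepiece packages are installed and using the public t5-small checkpoint (the input sentence is illustrative):

```python
# Minimal sketch: the task prefix selects the NLP task for a T5 model.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

text = "The agreement shall terminate if either party fails to perform its obligations."

# "summarize: " selects summarization; the same model with
# "translate English to German: " would perform translation instead.
inputs = tokenizer("summarize: " + text, return_tensors="pt")
summary_ids = model.generate(inputs.input_ids, max_length=40, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

Because only the prefix changes, the same model and generation settings serve every task, which is what the hyperparameter questions that follow refer to.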
- T5 models require specific hyperparameters for each task. (True/False)
False. A key point of the text-to-text framework is that the same model, loss function, and hyperparameters can be reused across tasks.
- One of the advantages of text-to-text models is that they use the same hyperparameters for all NLP tasks. (True/False)
True.
- T5 transformers do not contain a feedforward network. (True/False)
False. Like the original Transformer, each T5 layer contains a feedforward sublayer in addition to its attention sublayers.
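One way to check this directly, assuming the transformers package is installed, is to print a T5 block and look for the feedforward sublayer:

```python
# Printing one encoder block of t5-small shows a self-attention sublayer
# followed by a feedforward sublayer.
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained("t5-small")
print(model.encoder.block[0])
```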
- NLP text summarization works for any text. (True/False)
False. Summarization quality depends on the text; dense legal and financial documents, the subject of this chapter, remain difficult to summarize well.
- Hugging Face is a framework...