Next steps
There is no easy way to implement question answering, and no shortcuts. We began to implement methods that can generate questions automatically. Automatic question generation is a critical aspect of NLP.
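One direction for automatic question generation is to build questions from the output of other NLP tasks such as NER. The following is a minimal sketch of that idea, assuming we already have (text, label) entity pairs from an NER model; the entity labels, templates, and sample entities here are hypothetical illustrations, not the chapter's implementation.

```python
# Hypothetical templates mapping NER labels to question forms.
TEMPLATES = {
    "PERSON": "Who is {}?",
    "GPE": "Where is {}?",
    "ORG": "What is {}?",
}

def generate_questions(entities):
    """Turn (text, label) entity pairs into simple template questions."""
    questions = []
    for text, label in entities:
        template = TEMPLATES.get(label)
        if template:
            questions.append(template.format(text))
    return questions

# Sample entities, as an NER model might return them.
entities = [("Jo", "PERSON"), ("Los Angeles", "GPE")]
print(generate_questions(entities))  # ['Who is Jo?', 'Where is Los Angeles?']
```

Template generation like this is brittle, which is precisely why combining several tasks, as discussed below, matters.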
More transformer models need to be pretrained on multi-task datasets that contain NER, SRL, and question-answering problems. Project managers also need to learn how to combine several NLP tasks to solve a specific one, such as question answering.
Coreference resolution could be a good contribution to help our model identify the main subjects in the sequence we worked on. This result, produced with AllenNLP, shows an interesting analysis:
Figure 10.8: Coreference resolution of a sequence
We could continue to develop our program by adding the output of coreference resolution:
Set0={'Los Angeles', 'the city', 'LA'}
Set1={'Jo and Maria', 'their', 'they'}
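The two sets above can be used to normalize mentions back to a canonical entity. Here is a minimal sketch, assuming clusters shaped like Set0 and Set1; the `canonical_mention` helper and its longest-mention heuristic are hypothetical illustrations, not AllenNLP's API.

```python
# Coreference clusters, taken from the sets above.
clusters = [
    {"Los Angeles", "the city", "LA"},   # Set0
    {"Jo and Maria", "their", "they"},   # Set1
]

def canonical_mention(mention, clusters):
    """Return a canonical mention for `mention`: the longest string in
    its cluster (a simple heuristic), or the mention itself if it
    belongs to no cluster."""
    for cluster in clusters:
        if mention in cluster:
            return max(cluster, key=len)
    return mention

print(canonical_mention("LA", clusters))     # Los Angeles
print(canonical_mention("their", clusters))  # Jo and Maria
```

Replacing every mention in a sequence with its canonical form in this way would give the question-answering program one consistent name per subject.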
We could add coreference resolution as a pretraining...