Next steps
There is no easy way to implement question-answering, and there are no shortcuts. We began to implement methods that can generate questions automatically. Automatic question generation is a critical aspect of NLP.
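As a rough illustration, the following sketch shows one way an automatic question-generation step could be wired into a program. It assumes a Hugging Face `transformers` installation and a T5 checkpoint fine-tuned for answer-aware question generation; the model name and the highlight-based input format are assumptions to verify against the model card, not part of this chapter's code.

```python
# A minimal sketch of answer-aware question generation, assuming a
# T5 checkpoint fine-tuned for this task (model name and input format
# are assumptions to check on the Hugging Face Hub).
from transformers import pipeline

qg = pipeline("text2text-generation", model="valhalla/t5-base-qg-hl")

# Assumed convention: mark the answer span with <hl> tokens and
# prefix the text with "generate question:".
context = ("generate question: Jo and Maria drove to <hl> Los Angeles <hl> "
           "to visit their friends.")

questions = qg(context, max_length=64)
print(questions[0]["generated_text"])
# e.g. a question along the lines of "Where did Jo and Maria drive to?"
```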
More transformer models need to be pretrained on multi-task datasets that contain NER, SRL, and question-answering problems. Project managers also need to learn how to combine several NLP tasks to help solve a specific one, such as question-answering.
Coreference resolution, https://demo.allennlp.org/coreference-resolution, could have helped our model identify the main subjects in the sequence we worked on. The result produced with AllenNLP shows an interesting analysis:
![Coreference resolution of a sequence](https://static.packt-cdn.com/products/9781803247335/graphics/Images/B17948_11_08.png)
Figure 11.8: Coreference resolution of a sequence
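The same kind of analysis can be reproduced in a program rather than in the demo interface. The following is a minimal sketch using AllenNLP's pretrained coreference predictor; the package imports, the model archive URL, and the example sentence are assumptions that may need to be adjusted for your environment.

```python
# A minimal sketch of running coreference resolution with AllenNLP
# (requires the allennlp and allennlp-models packages; the model
# archive URL is an assumption and may change between releases).
from allennlp.predictors.predictor import Predictor
import allennlp_models.coref  # registers the coreference model and predictor

predictor = Predictor.from_path(
    "https://storage.googleapis.com/allennlp-public-models/"
    "coref-spanbert-large-2021.03.10.tar.gz"
)

# A sequence in the spirit of the one analyzed in the demo above.
document = ("Jo and Maria drove to Los Angeles. They had a lot of fun in "
            "the city. LA is where their friends live.")

result = predictor.predict(document=document)
print(result["clusters"])   # spans of coreferring mentions (token indices)
print(result["document"])   # the tokenized document the spans refer to
```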
We could continue to develop our program by adding the output of coreference resolution:
Set0={'Los Angeles', 'the city', 'LA'}
Set1={'Jo and Maria', 'their', 'they'}
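A short helper along these lines could turn the predictor's output into such sets. This is a sketch that assumes the `result` dictionary returned by the AllenNLP predictor above, whose `clusters` entry holds inclusive token-index spans into the tokenized `document`.

```python
# Sketch: turn AllenNLP coreference clusters into sets of mention strings,
# producing output in the spirit of Set0 and Set1 above.
def clusters_to_sets(result):
    tokens = result["document"]          # tokenized input document
    mention_sets = []
    for cluster in result["clusters"]:   # each cluster: a list of [start, end] spans
        mentions = {" ".join(tokens[start:end + 1]) for start, end in cluster}
        mention_sets.append(mentions)
    return mention_sets

for i, mention_set in enumerate(clusters_to_sets(result)):
    print(f"Set{i}={mention_set}")
```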
We could add coreference...