References
- Richard Socher, Alex Perelygin, Jean Wu, Jason Chuang, Christopher Manning, Andrew Ng, and Christopher Potts, 2013, Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank: https://nlp.stanford.edu/~socherr/EMNLP2013_RNTN.pdf
- Hugging Face pipelines documentation: https://huggingface.co/transformers/main_classes/pipelines.html
- Hugging Face model hub: https://huggingface.co/models
- Hugging Face Transformers documentation: https://huggingface.co/transformers/
- Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, Mike Lewis, Luke Zettlemoyer, and Veselin Stoyanov, 2019, RoBERTa: A Robustly Optimized BERT Pretraining Approach: https://arxiv.org/pdf/1907.11692.pdf
- The Allen Institute for AI: https://allennlp.org/
- The Allen Institute for AI sentiment analysis demo: https://demo.allennlp.org/sentiment-analysis
- RoBERTa-large model contribution by Zhaofeng Wu: https://zhaofengwu.github.io/
- The Stanford Sentiment Treebank: https://nlp.stanford.edu/sentiment/treebank.html