References
- Training dataset (IMDB): https://huggingface.co/datasets/imdb
- Hugging Face AutoTrain documentation: https://huggingface.co/docs/autotrain/index
- BERT paper: Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova (2019), BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding: https://arxiv.org/abs/1810.04805
Join our community on Discord
Join our community’s Discord space to discuss the book with the author and other readers:
https://packt.link/llm