Generative AI with Python and TensorFlow 2

Create images, text, and music with VAEs, GANs, LSTMs, Transformer models

Product type: Paperback
Published: April 2021
Publisher: Packt
ISBN-13: 9781800200883
Length: 488 pages
Edition: 1st Edition
Authors (2): Raghav Bali and Joseph Babcock
Table of Contents (16 chapters)

Preface
1. An Introduction to Generative AI: "Drawing" Data from Models
2. Setting Up a TensorFlow Lab
3. Building Blocks of Deep Neural Networks
4. Teaching Networks to Generate Digits
5. Painting Pictures with Neural Networks Using VAEs
6. Image Generation with GANs
7. Style Transfer with GANs
8. Deepfakes with GANs
9. The Rise of Methods for Text Generation
10. NLP 2.0: Using Transformers to Generate Text
11. Composing Music with Generative Models
12. Play Video Games with Generative AI: GAIL
13. Emerging Applications in Generative AI
14. Other Books You May Enjoy
15. Index

NLP 2.0: Using Transformers to Generate Text

As we saw in the previous chapter, the NLP domain has seen some remarkable leaps in the way we understand, represent, and process textual data. From handling long-range dependencies and sequences using LSTMs and GRUs to building dense vector representations using word2vec and related models, the field has seen drastic improvements. With word embeddings becoming almost the de facto representation method and LSTMs the workhorse for NLP tasks, we were hitting roadblocks to further enhancement. This combination of embeddings and LSTMs was put to best use in encoder-decoder style models and related architectures, as illustrated in the sketch below.
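As a point of reference, the following is a minimal sketch of that pairing: an embedding layer feeding an LSTM encoder-decoder, written with TensorFlow 2 and Keras. The vocabulary size, embedding dimension, and hidden size are illustrative placeholders rather than values taken from this book.

```python
import tensorflow as tf
from tensorflow.keras import layers

# Illustrative hyperparameters (placeholders, not from the book).
VOCAB_SIZE = 10_000   # size of the source/target vocabularies
EMBED_DIM = 128       # dense word-embedding dimension
HIDDEN_DIM = 256      # LSTM state size

# Encoder: embed the source tokens and compress them into a fixed-size state.
encoder_inputs = layers.Input(shape=(None,), dtype="int32", name="source_tokens")
enc_emb = layers.Embedding(VOCAB_SIZE, EMBED_DIM, mask_zero=True)(encoder_inputs)
_, state_h, state_c = layers.LSTM(HIDDEN_DIM, return_state=True)(enc_emb)

# Decoder: generate the target sequence conditioned on the encoder's final state.
decoder_inputs = layers.Input(shape=(None,), dtype="int32", name="target_tokens")
dec_emb = layers.Embedding(VOCAB_SIZE, EMBED_DIM, mask_zero=True)(decoder_inputs)
dec_out = layers.LSTM(HIDDEN_DIM, return_sequences=True)(
    dec_emb, initial_state=[state_h, state_c]
)
logits = layers.Dense(VOCAB_SIZE)(dec_out)

seq2seq = tf.keras.Model([encoder_inputs, decoder_inputs], logits)
seq2seq.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
seq2seq.summary()
```

Notice that the entire source sequence has to be squeezed into a single fixed-size state vector before decoding begins; this bottleneck is one of the roadblocks that the attention-based approaches discussed in this chapter were designed to remove.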

We saw briefly in the previous chapter how certain improvements were achieved through the research and application of CNN-based architectures for NLP use cases. In this chapter, we will touch upon the next set of enhancements that led to the development of current state-of-the-art transformer architectures...
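To give a flavour of the core operation those architectures are built on, here is a minimal sketch of scaled dot-product attention in TensorFlow 2. The tensor shapes and random inputs are purely illustrative and not drawn from this book.

```python
import tensorflow as tf

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = tf.cast(tf.shape(k)[-1], tf.float32)
    scores = tf.matmul(q, k, transpose_b=True) / tf.sqrt(d_k)
    if mask is not None:
        scores += mask * -1e9  # push masked positions toward zero attention weight
    weights = tf.nn.softmax(scores, axis=-1)
    return tf.matmul(weights, v), weights

# Self-attention over a batch of 2 sequences of length 5 with model dimension 8.
x = tf.random.normal((2, 5, 8))
output, attn = scaled_dot_product_attention(x, x, x)
print(output.shape, attn.shape)  # (2, 5, 8) (2, 5, 5)
```

Unlike the recurrent encoder above, every position here attends directly to every other position, so long-range dependencies no longer have to be carried through a chain of hidden states.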
