Exploring Deepfakes

You're reading from Exploring Deepfakes: Deploy powerful AI techniques for face replacement and more with this comprehensive guide

Product type: Paperback
Published: Mar 2023
Publisher: Packt
ISBN-13: 9781801810692
Length: 192 pages
Edition: 1st Edition
Authors (2): Matt Tora, Bryan Lyon
Table of Contents (15)

Preface
Part 1: Understanding Deepfakes
Chapter 1: Surveying Deepfakes
Chapter 2: Examining Deepfake Ethics and Dangers
Chapter 3: Acquiring and Processing Data
Chapter 4: The Deepfake Workflow
Part 2: Getting Hands-On with the Deepfake Process
Chapter 5: Extracting Faces
Chapter 6: Training a Deepfake Model
Chapter 7: Swapping the Face Back into the Video
Part 3: Where to Now?
Chapter 8: Applying the Lessons of Deepfakes
Chapter 9: The Future of Generative AI
Index
Other Books You May Enjoy

Generating text

Text generation models burst into the public consciousness with OpenAI's success with ChatGPT in 2022, but text generation was actually among the first uses of AI. Eliza, the first chatbot ever developed, appeared back in 1966, when few people outside of technical circles had even seen a computer; the personal computer wouldn't be invented for another 5 years, in 1971. However, it's only recently that truly impressive chatbots have been developed.

Recent developments

A type of model called the transformer is responsible for the recent burst of progress in language models. Transformers are neural networks built around a layer called the attention layer. An attention layer works somewhat like a spotlight, focusing on the parts of the data that are most likely to be important. This lets transformers (and other models that use attention layers) be much deeper without losing "focus"...
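
To make the "spotlight" idea a little more concrete, here is a minimal sketch of scaled dot-product attention, the core operation inside an attention layer. This is an illustrative NumPy implementation, not code from the book; the function name and the toy dimensions are assumptions chosen for clarity.

```python
# A minimal sketch of scaled dot-product attention, assuming NumPy is available.
import numpy as np

def scaled_dot_product_attention(queries, keys, values):
    """Compute attention weights and return the weighted sum of values.

    queries, keys, values: arrays of shape (sequence_length, d_model).
    """
    d_k = queries.shape[-1]
    # Similarity score between every query position and every key position.
    scores = queries @ keys.T / np.sqrt(d_k)
    # Softmax turns each row of scores into weights that sum to 1 --
    # this is the "spotlight" that concentrates on the most relevant positions.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # Each output vector is a weighted mix of the value vectors.
    return weights @ values

# Toy usage: self-attention over a sequence of 4 tokens in 8 dimensions.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
output = scaled_dot_product_attention(x, x, x)
print(output.shape)  # (4, 8)
```

Because every position can attend directly to every other position, information does not have to pass through many intermediate steps, which is one reason deep stacks of these layers keep their "focus".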
