Generative AI with Python and TensorFlow 2

Product type: Book
Published: Apr 2021
Publisher: Packt
ISBN-13: 9781800200883
Pages: 488
Edition: 1st
Authors: Joseph Babcock, Raghav Bali

Table of Contents (16 chapters)

Preface
1. An Introduction to Generative AI: "Drawing" Data from Models
2. Setting Up a TensorFlow Lab
3. Building Blocks of Deep Neural Networks
4. Teaching Networks to Generate Digits
5. Painting Pictures with Neural Networks Using VAEs
6. Image Generation with GANs
7. Style Transfer with GANs
8. Deepfakes with GANs
9. The Rise of Methods for Text Generation
10. NLP 2.0: Using Transformers to Generate Text
11. Composing Music with Generative Models
12. Play Video Games with Generative AI: GAIL
13. Emerging Applications in Generative AI
14. Other Books You May Enjoy
15. Index

Stacking Restricted Boltzmann Machines to generate images: the Deep Belief Network

You have seen that an RBM with a single hidden layer can be used to learn a generative model of images; in fact, theoretical work has shown that, with a sufficiently large number of hidden units, an RBM can approximate any distribution over binary vectors [19]. However, in practice, for very large input data it can be more efficient to add additional layers, rather than one enormous hidden layer, allowing a more "compact" representation of the data.
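To make the building block concrete, here is a minimal sketch of a Bernoulli-Bernoulli RBM in TensorFlow 2, trained with single-step contrastive divergence (CD-1), the standard approximate training procedure for RBMs. The class name RBM, its methods, and the learning rate are illustrative assumptions, not the book's own implementation.

import tensorflow as tf

class RBM(tf.keras.layers.Layer):
    """Minimal Bernoulli-Bernoulli RBM (illustrative sketch)."""
    def __init__(self, n_visible, n_hidden, **kwargs):
        super().__init__(**kwargs)
        self.W = self.add_weight(shape=(n_visible, n_hidden),
                                 initializer="random_normal", name="W")
        self.bv = self.add_weight(shape=(n_visible,), initializer="zeros", name="bv")
        self.bh = self.add_weight(shape=(n_hidden,), initializer="zeros", name="bh")

    def sample_h(self, v):
        # P(h=1|v), plus a Bernoulli sample drawn from it
        p_h = tf.sigmoid(tf.matmul(v, self.W) + self.bh)
        return p_h, tf.cast(tf.random.uniform(tf.shape(p_h)) < p_h, tf.float32)

    def sample_v(self, h):
        # P(v=1|h), plus a Bernoulli sample drawn from it
        p_v = tf.sigmoid(tf.matmul(h, tf.transpose(self.W)) + self.bv)
        return p_v, tf.cast(tf.random.uniform(tf.shape(p_v)) < p_v, tf.float32)

    def cd_step(self, v0, lr=0.01):
        # One CD-1 update: positive phase from data, negative phase
        # from a single Gibbs reconstruction
        p_h0, h0 = self.sample_h(v0)
        p_v1, v1 = self.sample_v(h0)
        p_h1, _ = self.sample_h(v1)
        dW = tf.matmul(tf.transpose(v0), p_h0) - tf.matmul(tf.transpose(v1), p_h1)
        batch = tf.cast(tf.shape(v0)[0], tf.float32)
        self.W.assign_add(lr * dW / batch)
        self.bv.assign_add(lr * tf.reduce_mean(v0 - v1, axis=0))
        self.bh.assign_add(lr * tf.reduce_mean(p_h0 - p_h1, axis=0))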

Researchers who developed DBNs also noted that adding layers can only improve the lower bound on the log-likelihood of the data under the generative model [20]. In this case, the hidden layer output h of the first layer becomes the input to a second RBM, and we can keep adding layers to make a deeper network, as the sketch below illustrates. Furthermore, if we wanted to make this network capable of learning not only the distribution of the...
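The following sketch illustrates this greedy, layer-wise stacking using the hypothetical RBM class above: each RBM is trained on the hidden-unit probabilities of the one below it. The function name pretrain_dbn, the layer sizes, batch size, and epoch count are all arbitrary assumptions for illustration.

import tensorflow as tf

def pretrain_dbn(data, layer_sizes=(784, 256, 64), epochs=5):
    # data: float32 tensor of binarized inputs, e.g. MNIST images
    # flattened to vectors of length 784
    rbms = []
    v = data
    for n_visible, n_hidden in zip(layer_sizes[:-1], layer_sizes[1:]):
        rbm = RBM(n_visible, n_hidden)
        dataset = tf.data.Dataset.from_tensor_slices(v).batch(64)
        for _ in range(epochs):
            for batch in dataset:
                rbm.cd_step(batch)
        # The hidden-unit probabilities of this RBM become the
        # "visible" data for the next RBM in the stack
        v, _ = rbm.sample_h(v)
        rbms.append(rbm)
    return rbms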
