Python Deep Learning Projects: 9 projects demystifying neural network and deep learning models for building intelligent systems

Product type: Paperback
Published: October 2018
Publisher: Packt
ISBN-13: 9781788997096
Length: 472 pages
Edition: 1st
Authors (3): Rahul Kumar, Matthew Lamons, Abhishek Nagaraja
Table of Contents (17)

Preface
1. Building Deep Learning Environments
2. Training NN for Prediction Using Regression
3. Word Representation Using word2vec
4. Building an NLP Pipeline for Building Chatbots
5. Sequence-to-Sequence Models for Building Chatbots
6. Generative Language Model for Content Creation
7. Building Speech Recognition with DeepSpeech2
8. Handwritten Digits Classification Using ConvNets
9. Object Detection Using OpenCV and TensorFlow
10. Building Face Recognition Using FaceNet
11. Automated Image Captioning
12. Pose Estimation on 3D models Using ConvNets
13. Image Translation Using GANs for Style Transfer
14. Develop an Autonomous Agent with Deep Reinforcement Learning
15. Summary and Next Steps in Your Deep Learning Career
16. Other Books You May Enjoy

Sequence-to-sequence models

In this section, we'll implement a seq2seq model (an encoder-decoder RNN) based on the LSTM unit for a simple question-answering task. The model can be trained to map an input sequence (a question) to an output sequence (an answer), and the two sequences need not be the same length.
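To make the idea concrete before we walk through the diagram, here is a minimal sketch of such an encoder-decoder LSTM in the Keras functional API. The vocabulary sizes, layer dimensions, and variable names are illustrative assumptions, not the book's exact code:

```python
from tensorflow.keras.layers import Dense, Embedding, Input, LSTM
from tensorflow.keras.models import Model

# Illustrative hyperparameters (assumed values, not the book's exact settings)
question_vocab_size = 10000   # input (question) vocabulary size
answer_vocab_size = 10000     # output (answer) vocabulary size
embedding_dim = 128
latent_dim = 256              # dimensionality of the LSTM state

# Encoder: read the question and keep only the final LSTM states
encoder_inputs = Input(shape=(None,), name="question_tokens")
encoder_emb = Embedding(question_vocab_size, embedding_dim)(encoder_inputs)
_, state_h, state_c = LSTM(latent_dim, return_state=True)(encoder_emb)
encoder_states = [state_h, state_c]   # the fixed-size summary handed to the decoder

# Decoder: generate the answer conditioned on the encoder's final states
decoder_inputs = Input(shape=(None,), name="answer_tokens")
decoder_embedding = Embedding(answer_vocab_size, embedding_dim)
decoder_lstm = LSTM(latent_dim, return_sequences=True, return_state=True)
decoder_dense = Dense(answer_vocab_size, activation="softmax")

decoder_outputs, _, _ = decoder_lstm(decoder_embedding(decoder_inputs),
                                     initial_state=encoder_states)
decoder_outputs = decoder_dense(decoder_outputs)

# Training model: (question, answer shifted by one step) -> next-token probabilities
model = Model([encoder_inputs, decoder_inputs], decoder_outputs)
model.compile(optimizer="rmsprop", loss="sparse_categorical_crossentropy")
model.summary()
```

During training, the decoder is fed the answer shifted by one position (teacher forcing) and learns to predict the next answer token at each step.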

This type of seq2seq model has shown impressive performance on a variety of tasks, such as speech recognition, Neural Machine Translation (NMT), question answering, and image caption generation.

The following diagram helps us visualize our seq2seq model:

Illustration of the sequence-to-sequence (seq2seq) model. Each rectangle is an RNN cell; the blue cells form the encoder and the red cells form the decoder.

In the encoder-decoder structure, one RNN (blue) encodes the input sequence. The encoder emits the context C...
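One common way to realize "the encoder emits the context C" in code is to take the encoder's final hidden and cell states and hand them to the decoder as its initial state. The following greedy-decoding sketch reuses the assumed layer names from the earlier snippet; the start/end token IDs are also assumptions:

```python
import numpy as np

# Inference-time models, sharing weights with the layers defined above
encoder_model = Model(encoder_inputs, encoder_states)

state_h_in = Input(shape=(latent_dim,))
state_c_in = Input(shape=(latent_dim,))
step_outputs, step_h, step_c = decoder_lstm(decoder_embedding(decoder_inputs),
                                            initial_state=[state_h_in, state_c_in])
decoder_model = Model([decoder_inputs, state_h_in, state_c_in],
                      [decoder_dense(step_outputs), step_h, step_c])

def answer(question_tokens, start_token=1, end_token=2, max_len=20):
    """Greedily decode an answer one token at a time from the context C."""
    h, c = encoder_model.predict(question_tokens, verbose=0)   # the context C
    target = np.array([[start_token]])
    decoded = []
    for _ in range(max_len):
        probs, h, c = decoder_model.predict([target, h, c], verbose=0)
        next_token = int(np.argmax(probs[0, -1, :]))
        if next_token == end_token:
            break
        decoded.append(next_token)
        target = np.array([[next_token]])
    return decoded
```

In this sketch the encoder model's outputs, the final hidden and cell states, are the fixed-size context summarizing the question; the decoder then unrolls from that context one token at a time.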
