Codeless Deep Learning with KNIME: Build, train, and deploy various deep neural network architectures using KNIME Analytics Platform

Authors: Kathrin Melcher, KNIME AG, Rosaria Silipo
Published by Packt, November 2020 (1st edition). ISBN-13: 9781800566613. 384 pages.
Table of Contents

Preface
Section 1: Feedforward Neural Networks and KNIME Deep Learning Extension
Chapter 1: Introduction to Deep Learning with KNIME Analytics Platform
Chapter 2: Data Access and Preprocessing with KNIME Analytics Platform
Chapter 3: Getting Started with Neural Networks
Chapter 4: Building and Training a Feedforward Neural Network
Section 2: Deep Learning Networks
Chapter 5: Autoencoder for Fraud Detection
Chapter 6: Recurrent Neural Networks for Demand Prediction
Chapter 7: Implementing NLP Applications
Chapter 8: Neural Machine Translation
Chapter 9: Convolutional Neural Networks for Image Classification
Section 3: Deployment and Productionizing
Chapter 10: Deploying a Deep Learning Network
Chapter 11: Best Practices and Other Deployment Options
Other Books You May Enjoy

Summary

We have reached the end of this relatively long chapter. Here, we described three NLP case studies, each one solved by training an LSTM-based RNN on a task framed as a form of time series prediction.

The first case study analyzed movie review texts to extract the sentiment hidden in them. There, we dealt with a simplified version of the problem, framing it as a binary classification (positive versus negative) rather than trying to capture the many possible nuances of user sentiment.
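For readers who want to relate the codeless workflow to the network it assembles, here is a minimal sketch of an LSTM-based binary sentiment classifier written directly in Keras, the library that the KNIME deep learning extension wraps. This is not the book's actual workflow: the vocabulary size, sequence length, and layer sizes below are illustrative assumptions, and the training data is random dummy input used only to show the expected shapes.

```python
# Minimal Keras sketch of an LSTM-based binary sentiment classifier.
# All sizes (vocabulary, sequence length, embedding and LSTM units) are
# illustrative assumptions, not the values used in the book's workflow.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

vocab_size = 20_000   # number of distinct word indices after dictionary encoding
max_len = 80          # reviews truncated / zero-padded to a fixed length

model = keras.Sequential([
    layers.Input(shape=(max_len,)),
    layers.Embedding(input_dim=vocab_size, output_dim=128),  # word index -> dense vector
    layers.LSTM(64),                                         # sequence -> single state vector
    layers.Dense(1, activation="sigmoid"),                   # P(review is positive)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Dummy data with the expected shapes, just to show the training call.
x = np.random.randint(1, vocab_size, size=(32, max_len))
y = np.random.randint(0, 2, size=(32, 1))
model.fit(x, y, epochs=1, batch_size=8, verbose=0)
```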

The second case study was language modeling. Training an RNN on a text corpus written in a given style produces a network capable of generating free text in that style. Depending on the corpus the network is trained on, it can produce fairy tales, Shakespearean dialogue, or even rap songs. We showed an example that generates text in fairy tale style. The same workflow can easily be extended, with even more success, to generate rap songs (R. Silipo, AI generated rap songs, CustomerThink, 2019, https:...
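As a sketch of how such a language model is trained and then used to generate free text, the snippet below builds a tiny character-level model in Keras: the network learns to predict the next character from the preceding window, and new text is produced by sampling from that predicted distribution one character at a time. The toy corpus, window size, and layer sizes are assumptions for illustration, not the settings of the book's workflow.

```python
# Character-level language modeling: predict the next character from a fixed
# window, then generate text by repeatedly sampling and sliding the window.
# Toy corpus and all hyperparameters are illustrative assumptions.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

corpus = "once upon a time there was a princess who lived in a castle " * 50
chars = sorted(set(corpus))
char_to_idx = {c: i for i, c in enumerate(chars)}
window = 20

# Build (window of characters -> next character) training pairs from the corpus.
x = np.array([[char_to_idx[c] for c in corpus[i:i + window]]
              for i in range(len(corpus) - window)])
y = np.array([char_to_idx[corpus[i + window]] for i in range(len(corpus) - window)])

model = keras.Sequential([
    layers.Input(shape=(window,)),
    layers.Embedding(input_dim=len(chars), output_dim=16),
    layers.LSTM(64),
    layers.Dense(len(chars), activation="softmax"),  # distribution over the next character
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(x, y, epochs=3, batch_size=64, verbose=0)

# Generate text: feed the current window, sample the next character, slide the window.
generated = corpus[:window]
for _ in range(100):
    window_idx = np.array([[char_to_idx[c] for c in generated[-window:]]])
    probs = model.predict(window_idx, verbose=0)[0].astype("float64")
    probs /= probs.sum()  # guard against float rounding before sampling
    generated += chars[np.random.choice(len(chars), p=probs)]
print(generated)
```

Swapping in a different corpus (fairy tales, Shakespeare, rap lyrics) is what changes the style of the generated text; the network architecture stays the same.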
