With deep learning gaining rapid mainstream adoption across industries, organizations are looking for ways to unite popular big data tools with highly efficient deep learning libraries, enabling deep learning models to train faster and more efficiently.
With the help of the Apache Spark Deep Learning Cookbook, you'll work through specific recipes to generate outcomes for deep learning algorithms without getting bogged down in theory. From setting up Apache Spark for deep learning to implementing different types of neural networks, this book tackles both common and not-so-common problems in order to perform deep learning in a distributed environment. In addition, you'll get access to deep learning code within Spark that can be reused to answer similar problems or tweaked to answer slightly different ones. You'll also learn how to stream and cluster your data with Spark.

Once you have got to grips with the basics, you'll explore how to implement and deploy deep learning models, such as CNNs, RNNs, and LSTMs, in Spark using popular libraries such as TensorFlow and Keras. Above all, this is a cookbook designed to teach you how to practically apply models on Spark, so we will not dive into the theory and math behind the models, although we will point you to where additional information on each model can be found.
By the end of the book, you'll have the expertise to train and deploy efficient deep learning models on Apache Spark.