Deep learning is a subset of machine learning based on multilayer neural networks that can solve particularly hard, large-scale problems in areas such as natural language processing and image classification. This book addresses both the technical and analytical complexity involved and the speed at which deep learning solutions can be implemented on top of Apache Spark.
The book starts with an explanation of the fundamentals of Apache Spark and deep learning (how to set up Spark for deep learning, the principles of distributed modeling, and the different types of neural networks). It then moves on to the implementation of some deep learning models, such as CNNs, RNNs, and LSTMs, on Spark. Readers will gain hands-on experience of what it takes to build and train these models, and a realistic sense of the complexity involved. Throughout the book, popular deep learning frameworks (primarily DeepLearning4J, along with Keras and TensorFlow) will be used to implement and train distributed models.
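To give an early flavour of the kind of code the chapters build up to, the following is a minimal, hypothetical sketch of distributed training with DeepLearning4J's Spark module. The `SparkDl4jMultiLayer` and `ParameterAveragingTrainingMaster` classes come from the DL4J Spark API; the network layout, hyperparameters, and data-loading step are illustrative assumptions, and exact builder methods vary between DL4J versions:

```scala
import org.apache.spark.sql.SparkSession
import org.deeplearning4j.nn.conf.NeuralNetConfiguration
import org.deeplearning4j.nn.conf.layers.{DenseLayer, OutputLayer}
import org.deeplearning4j.nn.weights.WeightInit
import org.deeplearning4j.spark.impl.multilayer.SparkDl4jMultiLayer
import org.deeplearning4j.spark.impl.paramavg.ParameterAveragingTrainingMaster
import org.nd4j.linalg.activations.Activation
import org.nd4j.linalg.learning.config.Adam
import org.nd4j.linalg.lossfunctions.LossFunctions

object DistributedTrainingSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("dl4j-on-spark-sketch").getOrCreate()
    val sc = spark.sparkContext

    // A small feed-forward network for illustration; the book's chapters use
    // CNN, RNN, and LSTM layers in place of these dense layers.
    val conf = new NeuralNetConfiguration.Builder()
      .seed(12345)
      .weightInit(WeightInit.XAVIER)
      .updater(new Adam(1e-3))
      .list()
      .layer(0, new DenseLayer.Builder()
        .nIn(784).nOut(256).activation(Activation.RELU).build())
      .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
        .nIn(256).nOut(10).activation(Activation.SOFTMAX).build())
      .build()

    // Parameter averaging: each executor trains on its data partition,
    // and parameters are periodically averaged across workers.
    val trainingMaster = new ParameterAveragingTrainingMaster.Builder(32)
      .batchSizePerWorker(32)
      .averagingFrequency(5)
      .build()

    val sparkNet = new SparkDl4jMultiLayer(sc, conf, trainingMaster)

    // trainingData would be an RDD of DataSet objects prepared elsewhere
    // (for example, from MNIST); omitted here for brevity.
    // sparkNet.fit(trainingData)

    spark.stop()
  }
}
```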
The mission of this book is as follows:
- To provide a hands-on guide to implementing deep learning solutions in Scala (and, in some cases, Python) that scale and perform well
- To make readers confident with Spark through a wide range of code examples
- To explain how to choose the model that best addresses a particular deep learning problem or scenario