Configuring PySpark installation with deep learning packages
There is some additional configuration needed within PySpark to use the deep learning package from Databricks, spark-deep-learning. This builds on the setup we did all the way back in Chapter 1, Setting up your Spark Environment for Deep Learning.
Getting ready
This configuration requires making changes in the terminal, using bash.
How to do it...
The following steps configure PySpark with the deep learning package:
- Open the terminal application and type in the following command:
nano .bashrc
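If you want to confirm the function is still there before editing, a quick check from the same terminal (assuming .bashrc is in your home directory) is:
grep -n "sparknotebook" ~/.bashrc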
- Scroll all the way to the bottom of the document and look for the sparknotebook() function we created back in Chapter 1, Setting up your Spark Environment for Deep Learning.
- Update the last line of the function. It should currently look like the following:
$SPARK_HOME/bin/pyspark
Change it to the following:
$SPARK_HOME/bin/pyspark --packages databricks:spark-deep-learning:0.1.0-spark2.1-s_2...
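For reference, the updated sparknotebook() function might end up looking like the following. This is a minimal sketch: the exported variable values and Jupyter driver options are assumptions standing in for whatever you set in Chapter 1, and the 2.11 Scala suffix on the package coordinate is an assumed completion of the truncated version string above, so check spark-packages.org for the exact coordinate matching your Spark version.

function sparknotebook()
{
  # Assumed values standing in for the Chapter 1 setup; adjust to your installation
  export SPARK_HOME=/home/user/spark
  export PYSPARK_PYTHON=python3
  export PYSPARK_DRIVER_PYTHON=jupyter
  export PYSPARK_DRIVER_PYTHON_OPTS="notebook"
  # Updated last line: pull in the Databricks deep learning package on launch
  # (the s_2.11 suffix is an assumption; verify the full coordinate yourself)
  $SPARK_HOME/bin/pyspark --packages databricks:spark-deep-learning:0.1.0-spark2.1-s_2.11
}

After saving, reload the profile with source ~/.bashrc and launch a session by typing sparknotebook; if the package was pulled in correctly, import sparkdl should succeed inside the notebook.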