Creating RDDs with Spark 2.0 using external data sources
In this recipe, we provide you with several examples to demonstrate RDD creation using external sources.
How to do it...
- Start a new project in IntelliJ or in an IDE of your choice. Make sure the necessary JAR files are included.
- Set up the package location where the program will reside:
package spark.ml.cookbook.chapter3
- Import the necessary packages:
import breeze.numerics.pow
import org.apache.spark.sql.SparkSession
import Array._
- Import the packages for setting the logging level for log4j. This step is optional, but we highly recommend it (change the level appropriately as you move through the development cycle):
import org.apache.log4j.Logger
import org.apache.log4j.Level
- Set the logging level to ERROR to cut down on Spark's output. See the previous step for the package requirements.
Logger.getLogger("org").setLevel(Level.ERROR) Logger.getLogger("akka").setLevel(Level.ERROR)
- Set up the Spark context and application parameters so Spark can run, then create an RDD from an external data source. A sketch of this step follows.
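The original listing is cut off at this point, so the following is a minimal sketch of this step under stated assumptions: the master URL (local[*]), the application name (MyExternalSourceApp), and the input path (../data/chapter3/input.txt) are illustrative placeholders, not values from the source. The sketch builds a SparkSession, reaches the underlying SparkContext, and creates an RDD from an external text file:

import org.apache.spark.sql.SparkSession

object MyExternalSourceApp {
  def main(args: Array[String]): Unit = {
    // Build the SparkSession; master and appName are illustrative assumptions
    val spark = SparkSession
      .builder
      .master("local[*]")                     // run locally using all available cores
      .appName("MyExternalSourceApp")         // hypothetical application name
      .config("spark.sql.warehouse.dir", ".") // a common setting in Spark 2.0 examples
      .getOrCreate()

    // Create an RDD from an external data source (a text file here);
    // the path is a placeholder -- point it at a real file on your system
    val lines = spark.sparkContext.textFile("../data/chapter3/input.txt")
    println(s"Number of lines read: ${lines.count()}")

    spark.stop()
  }
}

Once the RDD is created, the usual transformations and actions (map(), filter(), count(), and so on) apply to the data exactly as they would to an RDD built from an in-memory collection.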