Loading DataFrames and setup from an external source
In this recipe, we examine data manipulation using SQL. Spark's approach of providing both a programmatic and a SQL interface works very well in production settings, where we not only require machine learning but also need access to existing data sources via SQL, ensuring compatibility and familiarity with existing SQL-based systems. DataFrames with SQL make for an elegant integration process in real-life settings.
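To make the pattern concrete before the step-by-step walkthrough, here is a minimal, self-contained sketch of querying a DataFrame through SQL. The session settings, sample data, and view name (people) are illustrative assumptions, not the recipe's own code:

import org.apache.spark.sql.SparkSession

object SqlSketch extends App {
  // Illustrative local session; the recipe sets up its own session later
  val spark = SparkSession.builder
    .master("local[*]")
    .appName("SqlSketch")
    .getOrCreate()

  // Hypothetical sample data standing in for an external source
  val df = spark.createDataFrame(Seq(("Alice", 34), ("Bob", 45)))
    .toDF("name", "age")

  // Expose the DataFrame to the SQL interface and query it with plain SQL
  df.createOrReplaceTempView("people")
  spark.sql("SELECT name FROM people WHERE age > 40").show()

  spark.stop()
}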
How to do it...
- Start a new project in IntelliJ or in an IDE of your choice. Make sure the necessary JAR files are included.
- Set up the package location where the program will reside:
package spark.ml.cookbook.chapter3
- Set up the imports related to DataFrames and the required data structures, and create the RDDs as needed for the example:
import org.apache.spark.sql._
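Beyond the wildcard import shown above, a runnable version of this step needs a SparkSession and some data to parallelize. The following is a hedged sketch: the session configuration and the sample sequence are assumptions for illustration, not the book's listing:

import org.apache.spark.sql._

// Assumed local session for this sketch; the recipe may configure its own
val spark = SparkSession.builder
  .master("local[*]")
  .appName("chapter3")
  .getOrCreate()

// Illustrative RDD created from driver-side sample data
val sampleRDD = spark.sparkContext.parallelize(Seq(1, 2, 3, 4, 5))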
- Import the packages for setting up the logging level for log4j. This step is optional, but we highly recommend it (change the level appropriately as you move through the development cycle):
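The imports and settings this step refers to typically look like the following. The exact listing is not shown in the excerpt, so treat this as the standard log4j pattern rather than the book's verbatim code:

import org.apache.log4j.{Level, Logger}

// Silence Spark's INFO/WARN chatter so the example's output stands out;
// relax the level (for example, to Level.WARN) while debugging
Logger.getLogger("org").setLevel(Level.ERROR)
Logger.getLogger("akka").setLevel(Level.ERROR)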