Using DataFrames with standard SQL language - SparkSQL
In this recipe, we demonstrate how to use DataFrame SQL capabilities to perform basic CRUD operations, but nothing prevents you from using the SQL interface provided by Spark at any level of sophistication (that is, full DML) you desire.
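By way of a preview, the pattern used throughout is to register a DataFrame as a temporary view and query it with ordinary SQL. The following is a minimal sketch only; the items view, its sample rows, and the application name are illustrative assumptions, not part of the recipe:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder
  .master("local[*]")
  .appName("DataFrameSqlPreview") // app name is an assumption
  .getOrCreate()
import spark.implicits._

// Build a tiny DataFrame and expose it to the SQL interface as a view
val df = Seq((1, "alpha"), (2, "beta"), (3, "gamma")).toDF("id", "name")
df.createOrReplaceTempView("items")

// Standard SQL runs directly against the registered view
spark.sql("SELECT id, name FROM items WHERE id > 1").show()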
How to do it...
- Start a new project in IntelliJ or in an IDE of your choice. Make sure the necessary JAR files are included.
- Set up the package location where the program will reside:
package spark.ml.cookbook.chapter3
- Set up the imports related to DataFrames and the required data structures, and create the RDDs as needed for the example (a setup sketch follows the import below):
import org.apache.spark.sql._
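As referenced above, a minimal setup sketch might look like the following; the application name and the sample data are assumptions for illustration, not part of the recipe (SparkSession is brought into scope by the import above):

val spark = SparkSession.builder
  .master("local[*]")
  .appName("MyDataFrameApp") // name is an assumption
  .getOrCreate()

// A small RDD to drive the example; the contents are illustrative
val numRdd = spark.sparkContext.parallelize(1 to 10)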
- Import the packages for setting up the logging level for log4j. This step is optional, but we highly recommend it (change the level appropriately as you move through the development cycle):
import org.apache.log4j.Logger
import org.apache.log4j.Level
- Set the logging level to WARN and ERROR to cut down on output. See the previous step for package requirements:
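A typical sketch of this step, using the log4j classes imported above (silencing the org and akka loggers follows common Spark practice and is an assumption here, not the recipe's stated configuration):

// Silence noisy Spark internals; relax to Level.WARN while debugging
Logger.getLogger("org").setLevel(Level.ERROR)
Logger.getLogger("akka").setLevel(Level.ERROR)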