Building standalone applications
This recipe explains how to develop and build Spark standalone applications in programming languages such as Scala, Java, Python, and R. The sample application in this recipe is written in Scala.
Getting ready
Install an IDE for application development (Eclipse is the preferred one here). Install SBT, a build tool for Scala projects similar to Maven, to build the project. Create the Scala project, add all the necessary libraries to the build.sbt
file, and import the project into Eclipse.
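SBT reads the project definition from build.sbt. A minimal sketch for this recipe is shown below; the exact Spark version is an assumption, chosen to match the Scala 2.10 artifact names that appear later in this recipe:

name := "SparkContextExample"

version := "1.0"

scalaVersion := "2.10.4"

// Spark core is marked "provided" because the cluster supplies it at runtime
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" % "provided"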
How to do it…
- Develop a Spark standalone application using the Eclipse IDE as follows:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SparkContextExample {
  def main(args: Array[String]) {
    // Input file on HDFS
    val file = "hdfs://namenode:9000/stocks.txt"
    // Set the application name and the Spark master URL
    val conf = new SparkConf().setAppName("Counting Lines").setMaster("spark://master:7077")
    val sc = new SparkContext(conf)
    // Load the file as an RDD with a minimum of two partitions
    val data = sc.textFile(file, 2)
    val totalLines = data.count()
    println("Total number of Lines: %s".format(totalLines))
    // Shut down the SparkContext cleanly
    sc.stop()
  }
}
- Now go to the project directory and build the project by running sbt assembly and sbt package manually (or build it from within Eclipse):

~/SparkProject/SparkContextExample$ sbt assembly
~/SparkProject/SparkContextExample$ sbt package
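Note that the sbt assembly task is not built into SBT itself; it is provided by the sbt-assembly plugin. A minimal project/plugins.sbt along the following lines enables it (the plugin version is an assumption):

// project/plugins.sbt: enables the "sbt assembly" task (plugin version is an assumption)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")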
How it works…
sbt assembly compiles the program and generates the JAR as SparkContextExample-assembly-<version>.jar, while sbt package generates the JAR as SparkContextExample_2.10-1.0.jar. Both JARs are generated under ~/SparkProject/SparkContextExample/target/scala-2.10. Submit SparkContextExample-assembly-<version>.jar to the Spark cluster using the spark-submit shell script under the bin directory of SPARK_HOME.
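For reference, the submission command takes a form along these lines; the class name and master URL come from the example code, while the JAR version in the path is an assumption:

$SPARK_HOME/bin/spark-submit \
  --class SparkContextExample \
  --master spark://master:7077 \
  ~/SparkProject/SparkContextExample/target/scala-2.10/SparkContextExample-assembly-1.0.jar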
There's more…
We can develop a variety of complex Spark standalone applications to analyze data in various ways. When working with third-party libraries, include the corresponding dependency JARs in the build.sbt
file. Invoking sbt update
downloads the respective dependencies and includes them in the project classpath.
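For example, to pull in the spark-csv library (chosen here purely as an illustration; the version is an assumption), add a dependency line such as:

// Illustrative third-party dependency; the library and version are assumptions
libraryDependencies += "com.databricks" %% "spark-csv" % "1.5.0"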
See also
The Apache Spark documentation covers how to build standalone Spark applications. Please refer to this documentation page: https://spark.apache.org/docs/latest/quick-start.html#self-contained-applications.