Writing and executing our first Spark program
In this section, we will install and configure Spark and then write and execute our first Spark program in Java and Scala.
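Before looking at the environment, here is a minimal sketch of the kind of program we will be building: a word count written against the Spark core API in Scala. It assumes only that the Spark core library is on the classpath; the application name and the input file (input.txt) are placeholders for this sketch, not the exact example developed later in the chapter.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// A minimal, self-contained Spark application that counts the words
// in a local text file. "input.txt" is a hypothetical placeholder.
object FirstSparkApp {
  def main(args: Array[String]): Unit = {
    // "local[*]" runs Spark inside the current JVM using all available
    // cores, which is enough to try the examples on a laptop or desktop.
    val conf = new SparkConf().setAppName("FirstSparkApp").setMaster("local[*]")
    val sc   = new SparkContext(conf)

    val lines  = sc.textFile("input.txt")
    val counts = lines.flatMap(_.split("\\s+"))
                      .map(word => (word, 1))
                      .reduceByKey(_ + _)

    counts.collect().foreach { case (word, n) => println(s"$word: $n") }

    sc.stop()
  }
}
```

The same program can be written in Java using the JavaSparkContext and JavaRDD classes; we will see both variants as we work through the examples.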
Hardware requirements
Spark supports a variety of hardware and software platforms. It can be deployed on commodity hardware as well as on high-end servers, and Spark clusters can be provisioned either in the cloud or on-premises. Although there is no single standard configuration for Spark, a laptop, desktop, or server with the following configuration will be sufficient to create and execute the examples provided in this book:
RAM: 8 GB.
CPU: Dual-core or quad-core.
Disk: SATA drives with a capacity of 300 GB to 500 GB at 15,000 RPM.
Operating system: Spark supports a variety of platforms, including various flavors of Linux and UNIX (such as Ubuntu, RHEL, and HP-UX) and Windows. For the examples in this book, we recommend using Ubuntu for deployment and execution.
Spark core is coded...