In the previous chapter, we discussed how Spark components get deployed on the cluster. It's now time to learn about the advanced concepts of the Spark programming model.
We will talk about how an RDD gets partitioned across the cluster, and cover some advanced transformations and actions that can be performed on an RDD. We will also discuss cluster-wide shared variables that can be accessed by tasks running in executors on different worker nodes. The sketch below previews these ideas.
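The following is a minimal sketch, not taken from this book's examples, that previews the two themes of the chapter: inspecting how an RDD is split into partitions, and sharing a read-only value across tasks with a broadcast variable. The application name, master URL, and values used here are illustrative assumptions.

import org.apache.spark.{SparkConf, SparkContext}

object PartitioningPreview {
  def main(args: Array[String]): Unit = {
    // Hypothetical local setup; in a real deployment the master would point to the cluster.
    val conf = new SparkConf().setAppName("PartitioningPreview").setMaster("local[4]")
    val sc = new SparkContext(conf)

    // An RDD created from a collection is split into partitions; each partition
    // is processed by a separate task on some executor.
    val numbers = sc.parallelize(1 to 100, numSlices = 4)
    println(s"Number of partitions: ${numbers.getNumPartitions}")

    // A broadcast variable ships one read-only copy of a value to every executor,
    // instead of sending it along with every task.
    val factor = sc.broadcast(10)
    val scaled = numbers.map(_ * factor.value)
    println(s"Sum of scaled values: ${scaled.sum()}")

    sc.stop()
  }
}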