Checking the execution details of all the executed Spark queries via the Spark UI
In this recipe, you will learn how to view the status of all the running applications in your cluster. You will also look at various ways to identify whether there are any issues with a specific application or query. Knowing this is useful because it gives you a holistic view of how your cluster is being utilized in terms of task distribution and how your applications are running.
Getting ready
Execute the queries shown in the Introduction to jobs, stages, and tasks recipe of this chapter. You can use either a Spark 2.x (latest version) or a Spark 3.x cluster.
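If you no longer have the queries from that recipe handy, any query that involves a shuffle will do, since it gives the Spark UI jobs, stages, and tasks to display. The following is a minimal sketch, assuming a Databricks notebook attached to your cluster (where the `spark` session is predefined); the synthetic data and column names are illustrative only and are not the queries from the earlier recipe:

```python
from pyspark.sql import functions as F

# Generate a synthetic DataFrame so the example is self-contained.
df = spark.range(0, 10_000_000).withColumn("bucket", F.col("id") % 100)

# A wide transformation (groupBy + aggregation) forces a shuffle,
# producing multiple stages that you can inspect in the Spark UI.
agg_df = df.groupBy("bucket").agg(
    F.count("id").alias("cnt"),
    F.avg("id").alias("avg_id")
)

# An action triggers execution; the resulting job and its query plan
# show up in the Spark UI once it runs.
agg_df.orderBy(F.desc("cnt")).show(10)
```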
How to do it…
Follow these steps to learn about the running applications/queries in your cluster:
- When you are in your Databricks workspace, click on the Clusters option and then on the cluster that you are using. Then, click on the Spark UI tab, as shown in the following screenshot: