Getting started with notebooks and jobs in Azure Databricks
In this recipe, we will import a notebook into our workspace and learn how to execute and schedule it using jobs. By the end of this recipe, you will know how to import, create, execute, and schedule notebooks in Azure Databricks.
Getting ready
Ensure the Databricks cluster is up and running. Clone the cookbook repository from https://github.com/PacktPublishing/Azure-Databricks-Cookbook to any location on your laptop/PC. You will find the required demo files in the chapter-01 folder.
How to do it…
Let's dive into importing the notebook into our workspace:
- First, let's create a simple notebook that will be used to create a new job and schedule it.
- In the cloned repository, go to the chapter-01 folder. You will find a file called DemoRun.dbc. You can import the .dbc file into your workspace by right-clicking the Shared folder in the workspace and selecting the Import option:
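Besides the UI, the same import can be performed programmatically through the Databricks Workspace REST API (`POST /api/2.0/workspace/import`), which expects the archive content base64-encoded. The following is a minimal sketch that builds such a request payload; the file and workspace paths are placeholders, and sending the request would additionally require your workspace URL and a personal access token:

```python
import base64


def build_import_payload(local_path, workspace_path):
    """Build the JSON payload for the Workspace Import REST API.

    The .dbc archive is read from disk and base64-encoded, as the
    API requires. Paths here are illustrative placeholders.
    """
    with open(local_path, "rb") as f:
        content = base64.b64encode(f.read()).decode("ascii")
    return {
        "path": workspace_path,   # target path in the workspace, e.g. /Shared/DemoRun
        "format": "DBC",          # import the archive as-is (other formats: SOURCE, JUPYTER, HTML)
        "content": content,       # base64-encoded file contents
        "overwrite": True,        # replace an existing object at the same path
    }
```

You would then POST this payload to `https://<your-workspace-url>/api/2.0/workspace/import` with a bearer token; the UI import shown above achieves the same result without any scripting.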