Passing task and job parameters within a Databricks Workflow
Parameters are a convenient way to run the same code with different values in Databricks. When you create a task in a Databricks Workflow, you can define parameters that apply to that task only. Job-level parameters, by contrast, belong to the job itself, so every task that references them sees the same runtime value; if you override a job parameter at run time, all tasks pick up the new value. Task parameters, on the other hand, are scoped to their own task: each task keeps its own default, so two tasks can define a parameter with the same name but different values without affecting each other.
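To make the distinction concrete, here is a minimal sketch of a job definition in the JSON shape used by the Databricks Jobs API. The job name, parameter names (`env`, `ingest_date`), notebook paths, and task keys are illustrative, not taken from the recipe:

```json
{
  "name": "example_job",
  "parameters": [
    { "name": "env", "default": "dev" }
  ],
  "tasks": [
    {
      "task_key": "ingest",
      "notebook_task": {
        "notebook_path": "/Workspace/recipes/ingest",
        "base_parameters": { "ingest_date": "2024-01-01" }
      }
    },
    {
      "task_key": "transform",
      "depends_on": [ { "task_key": "ingest" } ],
      "notebook_task": {
        "notebook_path": "/Workspace/recipes/transform"
      }
    }
  ]
}
```

Overriding `env` at run time changes it for every task in the job, while `ingest_date` is a task parameter that exists only for the `ingest` task.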
In this recipe, we will learn how to pass task and job parameters within Workflows and share information from one job task to another.
How to do it...
We will look at two ways to work with parameters in Databricks Workflows:
- Passing task parameters to tasks within a workflow
- Sharing information from one job task to another
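Inside a notebook task, both items above go through `dbutils`: `dbutils.widgets.get` reads a task or job parameter, and `dbutils.jobs.taskValues.set`/`get` pass values between tasks. Because `dbutils` exists only on a Databricks cluster, the sketch below stubs just enough of it to show the call pattern locally; the parameter name, task key, and values are illustrative assumptions:

```python
# Minimal local stand-ins for the parts of dbutils this sketch uses.
# On Databricks, `dbutils` is provided by the runtime and these classes
# are unnecessary; the call sites further down are what matters.

class _Widgets:
    """Stand-in for dbutils.widgets: looks up task/job parameter values."""
    def __init__(self, params):
        self._params = params

    def get(self, name):
        return self._params[name]


class _TaskValues:
    """Stand-in for dbutils.jobs.taskValues: a shared key-value store."""
    _store = {}

    def set(self, key, value):
        type(self)._store[key] = value

    def get(self, taskKey, key, default=None):
        # On Databricks, values are scoped by the upstream task's key.
        return type(self)._store.get(key, default)


class _Jobs:
    taskValues = _TaskValues()


class _DBUtils:
    def __init__(self, params):
        self.widgets = _Widgets(params)
        self.jobs = _Jobs()


# --- In the upstream "ingest" task (here simulated with a task parameter) ---
dbutils = _DBUtils({"ingest_date": "2024-01-01"})
ingest_date = dbutils.widgets.get("ingest_date")   # read a task parameter
row_count = 42                                      # pretend some work was done
dbutils.jobs.taskValues.set(key="row_count", value=row_count)

# --- In the downstream "transform" task ---
rows = dbutils.jobs.taskValues.get(taskKey="ingest", key="row_count", default=0)
print(rows)  # -> 42
```

On a real cluster you would delete the stub classes and call the same `dbutils` methods directly; the `taskKey` argument names the upstream task whose value you want to read.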