This “Live with Astronomer” session covers how to use the Astro Databricks provider to orchestrate Databricks Jobs from Airflow. The provider lets you author Databricks Jobs with the Airflow DAG API you already know, monitor them from the Airflow UI, and even send repair requests to Databricks when tasks fail.
Questions covered in this session include:
- How is the Astro Databricks provider different from the original Airflow Databricks provider?
- How do I install the Astro Databricks provider?
- How can I use the DatabricksNotebookOperator to run and monitor Databricks Jobs from Airflow? (See the sketch after this list.)
- How does the Astro Databricks provider help me recover from failures in my Databricks Jobs?
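On installation: the provider is distributed on PyPI as the `astro-provider-databricks` package, so running `pip install astro-provider-databricks` (or adding that package to your Astro project’s `requirements.txt`) should be all that’s needed.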
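To give a feel for the DatabricksNotebookOperator, here is a minimal sketch of a DAG that runs two notebooks as tasks within a single Databricks Job, following the import paths shown in the provider’s repo. The connection ID, cluster spec, and notebook paths below are placeholder assumptions; substitute your own values.

```python
# A minimal sketch assuming the astro-provider-databricks package.
# The connection ID, cluster spec, and notebook paths are placeholders.
from pendulum import datetime

from airflow.decorators import dag
from astro_databricks import DatabricksNotebookOperator, DatabricksWorkflowTaskGroup

DATABRICKS_CONN_ID = "databricks_default"  # an Airflow connection to your workspace

# Job cluster spec passed through to the Databricks Jobs API (illustrative values)
job_cluster_spec = [
    {
        "job_cluster_key": "astro_databricks_cluster",
        "new_cluster": {
            "spark_version": "11.3.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 1,
        },
    }
]


@dag(start_date=datetime(2023, 1, 1), schedule=None, catchup=False)
def databricks_workflow_example():
    # The task group bundles its contained tasks into one Databricks Job
    with DatabricksWorkflowTaskGroup(
        group_id="databricks_workflow",
        databricks_conn_id=DATABRICKS_CONN_ID,
        job_clusters=job_cluster_spec,
    ):
        # Each operator runs one notebook as a task in that Databricks Job
        extract = DatabricksNotebookOperator(
            task_id="extract",
            databricks_conn_id=DATABRICKS_CONN_ID,
            notebook_path="/Shared/extract",  # hypothetical notebook path
            source="WORKSPACE",
            job_cluster_key="astro_databricks_cluster",
        )
        transform = DatabricksNotebookOperator(
            task_id="transform",
            databricks_conn_id=DATABRICKS_CONN_ID,
            notebook_path="/Shared/transform",  # hypothetical notebook path
            source="WORKSPACE",
            job_cluster_key="astro_databricks_cluster",
        )
        extract >> transform


databricks_workflow_example()
```

Because the task group maps to one Databricks Job, each notebook shows up as its own task in the Airflow UI, which is what makes per-task monitoring and repair requests possible.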
Learn more about the Astro Databricks provider and see example code in the official repo.