In this ELT tutorial, you'll use Apache Airflow® to copy synthetic data from an S3 bucket into a Databricks table, then run several Databricks notebooks as an Airflow-created Databricks job to analyze the data.
By the end of this demo, you'll be able to extract, load, and transform data with Apache Airflow and Databricks.
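The load step of this pipeline comes down to a Databricks `COPY INTO` statement that Airflow submits against the workspace to ingest the S3 files into a Delta table. A minimal sketch of how such a statement could be assembled, assuming illustrative table and bucket names (`demo.synthetic_orders`, `s3://my-demo-bucket/raw/` are hypothetical, not from this tutorial):

```python
def copy_into_statement(table: str, s3_path: str, file_format: str = "CSV") -> str:
    """Build a Databricks SQL COPY INTO statement that loads files from an
    S3 path into a Delta table. Options shown (header, inferSchema,
    mergeSchema) are common choices for CSV ingestion, not requirements."""
    return (
        f"COPY INTO {table} "
        f"FROM '{s3_path}' "
        f"FILEFORMAT = {file_format} "
        "FORMAT_OPTIONS ('header' = 'true', 'inferSchema' = 'true') "
        "COPY_OPTIONS ('mergeSchema' = 'true')"
    )

# Hypothetical target table and bucket for illustration only
stmt = copy_into_statement("demo.synthetic_orders", "s3://my-demo-bucket/raw/")
print(stmt)
```

In an Airflow DAG, a statement like this would typically be executed through the Databricks provider's SQL operator rather than built by hand, but seeing the SQL makes clear what the load task actually asks Databricks to do.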