With over 30 million monthly downloads, Apache Airflow is the tool of choice for programmatically authoring, scheduling, and monitoring data pipelines. Airflow lets you define workflows as Python code, enabling dynamic, scalable pipelines suited to use cases ranging from ETL/ELT to running ML/AI operations in production.

This introductory tutorial provides a crash course in writing and deploying your first Airflow pipeline. You will:

  • Get an overview of foundational Airflow concepts
  • Create your first Airflow project in a local development environment
  • Write your first DAG
  • Deploy your DAG to the cloud

Get Your Copy Today


