Exploring
Apache Airflow® 3

Coming soon to a city near you!
Get up to speed on Airflow 3, the most transformative release in Airflow's history. You'll get a deep dive into key Airflow 3 features, including DAG versioning, native backfill management, and improved support for AI, ML, and event-driven workloads. With these innovations, Airflow 3 redefines orchestration, offering more flexibility and control than ever before.
Join us to explore how these new capabilities can transform the way you automate and operationalize data workflows!
Find Your Roadshow

May 21 | 10:30am - 4:30pm
London
Andaz London Liverpool Street
40 Liverpool Street, London, EC2M 7QN, UK

June 25 | 10:30am - 4:30pm
New York
Classic Car Club Manhattan
1 PIER 76, 408 12th Ave, New York, NY

July 10 | 10:30am - 4:30pm
Sydney
Doltone House Hyde Park
Level 3, 181 Elizabeth St, NSW 2000


What to Expect
Unlock the Future of Data Orchestration
Airflow 3 takes workflow automation to the next level with greater flexibility, scalability, and security. Discover powerful new capabilities that simplify pipeline management and enhance execution.
The Real-World Impact of Airflow 3
DAG versioning, streamlined backfills, and real-time triggers: everything you love about Airflow, supercharged. Learn how organizations are leveraging these features to transform their data operations.
Airflow 3 Workshop
Get hands-on with Airflow's powerful new capabilities. Master features like improved task execution, enhanced observability, and security updates, all while ensuring a smooth transition to Airflow 3.
FAQs
What is the release schedule for Airflow 3.0?
What additional languages will be supported in Airflow 3.0?
How does the new architecture in Airflow 3.0 balance distributed execution with control?
Presented by Astronomer
Explore the best way to develop and deliver data products.
Supercharge your development.
Empower engineers from multiple teams to efficiently build, test, and deploy data products on Airflow, even if they lack Python skills.
Keep critical applications running.
Ensure reliable, elastic, secure, and multi-tenant data product delivery across hybrid and multi-cloud environments, with detailed reporting.
Get complete visibility into your data pipelines.
A single pane of glass to govern and optimize the data product lifecycle with full lineage, alerting, and proactive recommendations.