Upgrade from Apache Airflow® 2 to 3
Airflow 3 is a major release of Apache Airflow® that includes a completely new UI and significant architectural changes, improving Airflow's security posture and enabling new features. While the Airflow developers took great care to preserve as much backward compatibility as possible and make the upgrade process smooth and efficient, there are some breaking changes that you need to be aware of. Additionally, the Airflow project provides tools to help you upgrade your DAG code and Airflow configuration to be compatible with Airflow 3.
This guide provides a checklist for upgrading from Airflow 2 to Airflow 3, including:
- How to check your DAG code for compatibility with Airflow 3
- How to check your Airflow config for compatibility with Airflow 3
- A list of important breaking changes between Airflow 2 and Airflow 3
This guide covers important breaking changes between Airflow 2 and Airflow 3, as well as upgrading instructions for open-source Airflow. Astronomer customers should refer to the Astro documentation for specific upgrade instructions. For a complete list of changes see the Airflow release notes.
Assumed knowledge
To get the most out of this guide, you should have an understanding of:
- Basic Airflow concepts. See Introduction to Apache Airflow.
- Basic knowledge of Airflow components. See Airflow Components.
Upgrade checklist
The following checklist provides a high-level overview of the steps you need to take to upgrade your Airflow 2 environment to Airflow 3. The steps are described in more detail in the sections below.
- Make sure that your current Airflow environment is on at least version 2.6.3 (Astro Runtime 8.7.0). This is the oldest version from which you can upgrade to Airflow 3, due to changes in the Airflow metadata database. We recommend upgrading to the latest Airflow 2 version before you migrate to Airflow 3 so that you can surface and resolve all deprecation warnings that appear in your logs.
- Use the Airflow ruff rules to check your Airflow DAG code, and make all necessary changes. If you are using the Astro CLI, the `astro dev upgrade-test` command contains ruff check options.
- Use the Airflow config linter to check your Airflow config, and make all necessary changes.
- Assess whether you need to make additional changes to the DAGs based on the list of breaking changes.
- Upgrade your local development Airflow environment to Airflow 3. If you are using the Astro CLI:
  - Check your Astro CLI version with `astro version`. You need to be on at least version 1.34.0 to run Airflow 3. You can upgrade the Astro CLI with `brew upgrade astro`.
  - Change the Astro Runtime version in your project's Dockerfile to `FROM astrocrpublic.azurecr.io/runtime:<astro-runtime-version>` (see the Astro Runtime release notes for the latest version available).
- Run your updated DAGs with Airflow 3 locally to test them.
- Upgrade your production environment to Airflow 3. Astronomer customers should refer to the Astro documentation for upgrade instructions.
- Deploy your updated DAGs to the cloud environment.
If you are still using Airflow 1, we highly recommend upgrading to Airflow 2 as soon as possible. Support for Airflow 1 ended on June 17, 2021, so no further updates are being made, and potential security issues in Airflow 1 are not being addressed. After upgrading to Airflow 2, upgrade to Airflow 2.6.3+; then upgrade to Airflow 3 as explained in this guide. For information on upgrading from Airflow 1 to Airflow 2, see the Airflow documentation.
Check your DAG code with ruff
In Airflow 3, several deprecated parameters and import paths have been removed. This means that if you have been using deprecated parameters or import paths in your DAG code, you need to update them to be compatible with Airflow 3. The ruff linter is a Python linter and code transformation tool that you can use to check your Airflow DAG code for compatibility with Airflow 3. There are two sets of ruff rules available for upgrading from Airflow 2 to Airflow 3:
- AIR30: These rules check your Airflow code for removed parameters and imports that are no longer available in Airflow 3. Making these changes is mandatory for your DAGs to work in Airflow 3.
- AIR31: These rules check your Airflow code for deprecated parameters and imports that are still available in Airflow 3, but will be removed in future versions. Making these changes is recommended to ensure that your DAGs will continue to work in future versions of Airflow.
To use the ruff linter, you need to install the latest version of ruff. You can do this with pip:
pip install --upgrade ruff
Then, you can run the ruff linter on your Airflow DAG code with the following command:
ruff check --preview --select AIR30 <path_to_your_dag_code>
You can add `--fix` to the command to automatically fix issues that ruff finds. Note that not all issues can be fixed automatically.
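For example, to run the same check and apply the available automatic fixes in one pass:
```bash
# Same check as above, with automatic fixes applied where possible
ruff check --preview --select AIR30 --fix <path_to_your_dag_code>
```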
After running this command, you will see a list of the issues that ruff found in your code, with suggestions for how to fix them. For example, if you have a DAG that uses the `fail_stop` DAG parameter, which was renamed to `fail_fast`, you will see an error message like this:
dags/my_dag.py:19:5: AIR301 [*] `fail_stop` is removed in Airflow 3.0
|
17 | start_date=datetime(2025, 1, 1),
18 | schedule="@daily",
19 | fail_stop=True
| ^^^^^^^^^ AIR301
20 | ):
|
= help: Use `fail_fast` instead
Found 1 error.
[*] 1 fixable with the `--fix` option.
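After applying the fix, the relevant part of the DAG definition would look like the following minimal sketch. The DAG id and surrounding arguments are illustrative, based on the snippet in the error output above:
```python
from datetime import datetime

from airflow.sdk import DAG  # Airflow 3 DAG-authoring import path

with DAG(
    dag_id="my_dag",  # illustrative name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    fail_fast=True,  # renamed from fail_stop in Airflow 3
):
    ...
```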
If you are using the Astro CLI, the ruff check is included in the `astro dev upgrade-test` command. See the Astro CLI documentation for more information.
The ruff linter is a great tool to help you update your DAG code for Airflow 3. However, it cannot detect all potential compatibility issues. After running the ruff linter, you should still read through the breaking changes section of this guide and the Airflow release notes to ensure that your code is compatible with Airflow 3.
Check your Airflow config
In Airflow 3, some changes have been made to configuration options. For upgrading purposes, four categories of changes are relevant:
- Default changes: Some defaults have changed. For example, the default for `[scheduler].catchup_by_default` has changed from `True` to `False`.
- Renamed options: Some options have been renamed and/or moved to another section, such as `[webserver].web_server_host`, which has been renamed and moved to `[api].host`.
- Removed options: Some previously deprecated options have been removed, such as `[webserver].error_logfile`.
- Previously valid values are now invalid: Some values that were previously valid are now invalid. For example, `0` used to be a valid input to `[core].parallelism`, but now a positive integer is required.
You can learn more about all valid configuration options in the Airflow configuration reference.
The `airflow config lint` command of the Airflow CLI can be used to check your Airflow configuration for compatibility with Airflow 3, and `airflow config update` will make the necessary changes to your configuration file for it to be compatible with Airflow 3.
Astro CLI users first need to export their `airflow.cfg` file and potentially make a change for it to parse through the linter correctly:
- Run `astro dev start` to start your Astro CLI project.
- Run `astro dev run config list | awk '/^\[core\]/ {found=1} found' > airflow.cfg` to export your current configuration file. This command removes any additional lines at the beginning of the file that might cause a parsing error in the linter.
- Copy the `airflow.cfg` file into your scheduler container with `docker cp airflow.cfg <scheduler container>:/usr/local/airflow`.
- Enter the scheduler container with `astro dev bash`.
- Run `airflow config lint` inside the container to check your configuration file for compatibility with Airflow 3.
- Make the necessary changes to your environment variables related to configuration options, as shown in the example after this list. See Environment variables in the Astro documentation for more information on how to set environment variables in your Astro project.
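As a sketch of the last step, Airflow configuration options map to environment variables using the `AIRFLOW__<SECTION>__<KEY>` pattern. For example, if the linter flags the renamed `[webserver].web_server_host` option, you would set the new `[api].host` option instead (the value shown is illustrative):
```bash
# Removed in Airflow 3: AIRFLOW__WEBSERVER__WEB_SERVER_HOST
# [api].host maps to the following environment variable:
AIRFLOW__API__HOST=0.0.0.0
```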
After running the `airflow config lint` command, you will see a list of the issues that were found in your configuration file, like the example below:
Found issues in your airflow.cfg:
- `base_url` configuration parameter moved from `webserver` section to `api` section as `base_url`.
- Removed `error_logfile` configuration parameter from `webserver` section.
Please update your configuration file accordingly.
Breaking changes
Being a new major version, Airflow 3 comes with a number of breaking changes that can affect some of your DAGs, depending on which features you are using. This section lists the most important breaking changes that you need to be aware of when upgrading from Airflow 2 to Airflow 3.
The list of breaking changes in this guide focuses on the most relevant ones but is not exhaustive. For a full list of changes between Airflow 2 and Airflow 3, see the Airflow release notes.
Removed direct metadata database access
In Airflow 2, all tasks had direct access to the Airflow metadata database. This access was removed in Airflow 3, greatly improving Airflow’s security posture. If you are accessing the Airflow metadata database directly in any of your task or trigger code, such as by using the SQLAlchemy connection environment variable, that process will error in Airflow 3. This includes custom operators making such a connection.
Recommendation: Directly accessing the Airflow metadata database from within tasks is an antipattern because it could lead to accidental modifying or dropping of information that is vital to Airflow’s functioning, up to and including corruption of your entire Airflow instance. To interact with and retrieve information about your Airflow instance, use the Airflow REST API instead.
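As an illustration, an external script or process can read DAG run metadata through the REST API instead of querying the database. The following is a minimal sketch that assumes an API server at `http://localhost:8080`, a previously obtained access token, and a placeholder DAG id; the authentication flow depends on your deployment, so check the Airflow REST API documentation for the exact details:
```python
# Read DAG run metadata through the Airflow REST API instead of
# connecting to the metadata database directly.
import requests

BASE_URL = "http://localhost:8080/api/v2"  # Airflow 3 exposes the v2 API
TOKEN = "<your-access-token>"  # obtain per your auth manager's documentation

response = requests.get(
    f"{BASE_URL}/dags/my_dag/dagRuns",  # 'my_dag' is a placeholder DAG id
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"limit": 5},
)
response.raise_for_status()
for run in response.json()["dag_runs"]:
    print(run["dag_run_id"], run["state"])
```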
Changes related to scheduling
In Airflow 3, the following changes were made to scheduling parameters and utilities:
- `schedule_interval` and `timetable` were deprecated in favor of `schedule`. The default schedule is now `None`.
- `catchup` was set to `False` by default at the configuration level. This means that if you do not set a value for `catchup`, Airflow will not try to catch up on missed runs. You can enable the Airflow 2 behavior by setting the `[scheduler].catchup_by_default` configuration option to `True`.
- The `days_ago` function was removed in favor of `pendulum.today('UTC').add(days=-N, ...)`. All three changes are illustrated in the sketch after this list.
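Taken together, a minimal sketch of an Airflow 3-compatible DAG definition using the new parameters looks like this (the DAG and task names are illustrative, and the `airflow.sdk` import path is the Airflow 3 authoring interface):
```python
import pendulum
from airflow.sdk import dag, task


@dag(
    schedule="@daily",  # replaces the deprecated schedule_interval/timetable
    start_date=pendulum.today("UTC").add(days=-1),  # replaces the removed days_ago()
    catchup=False,  # explicit, matching the new config-level default
)
def my_daily_dag():
    @task
    def say_hello():
        print("hello")

    say_hello()


my_daily_dag()
```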
If you pass raw cron strings to your DAG's `schedule`, for example `0 0 * * *`, by default it used to be interpreted with the `CronDataIntervalTimetable` timetable under the hood. In Airflow 3, this behavior was changed to use the `CronTriggerTimetable` timetable instead. You can change this behavior back to the Airflow 2 behavior by setting the `[scheduler].create_cron_data_intervals` configuration option to `True`. For more information on the differences between the two timetables, see the Timetables comparisons in the Airflow documentation.
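If you want the Airflow 2 interval behavior for a single DAG without changing the global configuration, you can pass a timetable object explicitly. This sketch assumes the `CronDataIntervalTimetable` import path carried over from Airflow 2; verify it against the timetables documentation for your version:
```python
import pendulum
from airflow.sdk import DAG
from airflow.timetables.interval import CronDataIntervalTimetable  # assumed path

with DAG(
    dag_id="interval_cron_dag",  # illustrative name
    # Interprets the cron string with data intervals, as in Airflow 2
    schedule=CronDataIntervalTimetable("0 0 * * *", timezone="UTC"),
    start_date=pendulum.datetime(2025, 1, 1, tz="UTC"),
):
    ...
```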
The `logical_date` attribute of the DAG run was changed from being equivalent to the `data_interval_start` in Airflow 2 to being equivalent to the `run_after` date in Airflow 3. The `run_id` now takes its timestamp from the moment the DAG run is queued, and it is also possible to pass `None` as the `logical_date`. The deprecated `execution_date` attribute was removed. This change is mostly relevant for users who use a time-dependent context element in the logic of their DAGs, for example to partition their data in a SQL query, as shown in the sketch below. See Schedule DAGs in Apache Airflow® for more information.
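For illustration, here is a minimal sketch of a task that uses `logical_date` from the context to partition a query, including a guard for the new `None` case. The DAG, task, and table names are illustrative:
```python
import pendulum
from airflow.sdk import dag, task


@dag(schedule="@daily", start_date=pendulum.datetime(2025, 1, 1, tz="UTC"))
def partitioned_load():
    @task
    def build_query(**context):
        # In Airflow 3, logical_date matches run_after and can be None
        # for runs triggered without a logical date.
        logical_date = context["logical_date"]
        if logical_date is None:
            raise ValueError("Expected a run with a logical_date.")
        return f"SELECT * FROM sales WHERE day = '{logical_date.date()}'"

    build_query()


partitioned_load()
```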
Other changes
Airflow 3 introduces other improvements and changes that may affect you if you use any of the related features. The following list summarizes the most important ones:
- Airflow 3 uses a new `v2` version of the Airflow REST API. If you are using the REST API to interact with Airflow, you'll likely need to update your scripts. See the Airflow REST API documentation for more information.
- There were changes to the Airflow context, including the removal of deprecated keys such as `execution_date`. See the Airflow context guide for more information.
- The long-deprecated SubDAGs feature was removed. If you are using SubDAGs in your DAGs, you will need to refactor them to use task groups instead (see the sketch after this list).
- The SLA feature of Airflow 2 was removed. Astro customers should use Astro Alerts for simple SLAs and Astro Observe for advanced SLAs instead. For open source Airflow users, a new deadline alerts feature is planned for a future release.
- Given the new React-based UI, Flask-AppBuilder (FAB) was removed in Airflow 3. The default auth manager was changed to `SimpleAuthManager`. If you need FAB integration, install the FAB provider. For more information on auth managers, see the Airflow documentation. Support for FAB-based plugins is limited in Airflow 3.0 but will be available in a future release.
- The deprecated Astro Python SDK package is not compatible with Airflow 3.
- XCom pickling is no longer allowed when using the default XCom backend in Airflow 3 for security reasons. Existing pickled XComs are moved to an archive table. Use a custom XCom backend with custom serialization to pass data between tasks that Airflow cannot serialize by default.
- The email/SMTP integration in Airflow core was deprecated and will be removed in Airflow 4. For more information on how to set up email notifications see the Manage Apache Airflow® DAG notifications guide.
- Python 3.8 is no longer supported in Airflow 3. You need to be on Python 3.9 or higher to run Airflow 3.
- PostgreSQL 12 is no longer supported in Airflow 3. You need to be on PostgreSQL 13 or higher to run Airflow 3.
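For the SubDAG removal mentioned above, a task group is the replacement. Here is a minimal sketch using the Airflow 3 TaskFlow API; all DAG, group, and task names are illustrative:
```python
import pendulum
from airflow.sdk import dag, task, task_group


@dag(schedule=None, start_date=pendulum.datetime(2025, 1, 1, tz="UTC"))
def grouped_dag():
    @task
    def extract():
        return [1, 2, 3]

    # A task group replaces what would have been a SubDAG in Airflow 2.
    @task_group
    def transform(values):
        @task
        def double(nums):
            return [n * 2 for n in nums]

        @task
        def total(nums):
            return sum(nums)

        return total(double(values))

    transform(extract())


grouped_dag()
```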
Depending on how you run Airflow, you may find that some Airflow providers (such as FTP, HTTP, and IMAP) that used to be preinstalled in your image or package for Airflow 2 are not preinstalled in Airflow 3. Install the needed providers with pip in your Airflow environment. See the Airflow documentation for a list of officially supported providers.