ELT with BigQuery, dbt, and Apache Airflow® for eCommerce
The ELT with BigQuery, dbt, and Apache Airflow® GitHub repository is a free and open-source reference architecture showing how to use Apache Airflow® with Google BigQuery and dbt Core to build an end-to-end ELT pipeline. The pipeline ingests data from an eCommerce store's API, loads it into BigQuery, and transforms it in several steps using dbt Core orchestrated with Astronomer Cosmos. After the reporting tables are created, a message listing the current top customers is sent to a Slack channel.
This reference architecture was created as a learning tool to demonstrate how to use Apache Airflow to orchestrate data ingestion into object storage and a data warehouse, as well as how to use dbt Core to transform the data in several steps. You can adapt the pipeline for your use case by ingesting data from other sources and adjusting the dbt transformations.
Architecture
This reference architecture consists of four main components:
- Extraction: Data is extracted from an eCommerce store's API and stored in a GCS bucket. A minimal sketch of this step follows the list.
- Loading: The extracted data is loaded into BigQuery using the BigQuery Data Transfer Service.
- Transformation: The data is transformed in several steps using dbt Core orchestrated with Astronomer Cosmos.
- Reporting: Top customers are reported in a Slack message.
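Below is a minimal sketch of how the extraction step could be implemented with Airflow's Object Storage abstraction, writing API responses to GCS. The bucket name, API endpoint, connection ID, and task names are illustrative assumptions, not the repository's exact code.

```python
"""Sketch of the extraction step: fetch records from an eCommerce API and
write them to a GCS bucket. All names and URLs are placeholder assumptions."""

import json

import requests
from airflow.decorators import dag, task
from airflow.io.path import ObjectStoragePath
from pendulum import datetime

# Hypothetical target bucket and API endpoint.
BASE_PATH = ObjectStoragePath(
    "gs://my-ecommerce-bucket/raw/", conn_id="google_cloud_default"
)
API_URL = "https://my-ecommerce-store.example.com/api/orders"


@dag(start_date=datetime(2024, 1, 1), schedule="@daily", catchup=False)
def extract_ecommerce_data():
    @task
    def extract_records() -> list[dict]:
        """Fetch new records from the store's API."""
        response = requests.get(API_URL, timeout=30)
        response.raise_for_status()
        return response.json()

    @task
    def write_to_gcs(records: list[dict], **context) -> str:
        """Write the extracted records to object storage, one file per run."""
        path = BASE_PATH / f"orders_{context['ds']}.json"
        with path.open("w") as f:
            f.write(json.dumps(records))
        return str(path)

    write_to_gcs(extract_records())


extract_ecommerce_data()
```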
Airflow features
The DAGs in this reference architecture highlight several key Airflow best practices and features. A minimal code sketch of each feature follows the list:
- Astronomer Cosmos: dbt Core transformations are orchestrated using Astronomer's open-source tool Cosmos for full visibility into dbt runs in the Airflow UI.
- Dynamic task mapping: Interaction with files in object storage is parallelized per type of record using dynamic task mapping with custom map indexes.
- Data-driven scheduling: The DAGs in this reference architecture run on data-driven schedules as soon as the data they operate on is updated.
- Task Groups: Tasks in the loading step are grouped together using a task group to make the DAG code more readable.
- Airflow retries: To protect against transient API failures, all tasks are configured to automatically retry after an adjustable delay.
- Custom XCom Backend: In the extraction step, new records are passed through XCom to the next task. XComs are stored in GCS using an Object Storage custom XCom backend.
- Modularization: SQL queries are stored in the `include` folder and imported into the DAG file for use in tasks with the BigQueryInsertJobOperator. This makes the DAG code more readable and lets you reuse SQL queries across multiple DAGs.
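The sketch below shows one way to configure a Cosmos `DbtTaskGroup` for a BigQuery target, which renders every dbt model as its own Airflow task in the UI. The dbt project path, profile and target names, dataset, and connection ID are assumptions to adapt to your own setup.

```python
"""Sketch: orchestrating a dbt Core project with Astronomer Cosmos.
Project path, profile names, dataset, and connection ID are placeholder values."""

from airflow.decorators import dag
from cosmos import DbtTaskGroup, ProfileConfig, ProjectConfig
from cosmos.profiles import GoogleCloudServiceAccountDictProfileMapping
from pendulum import datetime

profile_config = ProfileConfig(
    profile_name="ecommerce",  # assumed dbt profile name
    target_name="prod",        # assumed dbt target name
    profile_mapping=GoogleCloudServiceAccountDictProfileMapping(
        conn_id="google_cloud_default",                   # Airflow connection to GCP
        profile_args={"dataset": "ecommerce_reporting"},  # assumed BigQuery dataset
    ),
)


@dag(start_date=datetime(2024, 1, 1), schedule=None, catchup=False)
def transform_with_cosmos():
    # Each model in the dbt project becomes its own Airflow task inside this group.
    DbtTaskGroup(
        group_id="transform",
        project_config=ProjectConfig("/usr/local/airflow/dags/dbt/ecommerce"),  # assumed path
        profile_config=profile_config,
    )


transform_with_cosmos()
```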
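Dynamic task mapping with a custom map index could look like the following sketch. The record types and processing logic are illustrative, and the `map_index_template` parameter requires Airflow 2.9+.

```python
"""Sketch: dynamic task mapping with a custom map index per record type.
Record types and processing logic are placeholder assumptions (Airflow 2.9+)."""

from airflow.decorators import dag, task
from airflow.operators.python import get_current_context
from pendulum import datetime


@dag(start_date=datetime(2024, 1, 1), schedule=None, catchup=False)
def process_files_per_record_type():
    @task
    def list_record_types() -> list[str]:
        # In the reference architecture this would be derived from the files in GCS.
        return ["orders", "customers", "products"]

    @task(map_index_template="{{ custom_map_index }}")
    def process(record_type: str):
        # Show the record type as the map index in the Airflow UI.
        context = get_current_context()
        context["custom_map_index"] = record_type
        print(f"Processing files of type {record_type}")

    # One parallel mapped task instance is created per record type.
    process.expand(record_type=list_record_types())


process_files_per_record_type()
```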
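Data-driven scheduling connects DAGs through Dataset updates, as in the sketch below. The dataset URI and DAG names are placeholders.

```python
"""Sketch: data-driven scheduling with Datasets. The dataset URI and DAG
names are placeholder assumptions."""

from airflow.datasets import Dataset
from airflow.decorators import dag, task
from pendulum import datetime

# The URI is an identifier shared between producer and consumer DAGs.
RAW_ORDERS = Dataset("gs://my-ecommerce-bucket/raw/orders")


@dag(start_date=datetime(2024, 1, 1), schedule="@daily", catchup=False)
def extract_dag():
    @task(outlets=[RAW_ORDERS])
    def extract():
        # A successful run of this task marks the dataset as updated.
        ...

    extract()


@dag(start_date=datetime(2024, 1, 1), schedule=[RAW_ORDERS], catchup=False)
def load_dag():
    @task
    def load():
        # This DAG runs as soon as RAW_ORDERS is updated by the extract DAG.
        ...

    load()


extract_dag()
load_dag()
```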
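Grouping the loading tasks with the `@task_group` decorator might look like this sketch; the task names and tables are assumptions.

```python
"""Sketch: grouping loading tasks with a task group. Task names and tables
are placeholder assumptions."""

from airflow.decorators import dag, task, task_group
from pendulum import datetime


@dag(start_date=datetime(2024, 1, 1), schedule=None, catchup=False)
def load_with_task_group():
    @task
    def create_dataset():
        ...

    @task
    def load_table(table: str):
        ...

    @task_group(group_id="load_to_bigquery")
    def load_to_bigquery():
        # All loading tasks show up as one collapsible group in the Airflow UI.
        create_dataset() >> load_table.expand(table=["orders", "customers", "products"])

    load_to_bigquery()


load_with_task_group()
```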
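Retries can be set for all tasks at once through `default_args`, as in the sketch below; the retry count and delay are illustrative values.

```python
"""Sketch: automatic retries with a delay, configured at the DAG level.
The retry count and delay are placeholder values."""

from airflow.decorators import dag, task
from pendulum import datetime, duration


@dag(
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={
        "retries": 3,                        # retry each task up to 3 times
        "retry_delay": duration(minutes=5),  # wait 5 minutes between attempts
    },
)
def resilient_dag():
    @task
    def call_ecommerce_api():
        # A transient API failure here is retried automatically.
        ...

    call_ecommerce_api()


resilient_dag()
```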
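With an Object Storage XCom backend configured, tasks exchange data through regular XCom while the payloads land in GCS. The sketch below keeps the DAG code unchanged and notes an illustrative configuration in the docstring; confirm the exact backend class path and settings against the Common IO provider documentation.

```python
"""Sketch: passing records between tasks via XCom while a custom XCom backend
stores the payloads in GCS. Task logic and the bucket path are placeholder
assumptions; confirm the backend class path and config keys against the
apache-airflow-providers-common-io documentation before using them.

Illustrative environment configuration (set outside the DAG, e.g. in .env):
    AIRFLOW__CORE__XCOM_BACKEND=airflow.providers.common.io.xcom.backend.XComObjectStorageBackend
    AIRFLOW__COMMON_IO__XCOM_OBJECTSTORAGE_PATH=gs://my-ecommerce-bucket/xcom
    AIRFLOW__COMMON_IO__XCOM_OBJECTSTORAGE_THRESHOLD=0
"""

from airflow.decorators import dag, task
from pendulum import datetime


@dag(start_date=datetime(2024, 1, 1), schedule=None, catchup=False)
def xcom_backend_example():
    @task
    def extract_new_records() -> list[dict]:
        # The returned list is serialized by the XCom backend and, with the
        # configuration above, written to GCS instead of the metadata database.
        return [{"order_id": 1, "amount": 42.0}]

    @task
    def validate(records: list[dict]):
        # The backend transparently retrieves the payload from GCS.
        print(f"Received {len(records)} records")

    validate(extract_new_records())


xcom_backend_example()
```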
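Finally, reading a SQL file from the `include` folder and running it with the BigQueryInsertJobOperator could look like the sketch below; the file path, task name, and connection ID are assumptions.

```python
"""Sketch: running a SQL query stored in the include folder with the
BigQueryInsertJobOperator. File path, task name, and connection ID are
placeholder assumptions."""

from pathlib import Path

from airflow.decorators import dag
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from pendulum import datetime

# Hypothetical query file; in an Astro project the include folder lives at
# /usr/local/airflow/include.
TOP_CUSTOMERS_SQL = Path("/usr/local/airflow/include/sql/top_customers.sql").read_text()


@dag(start_date=datetime(2024, 1, 1), schedule=None, catchup=False)
def report_top_customers():
    BigQueryInsertJobOperator(
        task_id="get_top_customers",
        gcp_conn_id="google_cloud_default",
        configuration={
            "query": {
                "query": TOP_CUSTOMERS_SQL,
                "useLegacySql": False,
            }
        },
    )


report_top_customers()
```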
Next Steps
If you'd like to build your own ELT or ETL pipeline with BigQuery, dbt Core, and Apache Airflow®, feel free to fork the repository and adapt it to your use case. We recommend deploying the Airflow pipelines using a free trial of Astro.