ETL and ELT are among the most common data engineering use cases, and they are considered the bread and butter of Airflow. Because Airflow is 100% code, knowing the basics of Python is all it takes to start writing pipelines that deliver the data your team needs for any downstream application. However, running ETL and ELT pipelines in production often brings challenges such as scaling, connecting to other systems, and dynamically adapting to changing data sources. Fortunately, Airflow has all of this covered, and we're here to help you make Airflow work best for this use case.

In this webinar, we cover DAG writing best practices applicable to ETL and ELT pipelines. You can find the code shown in the demo here.
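The ETL pattern itself can be sketched in a few lines of plain Python, independent of Airflow; in an actual Airflow DAG, each step below would typically become its own task. The record shape, the string-to-float cleanup, and the list standing in for a warehouse are all illustrative assumptions, not part of any specific pipeline.

```python
def extract():
    # Extract: in a real pipeline this would query an API or a database.
    return [
        {"order_id": 1, "amount": "19.99"},
        {"order_id": 2, "amount": "5.00"},
    ]

def transform(records):
    # Transform: cast string amounts to floats so downstream
    # consumers receive properly typed data.
    return [{**r, "amount": float(r["amount"])} for r in records]

def load(records, destination):
    # Load: stand-in for a warehouse write; here we append to a list
    # and return the row count, as a loader might report.
    destination.extend(records)
    return len(records)

warehouse = []
rows_loaded = load(transform(extract()), warehouse)
```

Keeping extract, transform, and load as separate functions mirrors how Airflow encourages one task per logical step, which makes retries and monitoring granular.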