YoVDO

Create Complex DAGs and Task Dependencies with Apache Airflow

Offered By: Pluralsight

Tags

Apache Airflow Courses, Data Validation Courses, Data Pipelines Courses

Course Description

Overview

Learn to build complex data pipelines with Apache Airflow. This course will teach you to create DAGs that read, validate, aggregate, and load data while managing task dependencies and using XComs to pass data between tasks.

Apache Airflow excels at making complex data pipelines easy to manage. In this course, Create Complex DAGs and Task Dependencies with Apache Airflow, you’ll gain the ability to design and implement intricate workflows in Apache Airflow. First, you’ll explore how to create a DAG that reads a CSV file from a local directory, using a BashOperator to check that the file exists. Next, you’ll discover how to perform data validation and aggregation using PythonOperators. Finally, you’ll learn how to load the transformed data using a SQLiteOperator, set up task dependencies with the bitshift operator and the set_upstream()/set_downstream() methods, and control the flow of execution by passing data between tasks with XComs. When you’re finished with this course, you’ll have the skills and knowledge needed to create and manage complex DAGs and task dependencies in Apache Airflow efficiently.
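The dependency wiring mentioned above — the bitshift operator mapping onto set_downstream()/set_upstream(), and XComs carrying small values between tasks — can be sketched with a toy task class. This is a simplified illustrative model, not Airflow’s actual implementation; the task IDs (check_file, validate_data, aggregate_data, load_data) are hypothetical names chosen to mirror the pipeline the course describes:

```python
class Task:
    """Toy model of Airflow-style task dependency wiring (not real Airflow)."""

    def __init__(self, task_id):
        self.task_id = task_id
        self.upstream = set()    # IDs of tasks this task depends on
        self.downstream = set()  # IDs of tasks that depend on this task

    def set_downstream(self, other):
        # "other" runs after self
        self.downstream.add(other.task_id)
        other.upstream.add(self.task_id)

    def set_upstream(self, other):
        # self runs after "other"
        other.set_downstream(self)

    def __rshift__(self, other):
        # task_a >> task_b is shorthand for task_a.set_downstream(task_b)
        self.set_downstream(other)
        return other  # returning "other" allows chaining: a >> b >> c

    def __lshift__(self, other):
        # task_a << task_b is shorthand for task_a.set_upstream(task_b)
        self.set_upstream(other)
        return other


# A minimal stand-in for XCom: a shared store keyed by (task_id, key).
xcom_store = {}

def xcom_push(task_id, key, value):
    xcom_store[(task_id, key)] = value

def xcom_pull(task_id, key):
    return xcom_store[(task_id, key)]


# Wire up a pipeline shaped like the one the course builds:
# check the file, validate, aggregate, then load.
check_file = Task("check_file")
validate = Task("validate_data")
aggregate = Task("aggregate_data")
load = Task("load_data")

check_file >> validate >> aggregate >> load

# One task pushes a value; a downstream task pulls it.
xcom_push("validate_data", "row_count", 120)
row_count = xcom_pull("validate_data", "row_count")
```

In real Airflow, the `>>`/`<<` operators on BaseOperator do the same bookkeeping against the DAG’s task graph, and XCom values are persisted in the metadata database rather than an in-memory dict.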

Syllabus

  • Course Overview 1min
  • Managing Task Dependencies in a DAG 21mins

Taught by

Janani Ravi

Related Courses

Introduction to Airflow in Python
DataCamp
Building Data Engineering Pipelines in Python
DataCamp
The Complete Hands-On Introduction to Apache Airflow
Udemy
Apache Airflow: The Hands-On Guide
Udemy
ETL and Data Pipelines with Shell, Airflow and Kafka
IBM via Coursera