- Introduction
  - Welcome! (1 min)
  - What's a DAG? (1 min)
- Your first data pipeline
  - Define your DAG (3 min)
  - Taskflow: Define your DAG object (2 min)
  - Define your first task (2 min)
  - The context manager (1 min)
  - Taskflow: Define your first task (2 min)
  - Define dependencies (4 min)
  - The default arguments (1 min)
  - Taskflow: Define dependencies (2 min)
  - Practice: Creating your second data pipeline
- Finishing up...
  - Quiz!
  - Wrap Up
  - How was it?
Airflow: DAGs 101
Learn the basics of how to create a data pipeline in Airflow.
Welcome! We're so glad you're here 😍
Creating data pipelines in Airflow doesn't have to be difficult.
It's actually pretty easy.
There are just a few things to know so you're all set up to create reliable data pipelines that follow best practices.
🎯Objectives
At the end of this course, you'll be able to:
- Create a data pipeline from scratch
- Describe which parameters you should always define, and why
- Create your first tasks with the PythonOperator
- Define dependencies between tasks
👥 Audience
Who should take this course:
- Data Engineers
- Data Analysts
- Software Engineers
- Data Scientists
Set aside 17 minutes to complete the course.
💻 Setup Requirements
You need to have the following:
- Docker and Docker Compose installed on your computer (see: Get Docker)
- The Astro CLI
- Access to a web browser