Apache Airflow Workflow Orchestration

1h 18m 41s
English
Paid

Course description

Apache Airflow is a platform-independent workflow orchestration tool that provides extensive capabilities for creating and monitoring batch data pipelines. Even the most complex processes are easy to implement with it, thanks to integrations with the key platforms and tools of Data Engineering, including AWS, Google Cloud, and others.

Airflow not only lets you schedule and manage processes but also track job execution in real time, so errors can be identified and resolved quickly.

In brief: today, Airflow is one of the most in-demand and "hyped" tools in the field of pipeline orchestration. It is actively used by companies worldwide, and knowledge of Airflow is becoming an important skill for any data engineer. This is especially relevant for students starting their journey in this field.

Basic Concepts of Airflow

Introduction to the fundamentals of working with Airflow: you will learn how DAGs (Directed Acyclic Graphs) are created, what they consist of (operators, tasks), and how the architecture of Airflow is structured, including the metadata database, scheduler, and web interface. We will also look at examples of event-driven pipelines that can be implemented using Airflow.
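
To make these pieces concrete, here is a minimal sketch of a DAG definition in Airflow 2 with two tasks and one dependency; the DAG id, schedule, and task logic are illustrative placeholders, not code from the course:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def _transform():
    # Stand-in for a real transformation step.
    print("transforming data")


# A DAG is a collection of tasks plus the dependencies (edges) between them.
with DAG(
    dag_id="example_basic_dag",        # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo extracting")
    transform = PythonOperator(task_id="transform", python_callable=_transform)

    # ">>" defines the edge extract -> transform in the graph.
    extract >> transform
```

The scheduler picks up files like this from the DAGs folder, records run and task state in the metadata database, and the web interface visualizes the resulting graph and statuses.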

Installation and Environment Setup

In practice, you will work on a project dealing with weather data processing. The DAG will fetch data from a weather API, transform it, and store it in a Postgres database. You will learn how to:

  • configure the environment using Docker;
  • verify that the web interface and the containers are running;
  • configure the weather API and create the necessary tables in the database (a sketch of this step follows the list).
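
As a rough illustration of the last step, a table for the weather data could be created from a DAG task using the Postgres provider; the connection id, table name, and columns below are assumptions made for the sketch, not the course's exact schema:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.postgres.operators.postgres import PostgresOperator

# Assumes the apache-airflow-providers-postgres package is installed and that a
# connection named "postgres_default" points at the project's Postgres database.
with DAG(
    dag_id="setup_weather_table",        # illustrative DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,              # trigger manually from the web interface
    catchup=False,
) as dag:
    create_weather_table = PostgresOperator(
        task_id="create_weather_table",
        postgres_conn_id="postgres_default",
        sql="""
            CREATE TABLE IF NOT EXISTS weather_data (
                city        TEXT,
                temperature NUMERIC,
                recorded_at TIMESTAMP
            );
        """,
    )
```

In the course itself the environment and connections are configured through Docker Compose and the Airflow UI; the snippet only shows the general shape of a table-creation task.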

Practice: Creating DAGs

You will thoroughly understand the Airflow interface and learn to monitor task statuses. Then you will:

  • create DAGs based on Airflow 2.0 that retrieve and process data;
  • master the TaskFlow API, a modern approach to building DAGs with more convenient syntax;
  • implement parallel task execution (fanout) to run multiple tasks simultaneously; a sketch of both follows this list.
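
To give a feel for both points, here is a hedged sketch of a TaskFlow DAG with a small fanout; the decorated functions, city names, and return values are placeholders rather than the course's weather pipeline:

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(start_date=datetime(2024, 1, 1), schedule_interval="@daily", catchup=False)
def taskflow_fanout_example():
    @task
    def extract() -> dict:
        # Stand-in for a call to the weather API.
        return {"berlin": 21.3, "london": 18.7}

    @task
    def transform(city: str, readings: dict) -> float:
        return round(readings[city], 1)

    readings = extract()

    # Each transform task depends only on extract, so the two instances have
    # no edges between each other and can run in parallel (fanout).
    for city in ("berlin", "london"):
        transform(city, readings)


# Calling the decorated function registers the DAG with Airflow.
taskflow_fanout_example()
```

Because dependencies are expressed through return values, the TaskFlow API also handles the XCom passing between extract and transform for you.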

Watch Online

This is a demo lesson: you can watch up to 10 minutes for free. Subscribe to unlock all 21 lessons in this course and access 10,000+ hours of premium content across all courses.

View Pricing


All Course Lessons (21)

 #   Lesson Title                                         Duration   Access
 1   Introduction                                         01:37      Demo
 2   Airflow Usage                                        03:20
 3   Fundamental Concepts                                 02:48
 4   Airflow Architecture                                 03:10
 5   Example Pipelines                                    04:50
 6   Spotlight 3rd Party Operators                        02:18
 7   Airflow XComs                                        04:33
 8   Project Setup                                        01:44
 9   Docker Setup Explained                               02:07
10   Docker Compose & Starting Containers                 04:24
11   Checking Services                                    01:49
12   Setup WeatherAPI                                     01:34
13   Setup Postgres DB                                    01:59
14   Airflow Webinterface                                 04:38
15   Creating DAG With Airflow 2.0                        09:47
16   Running our DAG                                      04:16
17   Creating DAG With TaskflowAPI                        07:00
18   Getting Data From the API With SimpleHTTPOperator    03:39
19   Writing into Postgres                                04:13
20   Parallel Processing                                  04:16
21   Recap & Outlook                                      04:39

Unlock unlimited learning

Get instant access to all 21 lessons in this course, plus thousands of other premium courses. One subscription, unlimited knowledge.

Learn more about subscription


Similar courses

Case Study in A/B Testing
Sources: LunarTech
Examples from practice in A/B testing - this course will introduce you to the methods of designing, conducting, and analyzing experiments using A/B...
1 hour 56 minutes 17 seconds

Data Analysis for Beginners: Python & Statistics
Sources: zerotomastery.io
This course is your first step into the world of data analysis using one of the main tools for analysts - Python. Without complicated terms, advanced...
6 hours 34 minutes 20 seconds

Spark and Python for Big Data with PySpark
Sources: udemy
Learn the latest Big Data Technology - Spark! And learn to use it with one of the most popular programming languages, Python! One of the most valuable technolog...
10 hours 35 minutes 43 seconds

Data Engineering on Azure
Sources: Kristijan Bakarić
Microsoft Azure is a cloud platform offering more than 200 products and services for data storage, management, virtual machine deployment, and...
1 hour 20 minutes 57 seconds

Data Platform & Pipeline Design
Sources: Andreas Kretz
Data pipelines are a key component of any Data Science platform. Without them, data loading and machine learning model deployment would not be possible. This...
1 hour 59 minutes 5 seconds