Apache Kafka Fundamentals

Duration: 1h 4m 52s · Language: English · Paid

Course description

In this course, you will build the foundational knowledge needed for a confident start with **Apache Kafka**. You will learn how to configure a message queue, write producers and consumers, and understand how Kafka fits into the architecture of data and event processing platforms. After completing the course, you will be able to work with Kafka comfortably and understand how similar cloud tools operate.

1. Fundamentals of Kafka and Message Queues

You will understand what Kafka is and how it is used in stream and event processing systems. You will learn about the key components of Kafka: topics, messages, consumer groups, and how they interact with each other. You will also learn how a message queue works, how data is written and read from it, and why message order and delivery guarantees are important.
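
To make the ordering and delivery-guarantee ideas concrete, here is a minimal sketch using the kafka-python client (an assumption; the course may use a different library). Messages that share a key always land in the same partition, which preserves their relative order, while `acks` controls how strongly the broker must confirm each write. The broker address and topic name are placeholders.

```python
from kafka import KafkaProducer  # assumes: pip install kafka-python

# acks="all" waits for the full in-sync replica set to confirm each write,
# trading a little latency for a stronger delivery guarantee.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",        # placeholder broker address
    acks="all",
    key_serializer=lambda k: k.encode("utf-8"),
    value_serializer=lambda v: v.encode("utf-8"),
)

# Messages with the same key go to the same partition, so their relative
# order is preserved for any consumer reading that partition.
for event in ("created", "paid", "shipped"):
    producer.send("orders", key="order-42", value=event)

producer.flush()  # block until all buffered messages are acknowledged
```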

2. Apache Kafka Architecture

We will delve into the key elements of Kafka's architecture. You will learn what topic partitions are, how they relate to brokers, and how data processing takes place inside Kafka. I will explain what Zookeeper is, what role it plays, and how it interacts with Kafka brokers and manages cluster metadata.
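
As a small preview of how partitions look in code, the sketch below creates a topic split across three partitions using the kafka-python admin client (an assumed library choice; the broker address, topic name, and counts are illustrative only).

```python
from kafka.admin import KafkaAdminClient, NewTopic  # assumes kafka-python

admin = KafkaAdminClient(bootstrap_servers="localhost:9092")  # placeholder address

# A topic is divided into partitions; each partition is hosted by a broker and
# can be read in parallel. replication_factor=1 only makes sense locally.
admin.create_topics([
    NewTopic(name="orders", num_partitions=3, replication_factor=1),
])

print(admin.list_topics())  # confirm the new topic is visible on the cluster
```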

3. Setting Up the Development Environment

You will learn how to run Kafka in a Windows environment using Docker. I will provide a step-by-step guide to setting up a Bitnami Kafka Docker container and give practical tips for installing and launching the environment successfully.
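
Once the Bitnami container is up, a quick way to confirm that Python can reach the broker is to list its topics. This is only a sketch and assumes the container exposes a listener on localhost:9092; adjust the address to whatever ports your Docker setup maps.

```python
from kafka import KafkaConsumer  # assumes kafka-python

# Connect without subscribing, just to probe the broker; construction raises
# NoBrokersAvailable if nothing is listening at this address.
probe = KafkaConsumer(bootstrap_servers="localhost:9092")  # placeholder address
print("Topics visible on the broker:", probe.topics())
probe.close()
```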

4. Practicing with Kafka

You will set up your own Kafka topic and master the basic commands for working with it. You will also create a producer to write messages and a consumer to read them. We will test their operation using Python and learn to manage consumer offsets with the offset checker.
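
As a preview of this hands-on module, here is a sketch of the consumer side and of a rough offset check in the spirit of the offset checker (the producer side mirrors the sketch in module 1). It assumes the kafka-python client plus a placeholder topic `orders`, group id `demo-group`, and broker address.

```python
from kafka import KafkaConsumer
from kafka.admin import KafkaAdminClient  # both from kafka-python

# Read messages as part of a consumer group; committed offsets record how far
# the group has progressed in each partition.
consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",   # placeholder broker address
    group_id="demo-group",
    auto_offset_reset="earliest",
    enable_auto_commit=False,
    value_deserializer=lambda v: v.decode("utf-8"),
    consumer_timeout_ms=5000,             # stop iterating once the topic is idle
)
for record in consumer:
    print(f"partition={record.partition} offset={record.offset} value={record.value}")
    consumer.commit()                     # commit only after processing
consumer.close()

# A rough "offset checker": compare the group's committed offsets with the
# latest offset in each partition to estimate consumer lag.
admin = KafkaAdminClient(bootstrap_servers="localhost:9092")
committed = admin.list_consumer_group_offsets("demo-group")
probe = KafkaConsumer(bootstrap_servers="localhost:9092")
latest = probe.end_offsets(list(committed))
for tp, meta in committed.items():
    print(f"{tp.topic}[{tp.partition}] committed={meta.offset} lag={latest[tp] - meta.offset}")
probe.close()
```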

5. Kafka in Data Processing Platforms

In conclusion, we will explore how Kafka can be integrated into a Data Science platform. You will see three practical scenarios for using Kafka:

  • ETL ingest pipeline
  • Multiple consumer processes
  • Multistage stream processing (sketched below)

These examples will help you start implementing Kafka in your everyday work today.
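
As a taste of the multistage scenario, the sketch below reads from one topic, applies a trivial transformation, and republishes the result to a downstream topic that the next stage would consume. The topic names, the transformation, and the broker address are hypothetical, and kafka-python is again an assumed client.

```python
from kafka import KafkaConsumer, KafkaProducer  # assumes kafka-python

consumer = KafkaConsumer(
    "raw_events",                            # hypothetical upstream topic
    bootstrap_servers="localhost:9092",
    group_id="enricher",
    value_deserializer=lambda v: v.decode("utf-8"),
    consumer_timeout_ms=5000,                # stop when no new messages arrive
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: v.encode("utf-8"),
)

# One stage of a multistage pipeline: consume, transform, republish. The next
# stage subscribes to "clean_events" and continues the chain.
for record in consumer:
    cleaned = record.value.strip().lower()   # placeholder transformation
    producer.send("clean_events", value=cleaned)

producer.flush()
consumer.close()
```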

Watch Online

| #  | Title                                     | Duration |
|----|-------------------------------------------|----------|
| 1  | Introduction                              | 02:16    |
| 2  | What is Kafka                             | 09:15    |
| 3  | Basic Kafka Parts                         | 04:25    |
| 4  | Message Queue Basics                      | 07:39    |
| 5  | Topics Partitions & Brokers               | 02:16    |
| 6  | Brokers & Zookeeper                       | 04:40    |
| 7  | Development Environment                   | 02:42    |
| 8  | Bitnami Docker Setup                      | 03:34    |
| 9  | Basic Topic Commands                      | 04:16    |
| 10 | Kafka Producer                            | 05:56    |
| 11 | Kafka Consumer                            | 01:35    |
| 12 | Testing Producer & Consumer               | 02:45    |
| 13 | Working with Consumer Offsets             | 06:46    |
| 14 | Examples How Kafka Fits in Data Platforms | 05:25    |
| 15 | Conclusion                                | 01:22    |

Similar courses

Data Engineering on GCP

Sources: Andreas Kretz
Google Cloud Platform (GCP) is one of the most popular cloud platforms in the world, providing an extensive set of tools and services for building...
1 hour 17 minutes 33 seconds
Snowflake for Data Engineers

Sources: Andreas Kretz
Snowflake is a next-generation cloud data warehouse that everyone is talking about today. The platform operates 100% in the cloud, providing flexible access...
2 hours 4 minutes 8 seconds
Data Analysis for Beginners: Python & Statistics

Sources: zerotomastery.io
This course is your first step into the world of data analysis using one of the main tools for analysts - Python. Without complicated terms, advanced...
6 hours 34 minutes 20 seconds
Azure Data Pipelines with Terraform

Sources: Andreas Kretz
Azure is becoming an increasingly popular platform for companies using the Microsoft 365 ecosystem. If you want to enhance your data engineering skills...
4 hours 20 minutes 29 seconds
Relational Data Modeling

Sources: Eka Ponkratova
Relational modeling is widely used in building transactional databases. You might say, "But I'm not planning to become a backend engineer."
1 hour 52 minutes