CourseFlix

The Complete Apache Kafka Practical Guide

8h 38m 15s
English
Paid

This comprehensive practical guide to Apache Kafka offers extensive hands-on activities designed to deepen your understanding of how Apache Kafka works. By mastering its features, you'll find it easier to use Kafka and to troubleshoot issues long after completing the course. You don't need any prior knowledge of Apache Kafka, Java, Node.js, or Python, as everything is taught from the ground up. If you're eager to gain in-depth knowledge of Apache Kafka, this course is perfect for you!

Starting with Apache Kafka Installation

We begin by guiding you through the process of installing Apache Kafka on your computer, whether it's a VPS (Virtual Private Server) or a Virtual Machine. You'll discover that installing Apache Kafka is straightforward; simply download the archive containing executable scripts and run them. Moreover, you'll gain experience in running multiple brokers on the same computer.
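As a rough illustration of what running several brokers on one machine involves (the file names, port, and log path below are assumptions for the sketch, not taken from the course), each broker typically gets its own copy of `config/server.properties` with three values changed:

```properties
# server-1.properties (a copy of config/server.properties for broker 1)
broker.id=1                               # must be unique per broker
listeners=PLAINTEXT://localhost:9093      # each broker needs its own port
log.dirs=/tmp/kafka-logs-1                # each broker needs its own log directory
```

Repeating this with `broker.id=2`, a different port, and a different `log.dirs` gives you a second broker on the same computer.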

Hands-on Practice with Apache Kafka

Next, we delve into a variety of practical exercises to explore different Apache Kafka features and built-in scripts. You will set up Zookeeper, multiple Brokers, Console Consumers, and Console Producers. Additionally, you'll test the Kafka Cluster's performance using its built-in utility, the Performance Monitor.
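For orientation, the built-in scripts the exercises revolve around look roughly like this. These commands assume a broker already running on `localhost:9092` and a topic name `demo` chosen for the sketch; the `--bootstrap-server` flag is from recent Kafka releases (older releases used `--zookeeper` and `--broker-list` for some tools).

```shell
# Create a topic with 3 partitions and no replication
bin/kafka-topics.sh --create --topic demo --partitions 3 \
  --replication-factor 1 --bootstrap-server localhost:9092

# Produce messages interactively from the console
bin/kafka-console-producer.sh --topic demo --bootstrap-server localhost:9092

# Consume all messages, starting from the beginning of the topic
bin/kafka-console-consumer.sh --topic demo --from-beginning \
  --bootstrap-server localhost:9092

# Built-in producer performance test (1000 records of 100 bytes, unthrottled)
bin/kafka-producer-perf-test.sh --topic demo --num-records 1000 \
  --record-size 100 --throughput -1 \
  --producer-props bootstrap.servers=localhost:9092
```

All of these scripts ship in the `bin/` folder of the Kafka archive and require a running cluster, so they are shown here only as a preview of the tooling.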

Practical Apache Kafka Activities

  1. Create a cluster with multiple brokers.
  2. Establish topics with multiple partitions across different brokers.
  3. Create topics with a replication factor to store copies of each message on different brokers for redundancy.
  4. Produce messages using the built-in Console Producer.
  5. Consume messages using the built-in Console Consumer.
  6. Launch multiple consumers within the same consumer group.
  7. Utilize the Performance Monitor to test the performance and speed of Consumers and Producers.
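To see why messages with the same key land on the same partition (activities 2 and 4 above), here is a simplified Python model of a partitioner. This is not Kafka's actual implementation (Kafka's default partitioner hashes keys with murmur2); CRC-32 stands in only to keep the sketch dependency-free and deterministic.

```python
import zlib

def pick_partition(key: bytes, num_partitions: int) -> int:
    """Simplified partitioner: hash the key, take it modulo the
    partition count. Kafka's real default uses murmur2 instead."""
    return zlib.crc32(key) % num_partitions

# The same key always maps to the same partition, so all messages
# for one key stay ordered within a single partition.
p1 = pick_partition(b"user-42", 3)
p2 = pick_partition(b"user-42", 3)
assert p1 == p2 and 0 <= p1 < 3
```

Messages produced without a key are instead spread across partitions (round-robin or sticky batching, depending on the Kafka version), which is what you observe in the multi-partition exercises.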

Utilizing Apache Kafka API

You'll also learn to use the Apache Kafka API for creating your own Consumers and Producers.

  1. Set up a Java Maven project.
  2. Launch Producers and Consumers using Java.
  3. Launch multiple consumers in the same Consumer Group.
  4. Understand and practice the differences between "subscribe" and "assign".
  5. Create a Node.js project.
  6. Launch Producers and Consumers using Node.js.
  7. Create a Python project.
  8. Launch Producers and Consumers using Python.
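The subscribe-versus-assign distinction (item 4 above) can be illustrated without a broker. The sketch below is a toy model, not the Kafka client API, and the consumer names are invented: with "subscribe", a group coordinator spreads a topic's partitions across the group's members (and rebalances when members join or leave), while "assign" pins a consumer to explicit partitions with no group coordination at all.

```python
def group_assign(partitions: list[int], members: list[str]) -> dict[str, list[int]]:
    """Toy 'subscribe' behaviour: round-robin the topic's partitions
    across group members, as a group coordinator would (simplified)."""
    out: dict[str, list[int]] = {m: [] for m in members}
    for i, p in enumerate(partitions):
        out[members[i % len(members)]].append(p)
    return out

# subscribe-style: the group splits 3 partitions between 2 consumers
print(group_assign([0, 1, 2], ["c1", "c2"]))  # {'c1': [0, 2], 'c2': [1]}

# with more consumers than partitions, the extra consumer sits idle
print(group_assign([0, 1, 2], ["c1", "c2", "c3", "c4"]))

# assign-style: each consumer chooses partitions explicitly; there is
# no coordinator, so two consumers may even read the same partition
manual = {"c1": [0, 1], "c2": [1, 2]}
```

The idle-consumer case in the second call is exactly what the consumer-group exercises demonstrate with the real console consumers.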

Tools and File Management

Throughout the course, you'll work with and edit text files, but you won't rely on terminal-based editors. Instead, you'll use the GUI application Visual Studio Code. All configuration and project files are available in a GitHub repository, which also gives you practice using:

  • Git and GitHub
  • Visual Studio Code

About the Author: Udemy

By connecting students all over the world to the best instructors, Udemy is helping individuals reach their goals and pursue their dreams. Udemy is the leading global marketplace for teaching and learning, connecting millions of students to the skills they need to succeed. Udemy helps organizations of all kinds prepare for the ever-evolving future of work. Our curated collection of top-rated business and technical courses gives companies, governments, and nonprofits the power to develop in-house expertise and satisfy employees’ hunger for learning and development.

Watch Online: 129 lessons

All Course Lessons (129)
1. Apache Kafka Installation Overview (01:07, demo)
2. Installing Apache Kafka on the Mac and Unix-like systems (00:53)
3. Installing Apache Kafka on the Mac (05:25)
4. Installing Ubuntu on macOS using VirtualBox (12:41)
5. SECTION 2 Introduction (01:00)
6. Creating remote Ubuntu Virtual Private Server (06:16)
7. Installing Apache Kafka on Virtual Private Server (07:05)
8. SECTION 3 Introduction (01:03)
9. Installing Apache Kafka on Windows (08:26)
10. Starting Zookeeper and Kafka server on Windows (06:32)
11. Installing Ubuntu on Windows using VirtualBox (11:39)
12. Installing Apache Kafka on Ubuntu using GUI (06:16)
13. SECTION 4 Introduction (01:03)
14. Observing contents of the Kafka folder (05:56)
15. Reading and editing Kafka files using Visual Studio Code (06:20)
16. Trying to start Kafka Server (04:11)
17. Observing Kafka Server logs (01:43)
18. Starting Zookeeper (03:12)
19. Starting Kafka Server while Zookeeper is up and running (06:12)
20. Observing logs folder and current Kafka server setup (04:44)
21. SECTION 5 Introduction (00:30)
22. How to connect to Kafka cluster (02:19)
23. Create new Kafka topic (05:05)
24. What happened after creation of the new topic (03:36)
25. Read details about topic (05:22)
26. SECTION 6 Introduction (00:44)
27. Send some messages using Kafka Console Producer (03:13)
28. Consuming messages using Kafka Console Consumer (03:19)
29. Consuming messages from the beginning (01:36)
30. Running multiple consumers (01:31)
31. Running multiple producers (03:14)
32. What was changed in the Kafka logs (09:33)
33. SECTION 7 Introduction (01:36)
34. What is Apache Kafka (04:01)
35. Broker (02:28)
36. Broker cluster (01:54)
37. Zookeeper (01:59)
38. Zookeeper ensemble (03:30)
39. Multiple Kafka clusters (02:20)
40. Default ports of Zookeeper and Broker (03:49)
41. Kafka Topic (02:33)
42. Message structure (03:34)
43. Topics and Partitions (04:34)
44. Spreading messages across partitions (05:28)
45. Partition Leader and Followers (06:24)
46. Controller and its responsibilities (05:17)
47. How Producers write messages to the topic (02:15)
48. How Consumers read messages from the topic (03:34)
49. SECTION 8 Introduction (00:46)
50. GitHub repository and list of basic Kafka commands (07:33)
51. Diagrams for the course (03:43)
52. SECTION 9 Introduction (00:53)
53. Cleaning up existing Kafka installation (02:14)
54. Creating topic with multiple partitions (06:00)
55. How messages were spread across different partitions (06:20)
56. Reading messages from specific partition (02:45)
57. Reading messages from specific offset in specific partition (05:45)
58. Reading details about topic and __consumer_offsets topic (06:46)
59. Summary for multiple partitions example (01:57)
60. SECTION 10 Introduction (00:58)
61. Example overview - run multiple brokers (01:26)
62. Creating separate configuration files for brokers (05:35)
63. Launching three brokers (02:41)
64. Getting cluster information and broker details from Zookeeper (02:48)
65. Creating multiple-partition topic in the Kafka cluster (03:17)
66. Looking at logs folders of every broker (02:14)
67. Producing and consuming messages in the cluster (03:34)
68. Details about topic in the cluster (03:28)
69. Simulating broker failure in the cluster (05:55)
70. Summary for broker cluster and topic without replication (01:46)
71. SECTION 11 Introduction (01:24)
72. Preparing for the next example with replication (02:42)
73. Launching brokers and creating topic with replication (04:42)
74. Observing logs folder and details of the topic (06:06)
75. Producing and consuming messages in the topic with replication (06:10)
76. Observing how messages were stored in the partitions on different brokers (03:06)
77. Bringing down one of three brokers and observing changes (03:34)
78. Bringing down another broker in the cluster (03:45)
79. Bringing back both brokers (01:58)
80. Summary for replication (01:37)
81. SECTION 12 Introduction (00:52)
82. Example with consumer groups overview (01:01)
83. Exploring default consumer groups (10:01)
84. Starting consumer in the custom consumer group (08:13)
85. Starting second consumer in the same consumer group (04:41)
86. Launching one more consumer in the same group (02:00)
87. Idle consumers in the group (05:52)
88. Summary for consumer groups (01:49)
89. SECTION 13 Introduction (00:49)
90. Overview of the performance testing example (01:19)
91. Starting cluster and launching basic performance test (04:23)
92. Increasing performance test parameters (02:52)
93. Testing consumer performance (03:19)
94. Getting non-zero LAG values for consumers (05:19)
95. Performance test example summary (00:59)
96. SECTION 14 Introduction (01:46)
97. Project Files for the Java section (01:06)
98. Installing IntelliJ editor (03:23)
99. Creating and configuring Maven project (05:41)
100. Starting Kafka Cluster (01:39)
101. Creating Java Producer (07:05)
102. Continue Creating Java Producer (07:07)
103. Launching Java Producer (05:05)
104. Explaining most common Producer parameters (12:05)
105. Modifying Serializer type (05:28)
106. Producing meaningful messages with delay (06:07)
107. Refactoring Producer by adding previous example (02:53)
108. Creating consumer with autocommitting - PART 1 (07:49)
109. Creating consumer with autocommitting - PART 2 (08:32)
110. Consumer parameters overview (06:25)
111. Consumer with Manual Committing (10:34)
112. Consumer with Partitions Assignment (05:37)
113. Launching multiple consumers in the same consumer group (07:42)
114. CHALLENGE - Subscribe vs Assign with consumer groups (01:55)
115. SECTION 15 Introduction (00:55)
116. Installing Node.js with NPM (01:56)
117. Starting up Kafka cluster with 3 brokers (02:21)
118. Initializing Node.js project (01:38)
119. Final Node.js project files (01:15)
120. Creating basic Node.js producer (07:03)
121. Producing random animal names (06:43)
122. Creating Node.js consumer (04:54)
123. SECTION 16 Introduction (00:44)
124. Installing Python (01:41)
125. Final Python project files (00:44)
126. Launching basic Python producer (05:36)
127. Launching consumer and receiving messages (03:37)
128. Generating fake names in the messages by producer (06:02)
129. Course Summary (01:08)