The Complete Apache Kafka Practical Guide

8h 38m 15s
English
Paid
July 20, 2024

This is the most complete practical Apache Kafka guide, packed with hands-on activities. Most importantly, you will learn how Apache Kafka works, so after finishing this course you will find it much easier to use its features and to fix mistakes. You can start with zero knowledge of Apache Kafka, Java, Node.js or Python. Everything is taught from scratch, from the basics to advanced features. If you want to get deep knowledge of Apache Kafka, this course is for you!


We will start by installing Apache Kafka on your computer, on a VPS (Virtual Private Server), or on a virtual machine running on your computer. You will see that installing Apache Kafka is pretty easy: you just download an archive with the executable scripts and run them. You will also learn and practice how to run multiple brokers on the same computer.

Afterwards we will jump into tons of hands-on activities using different Apache Kafka features and built-in scripts. You will launch Zookeeper, multiple brokers, the Console Consumer and the Console Producer. You will also test the performance of the Kafka cluster using the built-in Performance Monitor utility.

In the practice sections you will work through multiple hands-on Apache Kafka activities (a short code sketch follows the list):

  1. Create a cluster with multiple brokers

  2. Create a topic with multiple partitions spread across different brokers

  3. Create topics with a replication factor, so that a copy of every message is stored on different brokers for redundancy

  4. Produce messages using the built-in Console Producer

  5. Consume messages using the built-in Console Consumer

  6. Launch multiple consumers in the same consumer group

  7. Launch the Performance Monitor to test producer and consumer performance
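
As a preview of the cluster exercises above (see item 3), here is a minimal sketch that creates a topic with multiple partitions and a replication factor using Kafka's Java AdminClient. The broker addresses, topic name, partition count and replication factor are illustrative assumptions, not values taken from the course.

```java
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateTopicSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Assumed cluster: three brokers running locally on ports 9092-9094.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG,
                  "localhost:9092,localhost:9093,localhost:9094");

        try (AdminClient admin = AdminClient.create(props)) {
            // Hypothetical topic: 3 partitions, each replicated to 2 brokers.
            NewTopic topic = new NewTopic("demo-topic", 3, (short) 2);
            admin.createTopics(Collections.singletonList(topic)).all().get();
        }
    }
}
```

In the course itself these steps are performed with Kafka's built-in command-line scripts; the sketch simply shows the same ideas expressed through the Java API.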

You will also learn and practice how to use the Apache Kafka API to create your own Consumers and Producers (see the sketch after this list):

  1. Create a Java Maven project

  2. Launch a Producer and a Consumer using Java

  3. Launch multiple consumers in the same Consumer Group

  4. Understand and practice the difference between "subscribe" and "assign"

  5. Create a Node.js project

  6. Launch Producers and Consumers using Node.js

  7. Create a Python project

  8. Launch Producers and Consumers using Python
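
To give a feel for this part of the course, here is a minimal sketch of a Java producer and consumer built on the standard kafka-clients library, assuming a single local broker at localhost:9092 and hypothetical names demo-topic and demo-group. The commented-out assign() call hints at the subscribe-versus-assign distinction explored in the challenge.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class ProducerConsumerSketch {
    public static void main(String[] args) {
        // Producer: send a few string messages to the assumed topic.
        Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            for (int i = 0; i < 5; i++) {
                producer.send(new ProducerRecord<>("demo-topic", "key-" + i, "message " + i));
            }
        }

        // Consumer: join a consumer group and read the messages back.
        Properties consumerProps = new Properties();
        consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group");
        consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            // subscribe(): partitions are assigned dynamically by the group coordinator.
            consumer.subscribe(Collections.singletonList("demo-topic"));
            // assign(): you pick the exact partitions yourself and bypass group rebalancing, e.g.:
            // consumer.assign(Collections.singletonList(new TopicPartition("demo-topic", 0)));

            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("partition=%d offset=%d value=%s%n",
                        record.partition(), record.offset(), record.value());
            }
        }
    }
}
```

To compile a sketch like this in the Maven project, you would add the org.apache.kafka:kafka-clients dependency to pom.xml.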

During the course you will need to view and edit text files. For that you will NOT use the terminal; instead you will use the GUI application Visual Studio Code. All configuration files and project files are also available in the GitHub repository. This means that during this course you will also learn how to use:

  • Git and GitHub

  • Visual Studio Code

Watch Online: The Complete Apache Kafka Practical Guide

# Title Duration
1 Apache Kafka Installation Overview 01:07
2 Installing Apache Kafka on the Mac and Unix-like systems 00:53
3 Installing Apache Kafka on the Mac 05:25
4 Installing Ubuntu on MacOS using VirtualBox 12:41
5 SECTION 2 Introduction 01:00
6 Creating remote Ubuntu Virtual Private Server 06:16
7 Installing Apache Kafka on Virtual Private Server 07:05
8 SECTION 3 Introduction 01:03
9 Installing Apache Kafka on Windows 08:26
10 Starting Zookeeper and Kafka server on Windows 06:32
11 Installing Ubuntu on Windows using VirtualBox 11:39
12 Installing Apache Kafka on Ubuntu using GUI 06:16
13 SECTION 4 Introduction 01:03
14 Observing contents of the Kafka folder 05:56
15 Reading and editing Kafka files using VisualStudio Code 06:20
16 Trying to start Kafka Server 04:11
17 Observing Kafka Server logs 01:43
18 Starting Zookeeper 03:12
19 Starting Kafka Server while Zookeeper is up and running 06:12
20 Observing logs folder and current Kafka server setup 04:44
21 SECTION 5 Introduction 00:30
22 How to connect to Kafka cluster 02:19
23 Create new Kafka topic 05:05
24 What happened after creation of the new topic 03:36
25 Read details about topic 05:22
26 SECTION 6 Introduction 00:44
27 Send some messages using Kafka Console Producer 03:13
28 Consuming messages using Kafka Console Consumer 03:19
29 Consuming messages from the beginning 01:36
30 Running multiple consumers 01:31
31 Running multiple producers 03:14
32 What was changed in the Kafka logs 09:33
33 SECTION 7 Introduction 01:36
34 What is Apache Kafka 04:01
35 Broker 02:28
36 Broker cluster 01:54
37 Zookeeper 01:59
38 Zookeeper ensemble 03:30
39 Multiple Kafka clusters 02:20
40 Default ports of Zookeeper and Broker 03:49
41 Kafka Topic 02:33
42 Message structure 03:34
43 Topics and Partitions 04:34
44 Spreading messages across partitions 05:28
45 Partition Leader and Followers 06:24
46 Controller and its responsibilities 05:17
47 How Producers write messages to the topic 02:15
48 How Consumers read messages from the topic 03:34
49 SECTION 8 Introduction 00:46
50 GitHub repository and list of basic Kafka commands 07:33
51 Diagrams for the course 03:43
52 SECTION 9 Introduction 00:53
53 Cleaning up existing Kafka installation 02:14
54 Creating topic with multiple partitions 06:00
55 How messages were spread across different partitions 06:20
56 Reading messages from specific partition 02:45
57 Reading messages from specific offset in specific partition 05:45
58 Reading details about topic and __consumer_offsets topic 06:46
59 Summary for multiple partitions example 01:57
60 SECTION 10 Introduction 00:58
61 Example overview - run multiple brokers 01:26
62 Creating separate configuration files for brokers 05:35
63 Launching three brokers 02:41
64 Getting cluster information and broker details from Zookeeper 02:48
65 Creating multiple-partition topic in the Kafka cluster 03:17
66 Looking at logs folders of every broker 02:14
67 Producing and consuming messages in the cluster 03:34
68 Details about topic in the cluster 03:28
69 Simulating broker failure in the cluster 05:55
70 Summary for broker cluster and topic without replication 01:46
71 SECTION 11 Introduction 01:24
72 Preparing for the next example with replication 02:42
73 Launching brokers and creating topic with replication 04:42
74 Observing logs folder and details of the topic 06:06
75 Producing and consuming messages in the topic with replication 06:10
76 Observing how messages were stored in the partitions on different brokers 03:06
77 Bringing down one of three brokers and observing changes 03:34
78 Bringing down another broker in the cluster 03:45
79 Bringing back both brokers 01:58
80 Summary for replication 01:37
81 SECTION 12 Introduction 00:52
82 Example with consumer groups overview 01:01
83 Exploring default consumer groups 10:01
84 Starting consumer in the custom consumer group 08:13
85 Starting second consumer in the same consumer group 04:41
86 Launching one more consumer in the same group 02:00
87 Idle consumers in the group 05:52
88 Summary for consumer groups 01:49
89 SECTION 13 Introduction 00:49
90 Overview of the performance testing example 01:19
91 Starting cluster and launching basic performance test 04:23
92 Increasing performance test parameters 02:52
93 Testing consumer performance 03:19
94 Getting non-zero LAG values for consumers 05:19
95 Performance test example summary 00:59
96 SECTION 14 Introduction 01:46
97 Project Files for the Java section 01:06
98 Installing IntelliJ editor 03:23
99 Creating and configuring Maven project 05:41
100 Starting Kafka Cluster 01:39
101 Creating Java Producer 07:05
102 Continue Creating Java Producer 07:07
103 Launching Java Producer 05:05
104 Explaining most common Producer parameters 12:05
105 Modifying Serializer type 05:28
106 Producing meaningful messages with delay 06:07
107 Refactoring Producer by adding previous example 02:53
108 Creating consumer with autocommitting - PART 1 07:49
109 Creating consumer with autocommitting - PART 2 08:32
110 Consumer parameters overview 06:25
111 Consumer with Manual Committing 10:34
112 Consumer with Partitions Assignment 05:37
113 Launching multiple consumers in the same consumer group 07:42
114 CHALLENGE - Subscribe vs Assign with consumer groups 01:55
115 SECTION 15 Introduction 00:55
116 Installing Node.js with NPM 01:56
117 Starting up Kafka cluster with 3 brokers 02:21
118 Initializing Node.js project 01:38
119 Final Node.js project files 01:15
120 Creating basic Node.js producer 07:03
121 Producing random animal names 06:43
122 Creating Node.js consumer 04:54
123 SECTION 16 Introduction 00:44
124 Installing Python 01:41
125 Final Python project files 00:44
126 Launching basic Python producer 05:36
127 Launching consumer and receiving messages 03:37
128 Generating fake names in the messages by producer 06:02
129 Course Summary 01:08
