Learn the Basics of Apache Kafka from Scratch and Master Building Reliable, Scalable Real-time Data Processing Systems.
Course Overview
In this course, you will become familiar with Kafka's architecture, how producers and consumers work, stream-processing reliability, message delivery semantics, and tools from the Kafka ecosystem such as Kafka Connect and Schema Registry.
Apache Kafka is at the core of real-time data processing systems used by the most innovative companies in the world. If you want to understand how massive data streams are processed "on the fly," this course will be an excellent starting point.
What You Will Learn
Understanding Kafka Architecture
Gain practical insight into how Kafka is structured: brokers, topics, partitions, and replication, and why that design matters for throughput and fault tolerance.
Building Producers and Consumers
Learn to create robust producers and consumers for data streams, crucial for processing real-time information effectively.
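To give a feel for the pattern you'll build, here is a broker-free sketch of Kafka's core abstraction: an append-only log that a producer writes to and a consumer reads from by offset. The `TopicLog`, `produce`, and `poll` names are illustrative, not part of the Kafka API.

```python
# A minimal, broker-free sketch of the produce/consume pattern.
# TopicLog, produce, and poll are illustrative names, not the Kafka API.

class TopicLog:
    """An append-only log: each record gets a monotonically increasing offset."""
    def __init__(self):
        self.records = []

    def produce(self, value):
        self.records.append(value)
        return len(self.records) - 1  # offset of the appended record

    def poll(self, offset, max_records=10):
        """Return up to max_records starting at `offset` (a consumer-style read)."""
        return self.records[offset:offset + max_records]

log = TopicLog()
for event in ["signup", "click", "purchase"]:
    log.produce(event)

# A consumer tracks its own position in the log, much like a Kafka consumer
# tracks committed offsets.
offset = 0
batch = log.poll(offset)
offset += len(batch)
print(batch)   # → ['signup', 'click', 'purchase']
```

Real Kafka clients add partitioning, batching, and acknowledgements on top of this model, but the offset-based read shown here is the core idea the course builds on.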
Data Processing and Delivery Methods
Understand the trade-offs between the delivery semantics Kafka offers (at-most-once, at-least-once, and exactly-once) and how Kafka ensures reliable data processing.
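To make those trade-offs concrete, the sketch below groups producer settings commonly associated with each delivery guarantee. The keys mirror standard Kafka producer configuration names; the grouping into three dictionaries and the `transactional.id` value are our own illustration.

```python
# Producer settings commonly associated with each delivery guarantee.
# The keys are standard Kafka producer configs; the grouping is illustrative.

at_most_once = {
    "acks": "0",            # fire and forget: no broker acknowledgement
    "retries": 0,           # a lost message is simply lost
}

at_least_once = {
    "acks": "all",          # wait for all in-sync replicas to acknowledge
    "retries": 2147483647,  # retry transient failures (may produce duplicates)
}

exactly_once = {
    "acks": "all",
    "enable.idempotence": True,               # broker deduplicates retries
    "transactional.id": "orders-producer-1",  # hypothetical id enabling transactions
}
```

Reading the three blocks top to bottom shows the trade: each stronger guarantee costs more coordination with the broker, which is exactly the latency-versus-reliability decision the course examines.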
Tools in the Kafka Ecosystem
- Kafka Connect: Discover how to integrate Kafka with other systems using connectors.
- Schema Registry: Learn about managing and validating data formats in Kafka pipelines.
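As a taste of what Kafka Connect configuration looks like, below is a hypothetical source-connector definition of the kind you would POST to the Connect REST API. The `FileStreamSourceConnector` class ships with Kafka as a demo connector, but the connector name, file path, and topic are made up for this example.

```python
import json

# A hypothetical Kafka Connect source-connector definition. FileStreamSource
# ships with Kafka as a demo connector; the name, file, and topic are made up.
connector_config = {
    "name": "demo-file-source",
    "config": {
        "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
        "tasks.max": "1",
        "file": "/var/log/app/events.log",   # file for the connector to tail
        "topic": "app-events",               # Kafka topic to write records to
    },
}

# In practice you would POST this JSON to a running Connect worker, e.g.:
#   curl -X POST -H "Content-Type: application/json" \
#        --data @connector.json http://localhost:8083/connectors
print(json.dumps(connector_config, indent=2))
```

The appeal of Connect is visible even in this sketch: integration is declared as configuration rather than written as custom producer code.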
Course Outcomes
By the end of the course, you won't just understand how Kafka works—you will be able to use it to build reliable, scalable real-time data processing systems, equipped to handle real-world challenges.