
Building a Real-Time ML System. Together

48h 20m 35s
English
Paid

Course description

Learn to design, develop, deploy, and scale end-to-end real-time ML systems using Python, Rust, LLMs, and Kubernetes.


What awaits you:

  1. More than 150 hours of recorded sessions from the previous 4 cohorts.
  2. Complete source code for two projects: a cryptocurrency price prediction system and a credit card fraud detection system.
  3. 50 hours of live coding and practice in each cohort.


In this interactive practical course, we will create a real-time machine learning system from scratch, deploy it, and scale it. In previous cohorts, we built a cryptocurrency price predictor, while in the next one, we will create a transaction fraud detection system.

The course is suitable for ML engineers, data scientists, and developers who are already familiar with the basics of ML (you have trained at least one model) and are ready to move from theory to practice.

You will learn to:

  1. Develop microservice architectures with real-time ML.
  2. Apply a universal approach: Feature → Training → Inference Pipeline (see the sketch below this list).
  3. Use modern tools: Kafka, Feature Store, Experiment Tracker, Model Registry, and Kubernetes.
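
To make the Feature → Training → Inference Pipeline idea concrete before the lessons dive into Kafka and Kubernetes, here is a minimal, dependency-free Python sketch of the three stages. Every name in it (Trade, feature_pipeline, training_pipeline, inference_pipeline) is illustrative only and is not taken from the course repository.

```python
# A minimal, dependency-free sketch of the Feature -> Training -> Inference
# split. Everything here (Trade, the rolling-mean "features", the averaging
# "model") is a toy stand-in, not the course's actual code.
from dataclasses import dataclass
from statistics import mean
from typing import List


@dataclass
class Trade:
    product_id: str
    price: float


def feature_pipeline(trades: List[Trade], window: int = 3) -> List[float]:
    """Feature pipeline: turn raw trades into features (rolling mean prices)."""
    prices = [t.price for t in trades]
    return [mean(prices[i - window:i]) for i in range(window, len(prices) + 1)]


def training_pipeline(features: List[float]) -> float:
    """Training pipeline: fit a trivial 'model' (the mean of past features)."""
    return mean(features)


def inference_pipeline(model: float, latest_feature: float) -> float:
    """Inference pipeline: combine the trained baseline with the newest feature."""
    return 0.5 * model + 0.5 * latest_feature


if __name__ == "__main__":
    trades = [Trade("BTC/USD", p) for p in (100.0, 101.0, 99.5, 102.0, 103.5)]
    features = feature_pipeline(trades)              # feature pipeline
    model = training_pipeline(features)              # training pipeline
    print(inference_pipeline(model, features[-1]))   # inference pipeline
```

In the actual system, the same stages are built as separate containerized services, with Kafka, a Feature Store, and a Model Registry (the tools listed above) carrying data and models between them instead of in-memory lists.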


This is not theory or "passive learning". This is a chance to build a working system and boost your career.

Watch Online

This is a demo lesson.

You can watch up to 10 minutes for free. Subscribe to unlock all 188 lessons in this course and access 10,000+ hours of premium content across all courses.


All Course Lessons (188)

# | Lesson Title | Duration
1
What's new in this cohort? (Demo)
07:24
2
How to install the tools
19:55
3
How to create the local Kubernetes cluster
08:26
4
How to open a GitHub issue
06:38
5
ML System design - Part 1
31:30
6
ML System design - Part 2
11:02
7
Dev and Prod environments - Say hi to Marius!
04:24
8
Bootstrap the uv workspace & Install Kafka in our dev environment
26:26
9
Install Kafka UI in our dev environment
14:37
10
Push fake data into Kafka
25:08
11
Push real trade data to Kafka - Part 1
31:11
12
Push real trade data to Kafka - Part 2
16:05
13
.gitignore
03:33
14
Why Docker?
04:23
15
Wrap up
03:05
16
3 questions for YOU
03:21
17
Working inside the devcontainer - Let's (re)create the dev cluster
15:38
18
Dockerfile for the trades service
10:49
19
How to deploy the trades service to the dev Kubernetes cluster
27:00
20
Debugging, debugging, debugging
30:36
21
Decouple config parameters from business logic code in the trades service
31:11
22
Let's recap
03:28
23
Pre-commits for automatic code linting and formatting
16:52
24
Candles service boilerplate code
26:10
25
Open question -> How to do double port-forwarding?
01:38
26
Wrap up
01:57
27
Plan for today
04:36
28
Redeploy the trades service to the dev cluster
28:16
29
Add key=product_id to the trade messages - otherwise the candles service cannot process them
28:13
30
Deploy the candles service to the dev Kubernetes cluster
14:21
31
Horizontal scaling of the candles service - Kafka consumer groups to the rescue!
14:22
32
Build and push the Docker image for the candles service to the production GitHub Container Registry
30:26
33
Deploy the candles service to PROD cluster
27:07
34
Technical-indicators service boilerplate code
21:13
35
Wrap up
01:51
36
Quick recap before we start office hours
10:59
37
How to spin up a PROD Kubernetes cluster
28:41
38
How to Monitor a Kubernetes cluster
06:26
39
One of my candles service replicas is waiting for messages. Why?
06:49
40
A detour around ZenML and Flyte
06:59
41
What's the project repo structure of this course?
11:59
42
If you like it, please recommend us :-)
03:35
43
We want you, Marius
13:48
44
How to inspect docker build logs - How to do port-forwarding of services
14:40
45
More port forwarding
19:39
46
How to push data into a Kafka topic?
08:58
47
Again: 1 Kafka topic partition and 2 consumer replicas means one of them will be IDLE
16:20
48
Problems building the Docker image?
22:14
49
Wrap up
00:24
50
Goals for today
07:07
51
What are technical indicators and how to compute them in real time
10:18
52
Custom stateful transformation to compute indicators - Part 1
24:11
53
Other libraries for real time data processing
12:02
54
Custom stateful transformation to compute indicators - Part 2
34:35
55
Custom stateful transformation to compute indicators - Part 3
09:42
56
Write Dockerfile with talib library
26:31
57
Some homework for you
14:15
58
Deploying the technical indicators service to Kubernetes
28:37
59
Wrap up
01:46
60
Goals for today
04:46
61
Installing RisingWave in Kubernetes
20:48
62
Pull-based ingestion from Kafka topic to RisingWave
14:11
63
Pull-based vs Push-based data ingestion
28:16
64
Installing Grafana and adding RisingWave as a data source
14:57
65
Plotting candles with Grafana
20:40
66
Ingesting historical trades from Kraken REST API
39:01
67
Custom stateful transformation to compute indicators - Part 2 + Homework
24:19
68
Wrap up
03:19
69
Goals for today
07:05
70
Bash script to build and push Docker images to either `dev` or `prod` Kubernetes cluster
22:07
71
Bash script to deploy services to either `dev` or `prod`
17:15
72
Squashing 2 bugs in the trades service
21:55
73
Deploying the trades-historical to Kubernetes - Part 1
13:44
74
Deploying the trades-historical to Kubernetes - Part 2
04:36
75
Adding custom timestamp extractor in our candles service
04:08
76
Deploying the whole backfill pipeline
30:16
77
ConfigMap for our backfill pipeline
38:02
78
How to scale the backfill pipeline to process 100x volume of trades
09:07
79
Wrap up
01:43
80
Goals for today
11:56
81
Installing MLflow in Kubernetes cluster
28:50
82
Start building the training pipeline -> Load data from RisingWave
23:22
83
Adding the target column to the dataframe
08:44
84
Data validation with Great Expectations
14:01
85
Automatic Exploratory Data Analysis (EDA)
17:10
86
Instrumenting our training runs with MLflow
30:47
87
Build a baseline model
24:31
88
Question -> Do we need GPUs to train our model?
01:22
89
Wrap up
05:57
90
Goals for today
06:31
91
Finding a good model candidate with LazyPredict
42:48
92
Logging model candidate table to MLflow
22:07
93
What is model hyper-parameter tuning?
10:56
94
Hyperparameter tuning with Optuna
27:02
95
Fixing things
28:06
96
Scaling the inputs for our regularised linear model (HuberRegressor) - Part 1
15:05
97
Scaling the inputs for our regularised linear model (HuberRegressor) - Part 2
16:01
98
Wrap up
01:57
99
Goals for today
09:02
100
Refactoring the model selection step of the training pipeline
19:06
101
Checking and dropping NaN values
34:22
102
Validating and pushing the model to the registry
15:34
103
Extracting training inputs into a TrainingConfig
14:04
104
Homework -> Add data and model drift reports to each training run using Evidently
09:13
105
Dockerfile for the training-pipeline
14:34
106
What is Kustomize?
18:57
107
Job manifest for the training-pipeline
18:55
108
Debugging the Job (without success so far)
13:22
109
Wrap up
01:26
110
Goals for today
04:23
111
Deploying the training pipeline as a CronJob
16:52
112
Adjust the deployment script to use kustomize build if there is a kustomization.yaml
05:55
113
Building the prediction generator - Part 1 - Bug fixing the name of the model in the registry
27:26
114
Building the prediction generator - Part 2 - Prediction handler
29:08
115
Building the prediction generator - Part 3 - Loading the model signature
22:49
116
Building the prediction generator - Part 4 - Saving predictions to RisingWave table
35:58
117
Building the prediction generator - Part 5 - Deploying the prediction generator
24:40
118
Wrap up
03:00
119
Goals for today
04:05
120
Fixing a bug in the prediction_handler
11:14
121
Question -> Where to log what?
03:16
122
Rebuild docker image and deploy prediction generator to Kubernetes
15:02
123
Let's build the prediction API in Rust - Part 1 - The tools you need
11:22
124
Let's build the prediction API in Rust - Part 2 - REST API skeleton
21:19
125
Let's build the prediction API in Rust - Part 3 - Unwrapping the unwrappable
16:21
126
Let's build the prediction API in Rust - Part 4 - Predictions endpoint
14:36
127
Let's build the prediction API in Rust - Part 5 - Connecting to PostgreSQL
21:37
128
Let's build the prediction API in Rust - Part 6 - Debugging
43:48
129
Let's build the prediction API in Rust - Part 7 - Squashing the bug
02:57
130
Wrap up
03:35
131
5/14/2025 11:02 AM CEST recording
01:21
132
5/14/2025 11:04 AM CEST recording
31:34
133
5/14/2025 11:35 AM CEST recording
05:22
134
5/14/2025 11:41 AM CEST recording
06:07
135
5/14/2025 11:48 AM CEST recording
15:02
136
5/14/2025 12:03 PM CEST recording
06:23
137
5/14/2025 12:10 PM CEST recording
08:10
138
5/14/2025 12:18 PM CEST recording
06:35
139
5/14/2025 12:34 PM CEST recording
35:50
140
5/14/2025 1:10 PM CEST recording
07:47
141
5/14/2025 1:18 PM CEST recording
15:00
142
5/14/2025 1:34 PM CEST recording
09:10
143
5/14/2025 1:43 PM CEST recording
09:03
144
5/14/2025 1:52 PM CEST recording
04:17
145
5/14/2025 1:57 PM CEST recording
01:42
146
Before we begin...
07:00
147
Goals for today
08:03
148
Questions
06:31
149
Adding the PgPool to the app State (so we don't need to recreate it every time we get a request)
22:48
150
Creating a RisingWave materialized view with the latest predictions for each coin
16:14
151
Custom config object to load and hold env variable values
17:36
152
Adding the Config to the app State
09:48
153
Adding some logging
11:41
154
Dockerizing our Prediction API Rust service
23:09
155
Mad scientist experiment to reduce the Docker image size with a scratch layer
07:04
156
Deploying to Kubernetes
27:15
157
Plan for the last 3 sessions
03:42
158
Wrap up
01:44
159
Goals for today
08:12
160
How to download crypto news from a REST API (Cryptopanic)
26:21
161
Load Cryptopanic API using env variables and pydantic-settings
07:12
162
Custom Quixstreams Stateful Source to ingest news into Kafka
27:10
163
Inspecting the news messages - Kafka UI
18:18
164
News sentiment service - iteration 1
21:56
165
Unpacking sentiment scores as N kafka messages
21:28
166
BAML to build LLMs with structured output (like the sentiment-extractor we want to build!)
26:55
167
Testing our BAML function
05:39
168
Wrap up
02:02
169
A question I forgot to answer!
01:09
170
6.4.2025 11:03 AM CEST recording
04:01
171
6.4.2025 11:08 AM CEST recording
31:30
172
6.4.2025 11:40 AM CEST recording
10:01
173
6.4.2025 11:50 AM CEST recording
33:31
174
6.4.2025 12:35 PM CEST recording
24:07
175
6.4.2025 1:00 PM CEST recording
04:38
176
6.4.2025 1:05 PM CEST recording
09:07
177
6.4.2025 1:14 PM CEST recording
27:14
178
6.4.2025 1:42 PM CEST recording
17:16
179
6.4.2025 1:59 PM CEST recording
04:20
180
Goals for today
05:22
181
Implement the evaluation metric
22:00
182
Manual prompt improvement
07:06
183
Automatic prompt optimization
11:21
184
How to use open-weights LLMs with Ollama
34:36
185
Kubernetes manifests to deploy news and news-sentiment services
30:16
186
MARIUS -> How to set up a GPU node in a production Kubernetes cluster
44:51
187
Time to say (see you later)!
03:29
188
Wrap up
04:36

Unlock unlimited learning

Get instant access to all 188 lessons in this course, plus thousands of other premium courses. One subscription, unlimited knowledge.

Learn more about subscription

Books

Read the book: Building a Real-Time ML System. Together

  1. slides_session_1
  2. slides_session_2
  3. slides_session_3
  4. slides_session_4
  5. slides_session_5
  6. slides_session_6
  7. slides_session_7
  8. slides_session_8
  9. slides_session_9
  10. slides_session_10
  11. slides_session_11
  12. slides_session_12
  13. slides_session_13
  14. slides_session_14
  15. slides_session_15

