Machine Learning & Containers on AWS

1h 33m 34s
English
Paid

Course description

In this practical course, you will learn to build a complete data pipeline on the AWS platform, from acquiring data from the Twitter API to analysis, storage, and visualization. You will apply a machine learning algorithm for sentiment analysis and deploy it on AWS using Lambda. Additionally, you will set up a Postgres database using Amazon RDS. For visualizing the results, you will develop an interactive dashboard with Streamlit and gain experience deploying it in containers using Elastic Container Registry (ECR) and Elastic Container Service (ECS). The course also introduces the Poetry tool and teaches you how to manage your project's dependencies with it.

Course Structure

Twitter API

The Twitter API is a great source of open data. You will learn how to configure access to the API and retrieve tweets from a user's feed for further processing. We will look closely at the API configuration and the data format (payload) it returns.
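
As a rough illustration, fetching a user's timeline with the Twython library (which the course later packages into a Lambda layer) might look like the sketch below; the credentials and screen name are placeholders, and this is not the course's exact code.

```python
# Minimal sketch: fetch recent tweets from a user's timeline with Twython.
# APP_KEY / APP_SECRET / screen_name are placeholders, not real credentials.
from twython import Twython

APP_KEY = "YOUR_APP_KEY"        # from the Twitter developer portal
APP_SECRET = "YOUR_APP_SECRET"

# Obtain an app-only (OAuth 2) bearer token, then re-create the client with it.
twitter = Twython(APP_KEY, APP_SECRET, oauth_version=2)
ACCESS_TOKEN = twitter.obtain_access_token()
twitter = Twython(APP_KEY, access_token=ACCESS_TOKEN)

# Each returned item is the raw JSON payload (a dict) for one tweet.
tweets = twitter.get_user_timeline(screen_name="AWSCloud", count=10)
for tweet in tweets:
    print(tweet["created_at"], tweet["text"])
```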

RDS Database

Every platform needs data storage. You will learn how to set up a Postgres database in Amazon RDS and understand why we will be saving the JSON tweets in this database. You will also work with a virtual private cloud (VPC) to make the database accessible from the internet, and with pgAdmin you will create tables and execute queries against the database.
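
For orientation, a table for raw JSON tweets could be created and filled from Python with psycopg2 roughly as sketched below; the endpoint, credentials, and schema are illustrative assumptions, not the course's exact setup.

```python
# Hedged sketch: connect to an RDS Postgres instance and store a tweet payload as JSONB.
# Host, credentials, and table/column names are placeholders.
import json
import psycopg2

conn = psycopg2.connect(
    host="mydb.xxxxxxxx.eu-central-1.rds.amazonaws.com",  # RDS endpoint (placeholder)
    port=5432,
    dbname="postgres",
    user="postgres",
    password="YOUR_PASSWORD",
)

with conn, conn.cursor() as cur:
    # Keep the raw tweet payload as JSONB so it can still be queried later.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS tweets (
            id BIGINT PRIMARY KEY,
            payload JSONB,
            sentiment REAL
        );
    """)
    cur.execute(
        "INSERT INTO tweets (id, payload, sentiment) VALUES (%s, %s, %s) ON CONFLICT DO NOTHING;",
        (1234567890, json.dumps({"text": "example tweet"}), 0.5),
    )
```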

NLP Lambda

For text analysis, we will use a ready-made machine learning algorithm from the Natural Language Toolkit (NLTK) library. You will create a Lambda function that retrieves tweets from the API, determines their sentiment, and saves the results in the database.
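
A minimal sketch of what such a handler might look like, using NLTK's VADER sentiment analyzer, is shown below; the fetch_tweets and save_tweet helpers are hypothetical stand-ins for the API and database code, and the VADER lexicon is assumed to be available through a layer. This is not the course's exact code.

```python
# Hedged sketch of a sentiment Lambda handler.
from nltk.sentiment.vader import SentimentIntensityAnalyzer

sia = SentimentIntensityAnalyzer()  # requires the vader_lexicon data, e.g. bundled in a layer

def lambda_handler(event, context):
    tweets = fetch_tweets()          # hypothetical helper that calls the Twitter API
    for tweet in tweets:
        # 'compound' is a normalized score in [-1, 1], from negative to positive
        score = sia.polarity_scores(tweet["text"])["compound"]
        save_tweet(tweet, score)     # hypothetical helper that writes to Postgres
    return {"statusCode": 200, "processed": len(tweets)}
```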

To run the Lambda function, you will learn to attach the necessary dependencies through layers: how to import prebuilt layers from Klayers and how to create your own custom layer. You will also learn to set up an automatic trigger for the Lambda function using Amazon EventBridge.
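
The course configures the schedule in the AWS console; purely as an alternative illustration, the same hourly trigger could be wired up with boto3 along the following lines (the function name and ARNs are placeholders).

```python
# Hedged sketch: schedule a Lambda function with an EventBridge rule via boto3.
import boto3

events = boto3.client("events")
lambda_client = boto3.client("lambda")

FUNCTION_ARN = "arn:aws:lambda:eu-central-1:123456789012:function:tweet-sentiment"  # placeholder

# Create (or update) a rule that fires once per hour.
rule = events.put_rule(Name="tweet-sentiment-schedule", ScheduleExpression="rate(1 hour)")

# Point the rule at the Lambda function.
events.put_targets(
    Rule="tweet-sentiment-schedule",
    Targets=[{"Id": "tweet-sentiment-target", "Arn": FUNCTION_ARN}],
)

# Allow EventBridge to invoke the function.
lambda_client.add_permission(
    FunctionName="tweet-sentiment",
    StatementId="allow-eventbridge-invoke",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule["RuleArn"],
)
```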

Dependency Management and Streamlit Application

For visualizing the results, you will create an application using Streamlit. You will set up a local development environment with Anaconda3 and create a conda virtual environment. Using the provided Git repository, you will learn to manage the project's dependencies with Poetry. We will walk through the application code step by step and show how to run it in a new virtual environment for testing.
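
To give a feel for the result, a very small Streamlit dashboard reading sentiment scores from Postgres could look like the sketch below; the query, table, and connection details are assumptions rather than the course's exact code.

```python
# Hedged sketch of a Streamlit dashboard backed by the tweets table.
import pandas as pd
import psycopg2
import streamlit as st

st.title("Tweet sentiment dashboard")

conn = psycopg2.connect(
    host="mydb.xxxxxxxx.eu-central-1.rds.amazonaws.com",  # placeholder endpoint
    dbname="postgres", user="postgres", password="YOUR_PASSWORD",
)

# Pull the timestamp out of the JSON payload together with the stored sentiment score.
df = pd.read_sql("SELECT payload->>'created_at' AS created_at, sentiment FROM tweets;", conn)

st.metric("Average sentiment", round(df["sentiment"].mean(), 3))
st.line_chart(df.set_index("created_at")["sentiment"])
```

Locally, after installing the dependencies with poetry install, such an app would be started with something like poetry run streamlit run app.py.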

Deploying Streamlit Application in ECS

Once the visualization is ready, you will learn to work with Docker images and containers on AWS. You will create an Elastic Container Registry (ECR) repository and set up the AWS CLI. You will also learn to create user groups and individual users with restricted access rights in IAM.

After building the Docker image, you will push it to ECR, configure an ECS Fargate cluster, and deploy your Streamlit application as a task on the cluster.
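
The course performs these steps in the console and the terminal; as a rough boto3 sketch of the same resources (repository, Fargate cluster, and task definition), with the account ID, region, image URI, and role ARN as placeholders, it might look like this. The image itself is built and pushed with the Docker CLI beforehand.

```python
# Hedged sketch: create the ECR repository, an ECS Fargate cluster, and a task definition.
import boto3

ecr = boto3.client("ecr")
ecs = boto3.client("ecs")

# Repository the Docker image gets pushed to (the push itself uses the Docker CLI).
ecr.create_repository(repositoryName="streamlit-app")

# Serverless (Fargate) cluster that will run the dashboard.
ecs.create_cluster(clusterName="streamlit-cluster")

# Task definition: one container running the Streamlit image from ECR on port 8501.
ecs.register_task_definition(
    family="streamlit-task",
    requiresCompatibilities=["FARGATE"],
    networkMode="awsvpc",
    cpu="256",
    memory="512",
    executionRoleArn="arn:aws:iam::123456789012:role/ecsTaskExecutionRole",  # placeholder role
    containerDefinitions=[
        {
            "name": "streamlit",
            "image": "123456789012.dkr.ecr.eu-central-1.amazonaws.com/streamlit-app:latest",
            "portMappings": [{"containerPort": 8501, "protocol": "tcp"}],  # Streamlit's default port
            "essential": True,
        }
    ],
)
```

Once the task is launched on the cluster, the dashboard is served from the container on port 8501, Streamlit's default.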

Watch Online

# Title Duration
1 Introduction video 02:39
2 Project architecture explained 02:07
3 Relational DB 01:27
4 RDS setup 02:38
5 Setting VPC inbound rules for internet access 02:13
6 PG Admin installation & S3 config 04:06
7 Lambda intro & IAM setup 03:12
8 Create Lambda function 01:25
9 The Lambda function code explained 08:23
10 Insert the code into your Lambda function 00:57
11 Add layers to Lambda from Klayers 05:33
12 Create & configure custom layers for twython & psycopg2 04:41
13 Test Lambda & set environment variables 04:54
14 Schedule your Lambda with Event Bridge 03:16
15 Setup virtual conda environment 04:08
16 Poetry dependency installs & run Streamlit UI locally 05:58
17 Streamlit app code explained 07:53
18 Setup container registry ECR 01:53
19 AWS CLI install and ECR login 05:20
20 Dockerfile explained, Docker image build & push image to ECR 02:53
21 Create ECS Fargate cluster 01:35
22 ECS task IAM configuration & Streamlit task creation 05:00
23 Fixing the ECS task 05:15
24 Stopping the task on ECS after you are finished 01:00
25 Conclusion & outlook 05:08

Similar courses

Build a Large Language Model (From Scratch)

Sources: Sebastian Raschka
"Creating a Large Language Model from Scratch" is a practical guide that will teach you step by step how to create, train, and fine-tune large language models..
Case Study in A/B Testing

Sources: LunarTech
Practical examples in A/B testing: this course will introduce you to methods for designing, conducting, and analyzing experiments using A/B...
1 hour 56 minutes 17 seconds
Data Preparation & Cleaning for ML

Data Preparation & Cleaning for ML

Sources: Andrew Jones
Have you ever heard the expression "data preparation and cleaning"? This is perhaps the most important part of the entire machine learning process.
3 hours 7 minutes 23 seconds
PyTorch for Deep Learning with Python Bootcamp

Sources: udemy
Welcome to the best online course for learning about Deep Learning with Python and PyTorch! PyTorch is an open source deep learning platform that provides a sea
17 hours 2 minutes 14 seconds
Time Series Analysis, Forecasting, and Machine Learning

Sources: udemy
Let me cut to the chase. This is not your average Time Series Analysis course. This course covers modern developments such as deep learning, time series classif
22 hours 47 minutes 45 seconds