AI Engineering Bootcamp: Build, Train & Deploy Models with AWS SageMaker
11h 59m 21s
English
Paid
August 30, 2024
Learn to build full-cycle AI applications with AWS SageMaker: collect and prepare your own data, train and customize models, then deploy and scale your AI application in the real world.
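To give a feel for the end-to-end workflow the curriculum covers, here is a minimal sketch of a SageMaker training-and-deployment run using the HuggingFace estimator. The bucket name, entry-point script, instance types, and framework versions are illustrative assumptions; the lessons below walk through the real setup step by step.

```python
# Minimal sketch of the train-and-deploy flow covered in the course.
# Bucket name, script name, instance types, and framework versions are
# illustrative assumptions, not the course's exact values.
import sagemaker
from sagemaker.huggingface import HuggingFace

session = sagemaker.Session()
role = sagemaker.get_execution_role()  # IAM role with SageMaker permissions

# Training job: runs script.py on a managed GPU instance against data in S3.
estimator = HuggingFace(
    entry_point="script.py",          # your training script
    source_dir=".",                   # directory containing the script
    role=role,
    instance_type="ml.g4dn.xlarge",   # example GPU instance type
    instance_count=1,
    transformers_version="4.26",      # assumed framework versions
    pytorch_version="1.13",
    py_version="py39",
    hyperparameters={"epochs": 2, "batch_size": 16},
)
estimator.fit({"training": "s3://your-bucket/training-data/"})  # placeholder S3 URI

# Deployment: create a real-time HTTPS endpoint for inference.
predictor = estimator.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.xlarge",
)
print(predictor.predict({"inputs": "I love this product!"}))

# Clean up to avoid ongoing charges (also covered in the course).
predictor.delete_endpoint()
```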
# | Title | Duration |
---|---|---|
1 | AI Engineering Bootcamp: Learn AWS SageMaker with Patrik Szepesi | 01:36 |
2 | Course Introduction | 08:43 |
3 | Setting Up Our AWS Account | 04:32 |
4 | Set Up IAM Roles + Best Practices | 07:40 |
5 | AWS Security Best Practices | 07:02 |
6 | Set Up AWS SageMaker Domain | 02:23 |
7 | UI Domain Change | 00:43 |
8 | Setting Up SageMaker Environment | 05:09 |
9 | SageMaker Studio and Pricing | 08:45 |
10 | Setup: SageMaker Server + PyTorch | 06:09 |
11 | HuggingFace Models, Sentiment Analysis, and AutoScaling | 18:35 |
12 | Get Dataset for Multiclass Text Classification | 06:04 |
13 | Creating Our AWS S3 Bucket | 03:53 |
14 | Uploading Our Training Data to S3 | 01:27 |
15 | Exploratory Data Analysis - Part 1 | 13:22 |
16 | Exploratory Data Analysis - Part 2 | 06:08 |
17 | Data Visualization and Best Practices | 11:09 |
18 | Setting Up Our Training Job Notebook + Reasons to Use SageMaker | 18:25 |
19 | Python Script for HuggingFace Estimator | 13:37 |
20 | Creating Our Optional Experiment Notebook - Part 1 | 03:22 |
21 | Creating Our Optional Experiment Notebook - Part 2 | 04:02 |
22 | Encoding Categorical Labels to Numeric Values | 13:25 |
23 | Understanding the Tokenization Vocabulary | 15:06 |
24 | Encoding Tokens | 10:57 |
25 | Practical Example of Tokenization and Encoding | 12:49 |
26 | Creating Our Dataset Loader Class | 16:57 |
27 | Setting Up PyTorch DataLoader | 15:10 |
28 | Which Path Will You Take? | 01:32 |
29 | DistilBERT vs. BERT Differences | 04:47 |
30 | Embeddings In A Continuous Vector Space | 07:41 |
31 | Introduction To Positional Encodings | 05:14 |
32 | Positional Encodings - Part 1 | 04:15 |
33 | Positional Encodings - Part 2 (Even and Odd Indices) | 10:11 |
34 | Why Use Sine and Cosine Functions | 05:09 |
35 | Understanding the Nature of Sine and Cosine Functions | 09:53 |
36 | Visualizing Positional Encodings in Sine and Cosine Graphs | 09:25 |
37 | Solving the Equations to Get the Values for Positional Encodings | 18:08 |
38 | Introduction to Attention Mechanism | 03:03 |
39 | Query, Key and Value Matrix | 18:11 |
40 | Getting Started with Our Step by Step Attention Calculation | 06:54 |
41 | Calculating Key Vectors | 20:06 |
42 | Query Matrix Introduction | 10:21 |
43 | Calculating Raw Attention Scores | 21:25 |
44 | Understanding the Mathematics Behind Dot Products and Vector Alignment | 13:33 |
45 | Visualizing Raw Attention Scores in 2D | 05:43 |
46 | Converting Raw Attention Scores to Probability Distributions with Softmax | 09:17 |
47 | Normalization | 03:20 |
48 | Understanding the Value Matrix and Value Vector | 09:08 |
49 | Calculating the Final Context Aware Rich Representation for the Word "River" | 10:46 |
50 | Understanding the Output | 01:59 |
51 | Understanding Multi Head Attention | 11:56 |
52 | Multi Head Attention Example and Subsequent Layers | 09:52 |
53 | Masked Language Learning | 02:30 |
54 | Exercise: Imposter Syndrome | 02:57 |
55 | Creating Our Custom Model Architecture with PyTorch | 17:15 |
56 | Adding the Dropout, Linear Layer, and ReLU to Our Model | 15:32 |
57 | Creating Our Accuracy Function | 13:05 |
58 | Creating Our Train Function | 19:09 |
59 | Finishing Our Train Function | 08:18 |
60 | Setting Up the Validation Function | 13:41 |
61 | Passing Parameters In SageMaker | 04:06 |
62 | Setting Up Model Parameters For Training | 04:28 |
63 | Understanding The Mathematics Behind Cross Entropy Loss | 05:40 |
64 | Finishing Our Script.py File | 06:57 |
65 | Quota Increase | 07:36 |
66 | Starting Our Training Job | 08:16 |
67 | Debugging Our Training Job With AWS CloudWatch | 14:17 |
68 | Analyzing Our Training Job Results | 05:47 |
69 | Creating Our Inference Script For Our PyTorch Model | 08:35 |
70 | Finishing Our PyTorch Inference Script | 09:13 |
71 | Setting Up Our Deployment | 07:31 |
72 | Deploying Our Model To A SageMaker Endpoint | 08:55 |
73 | Introduction to Endpoint Load Testing | 04:20 |
74 | Creating Our Test Data for Load Testing | 10:03 |
75 | Upload Testing Data to S3 | 01:04 |
76 | Creating Our Model for Load Testing | 03:59 |
77 | Starting Our Load Test Job | 07:15 |
78 | Analyze Load Test Results | 10:17 |
79 | Deploying Our Endpoint | 03:51 |
80 | Creating Lambda Function to Call Our Endpoint | 10:27 |
81 | Setting Up Our AWS API Gateway | 05:28 |
82 | Testing Our Model with Postman, API Gateway and Lambda | 05:40 |
83 | Cleaning Up Resources | 02:52 |
84 | Thank You! | 01:18 |
Similar courses to AI Engineering Bootcamp: Build, Train & Deploy Models with AWS SageMaker
Title | Source | Duration |
---|---|---|
Devops Fundamentals - CI/CD with AWS + Docker + Ansible + Jenkins | Udemy | 8h 46m 37s |
AWS AppSync & Amplify with React & GraphQL - Complete Guide | Udemy | 11h 11m 36s |
Build an End-to-End Web App from Scratch in AWS | zerotomastery.io | 31m 54s |
DevOps Deployment Automation with Terraform, AWS and Docker | Udemy | 10h 59m 9s |
Complete Terraform Course - Beginner to Advanced | Udemy (TechWorld with Nana) | 6h 24m 44s |
AWS Certified Security - Specialty | Adrian Cantrill | 39h 15m 45s |
AWS & Typescript Masterclass - CDK V2, Serverless, React | Udemy | 10h 48m 18s |
Serverless Framework Bootcamp: Node.js, AWS & Microservices | Udemy | 5h 24m 21s |