2022 Python for Machine Learning & Data Science Masterclass
Welcome to the most complete course on learning Data Science and Machine Learning on the internet! After teaching over 2 million students, I've worked for over a year to put together what I believe to be the best way to go from zero to hero in data science and machine learning with Python. This course is designed for the student who already knows some Python and is ready to dive deeper into using those Python skills for Data Science and Machine Learning.
The typical starting salary for a data scientist can be over $150,000, and we've created this course to help guide students to a set of skills that will make them extremely hirable in today's job market.
We'll cover everything you need to know for the full data science and machine learning tech stack required at the world's top companies. Our students have gotten jobs at McKinsey, Facebook, Amazon, Google, Apple, Asana, and other top tech companies! Drawing on our experience teaching both online and in person, we've structured the course to guide you through not just how to use data science and machine learning libraries, but why we use them. The course is balanced between practical, real-world case studies and the mathematical theory behind the machine learning algorithms.
We cover advanced machine learning algorithms that most other courses don't, including advanced regularization methods and state-of-the-art unsupervised learning methods such as DBSCAN.
This comprehensive course is designed to be on par with bootcamps that usually cost thousands of dollars, and it includes the following topics:
Programming with Python
NumPy with Python
Deep dive into Pandas for Data Analysis
Full understanding of Matplotlib Programming Library
Deep dive into seaborn for data visualizations
Machine Learning with Scikit-Learn (a short example sketch follows this list), including:
Linear Regression
Regularization
Lasso Regression
Ridge Regression
Elastic Net
K Nearest Neighbors
K Means Clustering
Decision Trees
Random Forests
Natural Language Processing
Support Vector Machines
Hierarchical Clustering
DBSCAN
PCA
Model Deployment
and much, much more!
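
To give a flavor of the Scikit-Learn material listed above, here is a minimal sketch of the kind of workflow the course builds toward: a train/test split, feature scaling, Ridge (L2) regularization, and cross-validation. The synthetic data and hyperparameter values below are illustrative assumptions, not material taken from the course itself.

```python
# Minimal illustrative Scikit-Learn workflow: split, scale, regularize, cross-validate.
import numpy as np
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error

# Synthetic regression data (100 samples, 3 features) -- illustrative only
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.3, size=100)

# Hold out a test set
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=101)

# Scale features, fitting the scaler on the training data only to avoid leakage
scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)
X_test_scaled = scaler.transform(X_test)

# Ridge regression (L2 regularization) with 5-fold cross-validation on the training set
model = Ridge(alpha=1.0)
cv_scores = cross_val_score(model, X_train_scaled, y_train, cv=5, scoring="neg_mean_squared_error")
print("Cross-validated MSE:", -cv_scores.mean())

# Final fit and evaluation on the held-out test set
model.fit(X_train_scaled, y_train)
print("Test MSE:", mean_squared_error(y_test, model.predict(X_test_scaled)))
```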
As always, we're grateful for the chance to teach you data science, machine learning, and Python, and we hope you will join us inside the course to boost your skill set!
Watch Online 2022 Python for Machine Learning & Data Science Masterclass
# | Title | Duration |
---|---|---|
1 | COURSE OVERVIEW LECTURE - PLEASE DO NOT SKIP! | 04:18 |
2 | Anaconda Python and Jupyter Install and Setup | 13:50 |
3 | Environment Setup | 09:09 |
4 | Python Crash Course - Part One | 16:08 |
5 | Python Crash Course - Part Two | 12:08 |
6 | Python Crash Course - Part Three | 11:20 |
7 | Python Crash Course - Exercise Questions | 01:30 |
8 | Python Crash Course - Exercise Solutions | 09:27 |
9 | Machine Learning Pathway | 10:17 |
10 | Introduction to NumPy | 02:15 |
11 | NumPy Arrays | 22:42 |
12 | NumPy Indexing and Selection | 11:07 |
13 | NumPy Operations | 08:15 |
14 | NumPy Exercises | 01:19 |
15 | NumPy Exercises - Solutions | 07:06 |
16 | Introduction to Pandas | 04:41 |
17 | Series - Part One | 09:29 |
18 | Series - Part Two | 10:42 |
19 | DataFrames - Part One - Creating a DataFrame | 19:28 |
20 | DataFrames - Part Two - Basic Properties | 08:19 |
21 | DataFrames - Part Three - Working with Columns | 13:58 |
22 | DataFrames - Part Four - Working with Rows | 14:31 |
23 | Pandas - Conditional Filtering | 17:42 |
24 | Pandas - Useful Methods - Apply on Single Column | 13:48 |
25 | Pandas - Useful Methods - Apply on Multiple Columns | 17:24 |
26 | Pandas - Useful Methods - Statistical Information and Sorting | 15:49 |
27 | Missing Data - Overview | 12:00 |
28 | Missing Data - Pandas Operations | 18:33 |
29 | GroupBy Operations - Part One | 15:50 |
30 | GroupBy Operations - Part Two - MultiIndex | 14:19 |
31 | Combining DataFrames - Concatenation | 10:25 |
32 | Combining DataFrames - Inner Merge | 12:05 |
33 | Combining DataFrames - Left and Right Merge | 06:08 |
34 | Combining DataFrames - Outer Merge | 10:39 |
35 | Pandas - Text Methods for String Data | 16:06 |
36 | Pandas - Time Methods for Date and Time Data | 21:01 |
37 | Pandas Input and Output - CSV Files | 10:21 |
38 | Pandas Input and Output - HTML Tables | 14:42 |
39 | Pandas Input and Output - Excel Files | 07:21 |
40 | Pandas Input and Output - SQL Databases | 18:20 |
41 | Pandas Pivot Tables | 21:16 |
42 | Pandas Project Exercise Overview | 05:27 |
43 | Pandas Project Exercise Solutions | 26:32 |
44 | Introduction to Matplotlib | 04:07 |
45 | Matplotlib Basics | 12:36 |
46 | Matplotlib - Understanding the Figure Object | 07:33 |
47 | Matplotlib - Implementing Figures and Axes | 14:32 |
48 | Matplotlib - Figure Parameters | 04:57 |
49 | Matplotlib - Subplots Functionality | 19:18 |
50 | Matplotlib Styling - Legends | 07:03 |
51 | Matplotlib Styling - Colors and Styles | 14:30 |
52 | Advanced Matplotlib Commands (Optional) | 03:53 |
53 | Matplotlib Exercise Questions Overview | 06:11 |
54 | Matplotlib Exercise Questions - Solutions | 16:40 |
55 | Introduction to Seaborn | 03:55 |
56 | Scatterplots with Seaborn | 18:20 |
57 | Distribution Plots - Part One - Understanding Plot Types | 09:36 |
58 | Distribution Plots - Part Two - Coding with Seaborn | 16:15 |
59 | Categorical Plots - Statistics within Categories - Understanding Plot Types | 05:41 |
60 | Categorical Plots - Statistics within Categories - Coding with Seaborn | 09:16 |
61 | Categorical Plots - Distributions within Categories - Understanding Plot Types | 13:21 |
62 | Categorical Plots - Distributions within Categories - Coding with Seaborn | 17:58 |
63 | Seaborn - Comparison Plots - Understanding the Plot Types | 05:33 |
64 | Seaborn - Comparison Plots - Coding with Seaborn | 09:48 |
65 | Seaborn Grid Plots | 13:40 |
66 | Seaborn - Matrix Plots | 13:19 |
67 | Seaborn Plot Exercises Overview | 06:45 |
68 | Seaborn Plot Exercises Solutions | 14:34 |
69 | Capstone Project Overview | 12:49 |
70 | Capstone Project Solutions - Part One | 17:16 |
71 | Capstone Project Solutions - Part Two | 14:51 |
72 | Capstone Project Solutions - Part Three | 19:50 |
73 | Introduction to Machine Learning Overview Section | 05:14 |
74 | Why Machine Learning? | 09:16 |
75 | Types of Machine Learning Algorithms | 07:48 |
76 | Supervised Machine Learning Process | 13:42 |
77 | Companion Book - Introduction to Statistical Learning | 02:53 |
78 | Introduction to Linear Regression Section | 01:40 |
79 | Linear Regression - Algorithm History | 09:23 |
80 | Linear Regression - Understanding Ordinary Least Squares | 15:44 |
81 | Linear Regression - Cost Functions | 08:13 |
82 | Linear Regression - Gradient Descent | 12:00 |
83 | Python Coding Simple Linear Regression | 19:38 |
84 | Overview of Scikit-Learn and Python | 08:27 |
85 | Linear Regression - Scikit-Learn Train Test Split | 15:49 |
86 | Linear Regression - Scikit-Learn Performance Evaluation - Regression | 15:45 |
87 | Linear Regression - Residual Plots | 13:58 |
88 | Linear Regression - Model Deployment and Coefficient Interpretation | 17:47 |
89 | Polynomial Regression - Theory and Motivation | 08:00 |
90 | Polynomial Regression - Creating Polynomial Features | 10:55 |
91 | Polynomial Regression - Training and Evaluation | 09:45 |
92 | Bias Variance Trade-Off | 10:35 |
93 | Polynomial Regression - Choosing Degree of Polynomial | 13:38 |
94 | Polynomial Regression - Model Deployment | 06:08 |
95 | Regularization Overview | 06:40 |
96 | Feature Scaling | 10:00 |
97 | Introduction to Cross Validation | 12:54 |
98 | Regularization Data Setup | 08:38 |
99 | L2 Regularization - Ridge Regression Theory | 14:30 |
100 | L2 Regularization - Ridge Regression - Python Implementation | 17:43 |
101 | L1 Regularization - Lasso Regression - Background and Implementation | 15:03 |
102 | L1 and L2 Regularization - Elastic Net | 18:08 |
103 | Linear Regression Project - Data Overview | 04:31 |
104 | Introduction to Feature Engineering and Data Preparation | 15:29 |
105 | Dealing with Outliers | 26:34 |
106 | Dealing with Missing Data: Part One - Evaluation of Missing Data | 10:43 |
107 | Dealing with Missing Data: Part Two - Filling or Dropping Data Based on Rows | 20:41 |
108 | Dealing with Missing Data: Part Three - Fixing Data Based on Columns | 23:17 |
109 | Dealing with Categorical Data - Encoding Options | 12:48 |
110 | Section Overview and Introduction | 03:15 |
111 | Cross Validation - Test/Train Split | 11:21 |
112 | Cross Validation - Test/Validation/Train Split | 14:49 |
113 | Cross Validation - cross_val_score | 11:38 |
114 | Cross Validation - cross_validate | 06:57 |
115 | Grid Search | 12:15 |
116 | Linear Regression Project Overview | 03:27 |
117 | Linear Regression Project - Solutions | 12:11 |
118 | Introduction to Logistic Regression Section | 05:28 |
119 | Logistic Regression - Theory and Intuition - Part One: The Logistic Function | 05:37 |
120 | Logistic Regression - Theory and Intuition - Part Two: Linear to Logistic | 04:55 |
121 | Logistic Regression - Theory and Intuition - Linear to Logistic Math | 17:01 |
122 | Logistic Regression - Theory and Intuition - Best fit with Maximum Likelihood | 15:43 |
123 | Logistic Regression with Scikit-Learn - Part One - EDA | 13:58 |
124 | Logistic Regression with Scikit-Learn - Part Two - Model Training | 06:39 |
125 | Classification Metrics - Confusion Matrix and Accuracy | 09:46 |
126 | Classification Metrics - Precision, Recall, F1-Score | 06:01 |
127 | Classification Metrics - ROC Curves | 07:14 |
128 | Logistic Regression with Scikit-Learn - Part Three - Performance Evaluation | 15:57 |
129 | Multi-Class Classification with Logistic Regression - Part One - Data and EDA | 08:08 |
130 | Multi-Class Classification with Logistic Regression - Part Two - Model | 15:48 |
131 | Logistic Regression Exercise Project Overview | 04:00 |
132 | Logistic Regression Project Exercise - Solutions | 21:37 |
133 | Introduction to KNN Section | 02:12 |
134 | KNN Classification - Theory and Intuition | 11:19 |
135 | KNN Coding with Python - Part One | 13:41 |
136 | KNN Coding with Python - Part Two - Choosing K | 23:26 |
137 | KNN Classification Project Exercise Overview | 03:19 |
138 | KNN Classification Project Exercise Solutions | 14:13 |
139 | Introduction to Support Vector Machines | 01:30 |
140 | History of Support Vector Machines | 04:42 |
141 | SVM - Theory and Intuition - Hyperplanes and Margins | 13:26 |
142 | SVM - Theory and Intuition - Kernel Intuition | 04:58 |
143 | SVM - Theory and Intuition - Kernel Trick and Mathematics | 20:51 |
144 | SVM with Scikit-Learn and Python - Classification Part One | 11:00 |
145 | SVM with Scikit-Learn and Python - Classification Part Two | 16:03 |
146 | SVM with Scikit-Learn and Python - Regression Tasks | 21:00 |
147 | Support Vector Machine Project Overview | 04:28 |
148 | Support Vector Machine Project Solutions | 18:32 |
149 | Introduction to Tree Based Methods | 01:23 |
150 | Decision Tree - History | 09:05 |
151 | Decision Tree - Terminology | 04:13 |
152 | Decision Tree - Understanding Gini Impurity | 07:53 |
153 | Constructing Decision Trees with Gini Impurity - Part One | 07:33 |
154 | Constructing Decision Trees with Gini Impurity - Part Two | 11:25 |
155 | Coding Decision Trees - Part One - The Data | 19:19 |
156 | Coding Decision Trees - Part Two - Creating the Model | 20:57 |
157 | Introduction to Random Forests Section | 01:47 |
158 | Random Forests - History and Motivation | 11:39 |
159 | Random Forests - Key Hyperparameters | 03:00 |
160 | Random Forests - Number of Estimators and Features in Subsets | 10:57 |
161 | Random Forests - Bootstrapping and Out-of-Bag Error | 12:47 |
162 | Coding Classification with Random Forest Classifier - Part One | 11:37 |
163 | Coding Classification with Random Forest Classifier - Part Two | 22:23 |
164 | Coding Regression with Random Forest Regressor - Part One - Data | 04:29 |
165 | Coding Regression with Random Forest Regressor - Part Two - Basic Models | 13:34 |
166 | Coding Regression with Random Forest Regressor - Part Three - Polynomials | 10:31 |
167 | Coding Regression with Random Forest Regressor - Part Four - Advanced Models | 10:37 |
168 | Introduction to Boosting Section | 01:48 |
169 | Boosting Methods - Motivation and History | 06:12 |
170 | AdaBoost Theory and Intuition | 19:52 |
171 | AdaBoost Coding Part One - The Data | 11:14 |
172 | AdaBoost Coding Part Two - The Model | 18:10 |
173 | Gradient Boosting Theory | 10:23 |
174 | Gradient Boosting Coding Walkthrough | 12:49 |
175 | Introduction to Supervised Learning Capstone Project | 14:24 |
176 | Solution Walkthrough - Supervised Learning Project - Data and EDA | 18:19 |
177 | Solution Walkthrough - Supervised Learning Project - Cohort Analysis | 23:10 |
178 | Solution Walkthrough - Supervised Learning Project - Tree Models | 21:24 |
179 | Introduction to NLP and Naive Bayes Section | 02:37 |
180 | Naive Bayes Algorithm - Part One - Bayes Theorem | 08:05 |
181 | Naive Bayes Algorithm - Part Two - Model Algorithm | 17:56 |
182 | Feature Extraction from Text - Part One - Theory and Intuition | 10:34 |
183 | Feature Extraction from Text - Coding Count Vectorization Manually | 18:54 |
184 | Feature Extraction from Text - Coding with Scikit-Learn | 11:25 |
185 | Natural Language Processing - Classification of Text - Part One | 11:24 |
186 | Natural Language Processing - Classification of Text - Part Two | 10:19 |
187 | Text Classification Project Exercise Overview | 04:38 |
188 | Text Classification Project Exercise Solutions | 15:38 |
189 | Unsupervised Learning Overview | 08:18 |
190 | Introduction to K-Means Clustering Section | 02:15 |
191 | Clustering General Overview | 10:37 |
192 | K-Means Clustering Theory | 11:31 |
193 | K-Means Clustering - Coding Part One | 19:49 |
194 | K-Means Clustering Coding Part Two | 17:19 |
195 | K-Means Clustering Coding Part Three | 14:33 |
196 | K-Means Color Quantization - Part One | 13:54 |
197 | K-Means Color Quantization - Part Two | 14:34 |
198 | K-Means Clustering Exercise Overview | 07:48 |
199 | K-Means Clustering Exercise Solution - Part One | 13:11 |
200 | K-Means Clustering Exercise Solution - Part Two | 15:52 |
201 | K-Means Clustering Exercise Solution - Part Three | 08:21 |
202 | Introduction to Hierarchical Clustering | 00:51 |
203 | Hierarchical Clustering - Theory and Intuition | 11:49 |
204 | Hierarchical Clustering - Coding Part One - Data and Visualization | 16:13 |
205 | Hierarchical Clustering - Coding Part Two - Scikit-Learn | 28:23 |
206 | Introduction to DBSCAN Section | 01:01 |
207 | DBSCAN - Theory and Intuition | 17:27 |
208 | DBSCAN versus K-Means Clustering | 12:24 |
209 | DBSCAN - Hyperparameter Theory | 07:16 |
210 | DBSCAN - Hyperparameter Tuning Methods | 21:56 |
211 | DBSCAN - Outlier Project Exercise Overview | 05:56 |
212 | DBSCAN - Outlier Project Exercise Solutions | 23:21 |
213 | Introduction to Principal Component Analysis | 02:48 |
214 | PCA Theory and Intuition - Part One | 10:25 |
215 | PCA Theory and Intuition - Part Two | 11:13 |
216 | PCA - Manual Implementation in Python | 18:17 |
217 | PCA - Scikit-Learn | 12:10 |
218 | PCA - Project Exercise Overview | 07:22 |
219 | PCA - Project Exercise Solution | 17:04 |
220 | Model Deployment Section Overview | 02:20 |
221 | Model Deployment Considerations | 06:52 |
222 | Model Persistence | 21:08 |
223 | Model Deployment as an API - General Overview | 07:42 |
224 | Model API - Creating the Script | 17:01 |
225 | Testing the API | 07:50 |
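
The course closes with model persistence and serving a trained model as a simple API (lectures 220-225 above). The listing does not specify the exact deployment stack, so the sketch below assumes joblib for persistence and Flask for the API, purely as an illustration of the general idea.

```python
# Illustrative sketch only: persist a trained model with joblib and serve predictions via Flask.
# The file name, route, and framework choice here are assumptions, not the course's exact setup.
import joblib
import numpy as np
from flask import Flask, request, jsonify
from sklearn.linear_model import LinearRegression

# Train and persist a toy model (stand-in for a real final model)
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([2.0, 4.0, 6.0, 8.0])
joblib.dump(LinearRegression().fit(X, y), "final_model.joblib")

app = Flask(__name__)
model = joblib.load("final_model.joblib")

@app.route("/predict", methods=["POST"])
def predict():
    # Expects JSON like {"features": [[5.0]]}
    data = request.get_json()
    preds = model.predict(np.array(data["features"]))
    return jsonify({"prediction": preds.tolist()})

if __name__ == "__main__":
    app.run(port=5000)
```

A quick test of the running API could POST {"features": [[5.0]]} to http://localhost:5000/predict and read the prediction back from the JSON response.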