Machine Learning: Natural Language Processing in Python (V2)

22h 4m 2s
English
Paid

Course description

Welcome to Machine Learning: Natural Language Processing in Python (Version 2). NLP: Use Markov Models, NLTK, Artificial Intelligence, Deep Learning, Machine Learning, and Data Science in Python.

This is a massive 4-in-1 course covering:

1) Vector models and text preprocessing methods

2) Probability models and Markov models

3) Machine learning methods

4) Deep learning and neural network methods

In part 1, which covers vector models and text preprocessing methods, you will learn why vectors are so essential in data science and artificial intelligence. You will learn about various techniques for converting text into vectors, such as the CountVectorizer and TF-IDF, and you'll learn the basics of neural embedding methods like word2vec and GloVe.
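
To make this concrete, here is a minimal scikit-learn sketch of the idea (illustrative only, with made-up toy documents, and not taken from the course materials):

```python
# Turning a few sentences into count vectors and TF-IDF vectors with scikit-learn.
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs are great",
]

# Bag-of-words counts: one row per document, one column per vocabulary word
count_vec = CountVectorizer()
X_counts = count_vec.fit_transform(docs)
print(count_vec.get_feature_names_out())
print(X_counts.toarray())

# TF-IDF: counts re-weighted so words that appear in every document count for less
tfidf_vec = TfidfVectorizer()
X_tfidf = tfidf_vec.fit_transform(docs)
print(X_tfidf.toarray().round(2))
```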

You'll then apply what you've learned to various tasks, such as:

  • Text classification

  • Document retrieval / search engine

  • Text summarization

Along the way, you'll also learn important text preprocessing steps, such as tokenization, stemming, and lemmatization.
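
As a rough preview (again, not the course's own code), those preprocessing steps might look like this with NLTK:

```python
# Tokenization, stemming, and lemmatization with NLTK (illustrative sketch).
import nltk
from nltk.stem import PorterStemmer, WordNetLemmatizer

nltk.download("punkt")    # tokenizer model (newer NLTK releases may also need "punkt_tab")
nltk.download("wordnet")  # dictionary used by the lemmatizer

sentence = "The striped bats were hanging on their feet"
tokens = nltk.word_tokenize(sentence.lower())
print(tokens)

stemmer = PorterStemmer()
lemmatizer = WordNetLemmatizer()
print([stemmer.stem(t) for t in tokens])                   # crude rule-based suffix stripping
print([lemmatizer.lemmatize(t, pos="v") for t in tokens])  # dictionary lookup, given a POS hint
```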

You'll be introduced briefly to classic NLP tasks such as parts-of-speech tagging.
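
For instance, here is a quick illustrative NLTK snippet for POS tagging (the example sentence is invented):

```python
# Parts-of-speech tagging in a few lines of NLTK.
import nltk
nltk.download("averaged_perceptron_tagger")  # tagger model (resource name may differ in newer NLTK releases)

tokens = ["The", "cat", "chased", "the", "mouse"]
print(nltk.pos_tag(tokens))
# expected output along the lines of:
# [('The', 'DT'), ('cat', 'NN'), ('chased', 'VBD'), ('the', 'DT'), ('mouse', 'NN')]
```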

In part 2, which covers probability models and Markov models, you'll learn about one of the most important models in all of data science and machine learning in the past 100 years. It has been applied in many areas in addition to NLP, such as finance, bioinformatics, and reinforcement learning.

In this course, you'll see how such probability models can be used in various ways, such as:

  • Building a text classifier

  • Article spinning

  • Text generation (generating poetry); a toy sketch follows this list
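
To give a flavor of the Markov-model idea behind these tasks, here is a toy sketch (not the course's implementation) that learns first-order word-to-word transitions from a tiny invented corpus and samples new text from them:

```python
# A toy first-order Markov model for text generation.
import random
from collections import defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# count word -> next-word transitions
transitions = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    transitions[current_word].append(next_word)  # repeated entries encode the probabilities

# generate text by repeatedly sampling the next word given the current one
word = "the"
generated = [word]
for _ in range(8):
    if word not in transitions:  # dead end: no observed successor
        break
    word = random.choice(transitions[word])
    generated.append(word)

print(" ".join(generated))
```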

Importantly, these methods are an essential prerequisite for understanding how the latest Transformer (attention) models, such as BERT and GPT-3, work. Specifically, we'll learn about two important tasks that correspond to the pre-training objectives of BERT and GPT.

In part 3, which covers machine learning methods, you'll learn about more of the classic NLP tasks, such as:

  • Spam detection

  • Sentiment analysis

  • Latent semantic analysis (also known as latent semantic indexing)

  • Topic modeling

This section will be application-focused rather than theory-focused: instead of spending most of your effort on the details of the various ML algorithms, you'll focus on how they can be applied to the tasks above.
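
As an example of what that looks like in practice, latent semantic analysis can be sketched in a few lines of scikit-learn (illustrative only; the toy documents and the choice of two components are made up):

```python
# Latent semantic analysis (LSA): TF-IDF followed by a truncated SVD.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

docs = [
    "dogs and cats are popular pets",
    "cats chase mice around the house",
    "stocks and bonds are financial assets",
    "investors buy stocks for the long term",
]

X = TfidfVectorizer().fit_transform(docs)           # documents -> TF-IDF matrix
lsa = TruncatedSVD(n_components=2, random_state=0)  # keep 2 latent "topics"
Z = lsa.fit_transform(X)                            # document coordinates in the latent space
print(Z.round(2))
```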

Of course, you'll still need to learn something about those algorithms in order to understand what's going on. The following algorithms will be used:

  • Naive Bayes

  • Logistic Regression

  • Principal Components Analysis (PCA) / Singular Value Decomposition (SVD)

  • Latent Dirichlet Allocation (LDA)

These are not just "any" machine learning / artificial intelligence algorithms but rather ones that have been staples of NLP and are thus an essential part of any NLP course.
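
As a hedged preview of how such an algorithm gets applied, here is a spam-detection-style pipeline using CountVectorizer and Multinomial Naive Bayes in scikit-learn (the tiny labeled dataset is invented, and this is not the course's code):

```python
# Bag-of-words features feeding a Naive Bayes classifier.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

texts  = ["win a free prize now", "cheap meds online", "meeting at noon", "see you at lunch"]
labels = [1, 1, 0, 0]  # 1 = spam, 0 = not spam

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(texts, labels)
print(model.predict(["free prize meeting"]))
```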

In part 4, which covers deep learning methods, you'll learn about modern neural network architectures that can be applied to solve NLP tasks. Thanks to their great power and flexibility, neural networks can be used to solve any of the aforementioned tasks in the course.

You'll learn about:

  • Feedforward Artificial Neural Networks (ANNs)

  • Embeddings

  • Convolutional Neural Networks (CNNs)

  • Recurrent Neural Networks (RNNs)

The study of RNNs will involve modern architectures such as the LSTM and GRU, which have been widely used by Google, Amazon, Apple, Facebook, etc. for difficult tasks such as language translation, speech recognition, and text-to-speech.
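
To preview the kind of model built in this part, here is a minimal TensorFlow/Keras sketch (illustrative only; the layer sizes and dummy data are arbitrary, not the course's code) of an embedding layer feeding an LSTM for binary text classification:

```python
# A small embedding + LSTM classifier on integer-encoded token sequences.
import numpy as np
import tensorflow as tf

VOCAB_SIZE, SEQ_LEN = 1000, 20

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, 32),       # token ids -> dense vectors
    tf.keras.layers.LSTM(32),                         # read the sequence left to right
    tf.keras.layers.Dense(1, activation="sigmoid"),   # binary prediction
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# dummy data, just to show the expected shapes
X = np.random.randint(0, VOCAB_SIZE, size=(8, SEQ_LEN))
y = np.random.randint(0, 2, size=(8,))
model.fit(X, y, epochs=1, verbose=0)
```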

Since the latest Transformers (such as BERT and GPT-3) are themselves deep neural networks, this part of the course is an essential prerequisite for understanding Transformers.


All Course Lessons (152)

# | Lesson Title | Duration
1 | Introduction and Outline (Demo) | 10:41
2 | Are You Beginner, Intermediate, or Advanced? All are OK! | 05:07
3 | Where to get the Code | 04:18
4 | How to use Github & Extra Coding Tips (Optional) | 08:57
5 | Vector Models & Text Preprocessing Intro | 03:41
6 | Basic Definitions for NLP | 05:02
7 | What is a Vector? | 10:42
8 | Bag of Words | 02:33
9 | Count Vectorizer (Theory) | 13:46
10 | Tokenization | 14:46
11 | Stopwords | 04:52
12 | Stemming and Lemmatization | 12:04
13 | Stemming and Lemmatization Demo | 13:27
14 | Count Vectorizer (Code) | 15:44
15 | Vector Similarity | 11:36
16 | TF-IDF (Theory) | 14:17
17 | (Interactive) Recommender Exercise Prompt | 02:37
18 | TF-IDF (Code) | 20:26
19 | Word-to-Index Mapping | 10:55
20 | How to Build TF-IDF From Scratch | 15:09
21 | Neural Word Embeddings | 10:16
22 | Neural Word Embeddings Demo | 11:26
23 | Vector Models & Text Preprocessing Summary | 03:51
24 | Text Summarization Preview | 01:22
25 | How To Do NLP In Other Languages | 10:42
26 | Suggestion Box | 03:11
27 | Probabilistic Models (Introduction) | 04:47
28 | Markov Models Section Introduction | 02:43
29 | The Markov Property | 07:35
30 | The Markov Model | 12:31
31 | Probability Smoothing and Log-Probabilities | 07:51
32 | Building a Text Classifier (Theory) | 07:30
33 | Building a Text Classifier (Exercise Prompt) | 06:34
34 | Building a Text Classifier (Code pt 1) | 10:33
35 | Building a Text Classifier (Code pt 2) | 12:07
36 | Language Model (Theory) | 10:16
37 | Language Model (Exercise Prompt) | 06:53
38 | Language Model (Code pt 1) | 10:46
39 | Language Model (Code pt 2) | 09:26
40 | Markov Models Section Summary | 03:01
41 | Article Spinning - Problem Description | 07:56
42 | Article Spinning - N-Gram Approach | 04:25
43 | Article Spinner Exercise Prompt | 05:46
44 | Article Spinner in Python (pt 1) | 17:33
45 | Article Spinner in Python (pt 2) | 10:01
46 | Case Study: Article Spinning Gone Wrong | 05:43
47 | Section Introduction | 04:51
48 | Ciphers | 04:00
49 | Language Models (Review) | 16:07
50 | Genetic Algorithms | 21:24
51 | Code Preparation | 04:47
52 | Code pt 1 | 03:07
53 | Code pt 2 | 07:21
54 | Code pt 3 | 04:53
55 | Code pt 4 | 04:04
56 | Code pt 5 | 07:13
57 | Code pt 6 | 05:26
58 | Cipher Decryption - Additional Discussion | 02:57
59 | Section Conclusion | 06:01
60 | Machine Learning Models (Introduction) | 05:51
61 | Spam Detection - Problem Description | 06:33
62 | Naive Bayes Intuition | 11:38
63 | Spam Detection - Exercise Prompt | 02:08
64 | Aside: Class Imbalance, ROC, AUC, and F1 Score (pt 1) | 12:26
65 | Aside: Class Imbalance, ROC, AUC, and F1 Score (pt 2) | 11:03
66 | Spam Detection in Python | 16:24
67 | Sentiment Analysis - Problem Description | 07:28
68 | Logistic Regression Intuition (pt 1) | 17:37
69 | Multiclass Logistic Regression (pt 2) | 06:53
70 | Logistic Regression Training and Interpretation (pt 3) | 08:16
71 | Sentiment Analysis - Exercise Prompt | 04:01
72 | Sentiment Analysis in Python (pt 1) | 10:39
73 | Sentiment Analysis in Python (pt 2) | 08:29
74 | Text Summarization Section Introduction | 05:35
75 | Text Summarization Using Vectors | 05:31
76 | Text Summarization Exercise Prompt | 01:51
77 | Text Summarization in Python | 12:41
78 | TextRank Intuition | 08:04
79 | TextRank - How It Really Works (Advanced) | 10:51
80 | TextRank Exercise Prompt (Advanced) | 01:24
81 | TextRank in Python (Advanced) | 14:34
82 | Text Summarization in Python - The Easy Way (Beginner) | 06:07
83 | Text Summarization Section Summary | 03:23
84 | Topic Modeling Section Introduction | 03:08
85 | Latent Dirichlet Allocation (LDA) - Essentials | 10:55
86 | LDA - Code Preparation | 03:42
87 | LDA - Maybe Useful Picture (Optional) | 01:53
88 | Latent Dirichlet Allocation (LDA) - Intuition (Advanced) | 14:55
89 | Topic Modeling with Latent Dirichlet Allocation (LDA) in Python | 11:39
90 | Non-Negative Matrix Factorization (NMF) Intuition | 10:22
91 | Topic Modeling with Non-Negative Matrix Factorization (NMF) in Python | 05:34
92 | Topic Modeling Section Summary | 01:38
93 | LSA / LSI Section Introduction | 04:07
94 | SVD (Singular Value Decomposition) Intuition | 12:12
95 | LSA / LSI: Applying SVD to NLP | 07:47
96 | Latent Semantic Analysis / Latent Semantic Indexing in Python | 09:16
97 | LSA / LSI Exercises | 06:01
98 | Deep Learning Introduction (Intermediate-Advanced) | 04:58
99 | The Neuron - Section Introduction | 02:21
100 | Fitting a Line | 14:24
101 | Classification Code Preparation | 07:21
102 | Text Classification in Tensorflow | 12:10
103 | The Neuron | 09:59
104 | How does a model learn? | 10:54
105 | The Neuron - Section Summary | 01:52
106 | ANN - Section Introduction | 07:00
107 | Forward Propagation | 09:41
108 | The Geometrical Picture | 09:44
109 | Activation Functions | 17:19
110 | Multiclass Classification | 08:42
111 | ANN Code Preparation | 04:36
112 | Text Classification ANN in Tensorflow | 05:44
113 | Text Preprocessing Code Preparation | 11:34
114 | Text Preprocessing in Tensorflow | 05:31
115 | Embeddings | 10:14
116 | CBOW (Advanced) | 04:08
117 | CBOW Exercise Prompt | 00:58
118 | CBOW in Tensorflow (Advanced) | 19:25
119 | ANN - Section Summary | 01:33
120 | Aside: How to Choose Hyperparameters (Optional) | 06:22
121 | CNN - Section Introduction | 04:35
122 | What is Convolution? | 16:39
123 | What is Convolution? (Pattern Matching) | 05:57
124 | What is Convolution? (Weight Sharing) | 06:42
125 | Convolution on Color Images | 15:59
126 | CNN Architecture | 20:59
127 | CNNs for Text | 08:08
128 | Convolutional Neural Network for NLP in Tensorflow | 05:32
129 | CNN - Section Summary | 01:28
130 | RNN - Section Introduction | 04:47
131 | Simple RNN / Elman Unit (pt 1) | 09:21
132 | Simple RNN / Elman Unit (pt 2) | 09:43
133 | RNN Code Preparation | 09:46
134 | RNNs: Paying Attention to Shapes | 08:27
135 | GRU and LSTM (pt 1) | 17:36
136 | GRU and LSTM (pt 2) | 11:37
137 | RNN for Text Classification in Tensorflow | 05:57
138 | Parts-of-Speech (POS) Tagging in Tensorflow | 19:51
139 | Named Entity Recognition (NER) in Tensorflow | 05:14
140 | Exercise: Return to CNNs (Advanced) | 03:20
141 | RNN - Section Summary | 01:59
142 | Anaconda Environment Setup | 20:21
143 | How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow | 17:15
144 | How to Code by Yourself (part 1) | 15:55
145 | How to Code by Yourself (part 2) | 09:24
146 | Proof that using Jupyter Notebook is the same as not using it | 12:30
147 | How to Succeed in this Course (Long Version) | 10:25
148 | Is this for Beginners or Experts? Academic or Practical? Fast or slow-paced? | 22:05
149 | Machine Learning and AI Prerequisite Roadmap (pt 1) | 11:19
150 | Machine Learning and AI Prerequisite Roadmap (pt 2) | 16:08
151 | What is the Appendix? | 02:49
152 | BONUS | 05:32

