Complete linear algebra: theory and implementation
You need to learn linear algebra! Linear algebra is perhaps the most important branch of mathematics for the computational sciences, including machine learning, AI, data science, statistics, simulations, computer graphics, multivariate analyses, matrix decompositions, signal processing, and more. And you need to know applied linear algebra, not just abstract linear algebra!
The way linear algebra is presented in 30-year-old textbooks differs from how professionals use it on computers to solve real-world problems in machine learning, data science, statistics, and signal processing. For example, the "determinant" of a matrix is important in linear algebra theory, but should you actually use the determinant in practical applications? The answer may surprise you, and it's in this course!
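As a taste of that practical point (and of the code challenge on determinants of singular matrices in the curriculum below), here is a minimal NumPy sketch; the matrix construction is invented for illustration and is not taken from the course materials:

```python
import numpy as np

rng = np.random.default_rng(0)

# A matrix that is singular by construction: column 0 duplicates column 1,
# so the true determinant is exactly zero.
A = rng.standard_normal((30, 30))
A[:, 0] = A[:, 1]

# The numerically computed determinant is NOT zero (floating-point rounding),
# and for large matrices it can even end up far from zero.
print(np.linalg.det(A))

# The rank, computed via the SVD with a tolerance, exposes the singularity.
print(np.linalg.matrix_rank(A))  # 29, not 30
```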
If you are interested in learning the mathematical concepts of linear algebra and matrix analysis, but also want to apply those concepts to data analysis on computers (e.g., statistics or signal processing), then this course is for you!
Unique aspects of this course
Clear and comprehensible explanations of concepts and theories in linear algebra.
Several distinct explanations of the same ideas, which is a proven technique for learning.
Visualizations using graphs, numbers, and spaces that strengthen the geometric intuition of linear algebra.
Implementations in MATLAB and Python. C'mon, in the real world you never solve math problems by hand! You need to know how to implement math in software (see the sketch just after this list).
Beginning-to-intermediate topics, including vectors, matrix multiplication, least-squares projections, eigendecomposition, and singular value decomposition.
Strong focus on modern, applications-oriented aspects of linear algebra and matrix analysis.
Intuitive visual explanations of diagonalization, eigenvalues and eigenvectors, and singular value decomposition.
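To preview what implementing the math looks like in practice, here is a minimal Python sketch of the least-squares left inverse covered later in the curriculum, compared against NumPy's built-in solver. The data and variable names are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up noisy data from a line y = 2 + 0.5x
x = np.linspace(0, 10, 50)
y = 2.0 + 0.5 * x + rng.standard_normal(50)

# Design matrix with an intercept column
A = np.column_stack([np.ones_like(x), x])

# Least-squares via the left inverse (A^T A)^{-1} A^T, as derived by hand
beta_hand = np.linalg.inv(A.T @ A) @ A.T @ y

# The same solution via NumPy's (numerically stabler) routine
beta_np, *_ = np.linalg.lstsq(A, y, rcond=None)

print(beta_hand)  # approximately [2.0, 0.5]
print(beta_np)    # agrees with the hand-rolled version
```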
Benefits of learning linear algebra
Understand statistics, including least-squares, regression, and multivariate analyses.
Improve mathematical simulations in engineering, computational biology, finance, and physics.
Understand data compression and dimension reduction (PCA, SVD, eigendecomposition); a minimal sketch follows this list.
Understand the math underlying machine learning and linear classification algorithms.
Deeper knowledge of signal processing methods, particularly filtering and multivariate subspace methods.
Explore the link between linear algebra, matrices, and geometry.
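For example, PCA (covered near the end of the curriculum) reduces to a few lines once you understand the SVD. Here is a minimal sketch; the data, array shapes, and scaling are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: 200 samples, 3 features with very different variances
X = rng.standard_normal((200, 3)) @ np.diag([3.0, 1.0, 0.2])

# PCA via the SVD: center the data, then decompose;
# the rows of Vt are the principal axes
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Squared singular values, normalized, give percent variance per component
explained = s**2 / np.sum(s**2)
print(explained)  # dominated by the first component

# Project the data onto the principal axes
scores = Xc @ Vt.T
```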
Why I am qualified to teach this course:
I have been using linear algebra extensively in my research and teaching (primarily in MATLAB) for many years, and I have written several textbooks about data analysis, programming, and statistics that rely extensively on concepts in linear algebra.
Requirements:
- Basic understanding of high-school algebra (e.g., solve for x in 2x=5)
- Interest in learning about matrices and vectors!
- (optional) Computer with MATLAB, Octave, or Python (or Jupyter)
Who this course is for:
- Anyone interested in learning about matrices and vectors
- Students who want supplemental instruction/practice for a linear algebra course
- Engineers who want to refresh their knowledge of matrices and decompositions
- Biologists who want to learn more about the math behind computational biology
- Data scientists (linear algebra is everywhere in data science!)
- Statisticians
- Someone who wants to know the important math underlying machine learning
- Someone who studied theoretical linear algebra and who wants to implement concepts in computers
- Computational scientists (statistics, biological, engineering, neuroscience, psychology, physics, etc.)
- Someone who wants to learn about eigendecomposition, diagonalization, and singular value decomposition!
What you'll learn:
- Understand theoretical concepts in linear algebra, including proofs
- Implement linear algebra concepts in scientific programming languages (MATLAB, Python)
- Apply linear algebra concepts to real datasets
- Ace your linear algebra exam!
- Apply linear algebra on computers with confidence
- Gain additional insight into solving linear algebra problems, including homework assignments and applications
- Be confident in learning advanced linear algebra topics
- Understand some of the important math underlying machine learning
- Manually corrected closed-captions
Course curriculum
# | Title | Duration
---|---|---
1 | What is linear algebra? | 08:04 |
2 | Linear algebra applications | 05:58 |
3 | An enticing start to a linear algebra course! | 13:14 |
4 | How best to learn from this course | 04:00 |
5 | Maximizing your Udemy experience | 07:58 |
6 | Using MATLAB, Octave, or Python in this course | 07:31 |
7 | Algebraic and geometric interpretations of vectors | 12:46 |
8 | Vector addition and subtraction | 08:27 |
9 | Vector-scalar multiplication | 09:08 |
10 | Vector-vector multiplication: the dot product | 10:12 |
11 | Dot product properties: associative, distributive, commutative | 18:56 |
12 | Code challenge: dot products with matrix columns | 08:46 |
13 | Vector length | 06:43 |
14 | Dot product geometry: sign and orthogonality | 23:39 |
15 | Code challenge: dot product sign and scalar multiplication | 12:06 |
16 | Code challenge: is the dot product commutative? | 09:33 |
17 | Vector Hadamard multiplication | 03:44 |
18 | Outer product | 10:18 |
19 | Vector cross product | 09:06 |
20 | Vectors with complex numbers | 08:18 |
21 | Hermitian transpose (a.k.a. conjugate transpose) | 16:22 |
22 | Interpreting and creating unit vectors | 07:59 |
23 | Code challenge: dot products with unit vectors | 13:34 |
24 | Dimensions and fields in linear algebra | 07:55 |
25 | Subspaces | 15:51 |
26 | Subspaces vs. subsets | 05:48 |
27 | Span | 13:30 |
28 | Linear independence | 15:35 |
29 | Basis | 11:52 |
30 | Matrix terminology and dimensionality | 08:15 |
31 | A zoo of matrices | 17:20 |
32 | Matrix addition and subtraction | 08:29 |
33 | Matrix-scalar multiplication | 02:34 |
34 | Code challenge: is matrix-scalar multiplication a linear operation? | 07:29 |
35 | Transpose | 10:25 |
36 | Complex matrices | 01:52 |
37 | Diagonal and trace | 09:08 |
38 | Code challenge: linearity of trace | 09:38 |
39 | Broadcasting matrix arithmetic | 14:14 |
40 | Introduction to standard matrix multiplication | 10:28 |
41 | Four ways to think about matrix multiplication | 11:56 |
42 | Code challenge: matrix multiplication by layering | 09:46 |
43 | Matrix multiplication with a diagonal matrix | 03:43 |
44 | Order-of-operations on matrices | 08:16 |
45 | Matrix-vector multiplication | 16:44 |
46 | 2D transformation matrices | 15:33 |
47 | Code challenge: Pure and impure rotation matrices | 12:39 |
48 | Code challenge: Geometric transformations via matrix multiplications | 15:59 |
49 | Additive and multiplicative matrix identities | 06:20 |
50 | Additive and multiplicative symmetric matrices | 15:17 |
51 | Hadamard (element-wise) multiplication | 05:01 |
52 | Code challenge: symmetry of combined symmetric matrices | 12:04 |
53 | Multiplication of two symmetric matrices | 13:22 |
54 | Code challenge: standard and Hadamard multiplication for diagonal matrices | 06:28 |
55 | Code challenge: Fourier transform via matrix multiplication! | 11:21 |
56 | Frobenius dot product | 11:17 |
57 | Matrix norms | 18:12 |
58 | Code challenge: conditions for self-adjoint | 11:53 |
59 | What about matrix division? | 04:25 |
60 | Rank: concepts, terms, and applications | 10:51 |
61 | Computing rank: theory and practice | 23:02 |
62 | Rank of added and multiplied matrices | 11:47 |
63 | Code challenge: reduced-rank matrix via multiplication | 10:39 |
64 | Code challenge: scalar multiplication and rank | 12:11 |
65 | Rank of A^TA and AA^T | 10:42 |
66 | Code challenge: rank of multiplied and summed matrices | 07:07 |
67 | Making a matrix full-rank by "shifting" | 14:13 |
68 | Code challenge: is this vector in the span of this set? | 11:47 |
69 | Course tangent: self-accountability in online learning | 03:04 |
70 | Column space of a matrix | 13:30 |
71 | Column space, visualized in code | 06:36 |
72 | Row space of a matrix | 04:26 |
73 | Null space and left null space of a matrix | 14:40 |
74 | Column/left-null and row/null spaces are orthogonal | 10:48 |
75 | Dimensions of column/row/null spaces | 08:11 |
76 | Example of the four subspaces | 11:10 |
77 | More on Ax=b and Ax=0 | 07:53 |
78 | Systems of equations: algebra and geometry | 19:40 |
79 | Converting systems of equations to matrix equations | 04:24 |
80 | Gaussian elimination | 14:43 |
81 | Echelon form and pivots | 07:22 |
82 | Reduced row echelon form | 18:30 |
83 | Code challenge: RREF of matrices with different sizes and ranks | 12:17 |
84 | Matrix spaces after row reduction | 09:24 |
85 | Determinant: concept and applications | 06:00 |
86 | Determinant of a 2x2 matrix | 07:04 |
87 | Code challenge: determinant of small and large singular matrices | 11:08 |
88 | Determinant of a 3x3 matrix | 13:14 |
89 | Code challenge: large matrices with row exchanges | 06:33 |
90 | Find matrix values for a given determinant | 04:52 |
91 | Code challenge: determinant of shifted matrices | 18:28 |
92 | Code challenge: determinant of matrix product | 10:38 |
93 | Matrix inverse: Concept and applications | 12:41 |
94 | Computing the inverse in code | 06:32 |
95 | Inverse of a 2x2 matrix | 07:56 |
96 | The MCA algorithm to compute the inverse | 13:59 |
97 | Code challenge: Implement the MCA algorithm!! | 18:40 |
98 | Computing the inverse via row reduction | 16:41 |
99 | Code challenge: inverse of a diagonal matrix | 10:51 |
100 | Left inverse and right inverse | 10:15 |
101 | One-sided inverses in code | 12:41 |
102 | Proof: the inverse is unique | 03:17 |
103 | Pseudo-inverse, part 1 | 11:35 |
104 | Code challenge: pseudoinverse of invertible matrices | 06:03 |
105 | Projections in R^2 | 10:00 |
106 | Projections in R^N | 15:25 |
107 | Orthogonal and parallel vector components | 12:39 |
108 | Code challenge: decompose vector to orthogonal components | 16:41 |
109 | Orthogonal matrices | 12:03 |
110 | Gram-Schmidt procedure | 12:44 |
111 | QR decomposition | 21:00 |
112 | Code challenge: Gram-Schmidt algorithm | 20:36 |
113 | Matrix inverse via QR decomposition | 01:46 |
114 | Code challenge: Inverse via QR | 14:20 |
115 | Code challenge: Prove and demonstrate the Sherman-Morrison inverse | 17:27 |
116 | Code challenge: A^TA = R^TR | 06:01 |
117 | Introduction to least-squares | 13:13 |
118 | Least-squares via left inverse | 10:08 |
119 | Least-squares via orthogonal projection | 09:19 |
120 | Least-squares via row-reduction | 18:21 |
121 | Model-predicted values and residuals | 07:00 |
122 | Least-squares application 1 | 18:47 |
123 | Least-squares application 2 | 29:41 |
124 | Code challenge: Least-squares via QR decomposition | 10:11 |
125 | What are eigenvalues and eigenvectors? | 12:53 |
126 | Finding eigenvalues | 20:44 |
127 | Shortcut for eigenvalues of a 2x2 matrix | 02:54 |
128 | Code challenge: eigenvalues of diagonal and triangular matrices | 14:25 |
129 | Code challenge: eigenvalues of random matrices | 11:05 |
130 | Finding eigenvectors | 15:57 |
131 | Eigendecomposition by hand: two examples | 09:28 |
132 | Diagonalization | 14:31 |
133 | Matrix powers via diagonalization | 20:37 |
134 | Code challenge: eigendecomposition of matrix differences | 18:15 |
135 | Eigenvectors of distinct eigenvalues | 08:15 |
136 | Eigenvectors of repeated eigenvalues | 12:16 |
137 | Eigendecomposition of symmetric matrices | 14:04 |
138 | Eigenlayers of a matrix | 07:20 |
139 | Code challenge: reconstruct a matrix from eigenlayers | 20:11 |
140 | Eigendecomposition of singular matrices | 05:00 |
141 | Code challenge: trace and determinant, eigenvalues sum and product | 10:57 |
142 | Generalized eigendecomposition | 12:31 |
143 | Code challenge: GED in small and large matrices | 21:10 |
144 | Singular value decomposition (SVD) | 18:41 |
145 | Code challenge: SVD vs. eigendecomposition for square symmetric matrices | 24:32 |
146 | Relation between singular values and eigenvalues | 13:04 |
147 | Code challenge: U from eigendecomposition of A^TA | 18:24 |
148 | Code challenge: A^TA, Av, and singular vectors | 14:34 |
149 | SVD and the four subspaces | 07:35 |
150 | Spectral theory of matrices | 21:57 |
151 | SVD for low-rank approximations | 16:43 |
152 | Convert singular values to percent variance | 15:26 |
153 | Code challenge: When is UV^T valid, what is its norm, and is it orthogonal? | 12:04 |
154 | SVD, matrix inverse, and pseudoinverse | 13:30 |
155 | Condition number of a matrix | 12:48 |
156 | Code challenge: Create matrix with desired condition number | 15:09 |
157 | The quadratic form in algebra | 15:28 |
158 | The quadratic form in geometry | 15:36 |
159 | The normalized quadratic form | 06:36 |
160 | Code challenge: Visualize the normalized quadratic form | 16:21 |
161 | Eigenvectors and the quadratic form surface | 06:18 |
162 | Application of the normalized quadratic form: PCA | 29:02 |
163 | Quadratic form of generalized eigendecomposition | 17:34 |
164 | Matrix definiteness, geometry, and eigenvalues | 12:55 |
165 | Proof: A^TA is always positive (semi)definite | 06:52 |
166 | Proof: Eigenvalues and matrix definiteness | 07:16 |