Here you find my whole video series about Linear Algebra in the correct order, and you also find my book, which you can download for free. On this site, I also want to help you with some text around the videos. If you want to test your knowledge, please use the quizzes, and consult the PDF version of the video if needed. If you have any questions, you can use the comments below and ask anything. Now, without further ado, let’s start:
Part 1 - Introduction
Linear Algebra is a video series I started for everyone who is interested in calculating with vectors and understanding the abstract ideas of vector spaces and linear maps. The course is based on my book Linear Algebra in a Nutshell. We will start with the basics and slowly climb to the peak of the mountains of Linear Algebra. Of course, this is not an easy task, and it will be a hiking tour that we do together. The only knowledge you need to bring with you is what you can learn in my Start Learning Mathematics series. This is also what I explain in the first video.

With this, you already know some important notions of linear algebra, like vector spaces, linear maps, and matrices. Now, in the next video, let us define the first vector space for this course. You can find some more explanation in my book:
Part 2 - Vectors in $ \mathbb{R}^2 $
Let us start by talking about vectors in the plane:

Part 3 - Linear Combinations and Inner Products in $ \mathbb{R}^2 $
Now we talk about linear combinations, the standard inner product and the norm:
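As a quick preview (the notation in the video may differ slightly), for vectors $ x, y \in \mathbb{R}^2 $ and scalars $ \lambda, \mu \in \mathbb{R} $, a linear combination, the standard inner product, and the norm look like this:

$$ \lambda x + \mu y = \begin{pmatrix} \lambda x_1 + \mu y_1 \\ \lambda x_2 + \mu y_2 \end{pmatrix}, \qquad \langle x, y \rangle = x_1 y_1 + x_2 y_2, \qquad \| x \| = \sqrt{\langle x, x \rangle} = \sqrt{x_1^2 + x_2^2}. $$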

Part 4 - Lines in $ \mathbb{R}^2 $
With this, we are now able to define lines in the plane:

Part 5 - Vector Space $ \mathbb{R}^n $
Now we are ready to go more abstract. Let’s define a general vector space by listing all the properties such an object should satisfy. We can visualise this with the most important example.

Part 6 - Linear Subspaces
In the next video, we will discuss a very important concept: linear subspaces. Usually, we just call them subspaces. They can be characterised by three properties.
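One common way to state these three properties (the video makes them precise) is: for a subset $ U $ of a vector space $ V $,

$$ 0 \in U, \qquad u + v \in U \ \text{ for all } u, v \in U, \qquad \lambda u \in U \ \text{ for all scalars } \lambda \text{ and all } u \in U. $$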

Part 7 - Examples for Subspaces
I think that it will be very helpful to look closely at some examples for subspaces. Therefore, the next video will be about explicit calculations.

Part 8 - Linear Span
The next concept we discuss is the so-called span. Other names one uses for this are linear hull or linear span. It simply describes the smallest subspace one can form with a given set of vectors.
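In symbols, for vectors $ v_1, \ldots, v_k $ in a vector space, the span collects all linear combinations one can form from them:

$$ \mathrm{Span}(v_1, \ldots, v_k) = \{ \lambda_1 v_1 + \cdots + \lambda_k v_k \mid \lambda_1, \ldots, \lambda_k \in \mathbb{R} \}. $$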

Part 9 - Inner Product and Norm
As for the vector space $ \mathbb{R}^2 $, we can define the standard inner product and the Euclidean norm in the vector space $ \mathbb{R}^n $.
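For vectors $ x, y \in \mathbb{R}^n $, they are given by:

$$ \langle x, y \rangle = \sum_{i=1}^{n} x_i y_i, \qquad \| x \| = \sqrt{\langle x, x \rangle} = \sqrt{x_1^2 + \cdots + x_n^2}. $$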

Part 10 - Cross Product
In the next part, we will look at an important product that only exists in the vector space $ \mathbb{R}^3 $: the so-called cross product.
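For reference, for $ x, y \in \mathbb{R}^3 $ the cross product is the vector

$$ x \times y = \begin{pmatrix} x_2 y_3 - x_3 y_2 \\ x_3 y_1 - x_1 y_3 \\ x_1 y_2 - x_2 y_1 \end{pmatrix}, $$

which is orthogonal to both $ x $ and $ y $.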

Part 11 - Matrices
When we want to solve systems of linear equations, it’s helpful to introduce so-called matrices:

Part 12 - Systems of Linear Equations
After introducing matrices, we can now see why they are so useful. They can be used to describe systems of linear equations in a compact form.
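As a small preview: a system of $ m $ linear equations in $ n $ unknowns can be written compactly as $ Ax = b $ with an $ m \times n $ matrix $ A $, the vector of unknowns $ x \in \mathbb{R}^n $, and the right-hand side $ b \in \mathbb{R}^m $. For example:

$$ \begin{array}{rcrcl} 2 x_1 & + & x_2 & = & 5 \\ x_1 & - & x_2 & = & 1 \end{array} \quad \Longleftrightarrow \quad \begin{pmatrix} 2 & 1 \\ 1 & -1 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 5 \\ 1 \end{pmatrix}. $$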

Part 13 - Special Matrices
In the next video, we go back to matrices. We will discuss some important names for matrices, like square matrices, upper triangular matrices, and symmetric matrices.

Part 14 - Column Picture of the Matrix-Vector Product
Let’s continue talking about the important matrix-vector multiplication we introduced while explaining systems of linear equations. In the next video, we discuss the so-called column picture of the matrix-vector product.
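In short, the column picture reads $ A x $ as a linear combination of the columns of $ A $, weighted by the entries of $ x $: writing $ a_1, \ldots, a_n $ for the columns,

$$ A x = x_1 a_1 + x_2 a_2 + \cdots + x_n a_n. $$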

Part 15 - Row Picture
Similarly, we can look at the rows of the matrix, which leads us to the row picture of the matrix-vector multiplication.
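In the row picture, each entry of $ A x $ is the standard inner product of the corresponding row of $ A $ with $ x $: writing $ r_i $ for the $ i $-th row and $ a_{ij} $ for the entries of $ A $,

$$ (A x)_i = \langle r_i, x \rangle = \sum_{j=1}^{n} a_{ij} x_j. $$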

Part 16 - Matrix Product
Now, we are ready to define the matrix product.
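For an $ m \times n $ matrix $ A = (a_{ij}) $ and an $ n \times p $ matrix $ B = (b_{jk}) $, the product $ AB $ is the $ m \times p $ matrix with entries

$$ (A B)_{ik} = \sum_{j=1}^{n} a_{ij} b_{jk} \qquad (i = 1, \ldots, m, \; k = 1, \ldots, p). $$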

Part 17 - Properties of the Matrix Product
After defining the matrix product, we can go into the details and check which properties for this new operation hold and which don’t.

Part 18 - Linear Maps (Definition)
Let’s go more abstract again: we will consider so-called linear maps. They are defined as exactly the maps that preserve the linear structure of vector spaces.
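Concretely, a map $ f \colon V \to W $ between vector spaces is called linear if, for all $ x, y \in V $ and all scalars $ \lambda $,

$$ f(x + y) = f(x) + f(y) \qquad \text{and} \qquad f(\lambda x) = \lambda f(x). $$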

Part 19 - Matrices Induce Linear Maps
Now that we know what a linear map is, we can look at some important examples. It turns out that all matrices induce linear maps.

Part 20 - Linear Maps Induce Matrices
The converse of the statement of the previous video is also true. All linear maps induce matrices. This is an important fact because it means that an abstract linear map can be represented by a table of numbers.

Part 21 - Examples of Linear Maps
Linear maps preserve the linear structure. This means that linear subspaces are sent to linear subspaces. Let’s consider some examples.

Part 22 - Linear Independence (Definition)
In the following video, we consider a new abstract notion: linear dependence and linear independence. We first explain the definition.
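In short, vectors $ v_1, \ldots, v_k $ are linearly independent if the only linear combination that gives the zero vector is the trivial one:

$$ \lambda_1 v_1 + \cdots + \lambda_k v_k = 0 \quad \Longrightarrow \quad \lambda_1 = \cdots = \lambda_k = 0. $$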

Part 23 - Linear Independence (Examples)
Now let’s consider examples of linearly independent families of vectors.

Part 24 - Basis of a Subspace
Let’s discuss the concept of a basis.

Part 25 - Coordinates with respect to a Basis
Next, let’s talk about how we calculate with bases. We define coordinates of vectors with respect to a chosen basis. Depending on the problem you want to solve, different bases might be helpful so that the coordinates you calculate with become simpler.

Part 26 - Steinitz Exchange Lemma
The next part will be more technical and about the so-called Steinitz Exchange Lemma. This will be used in some proofs later.

Part 27 - Dimension of a Subspace
After this technical proof, we are now able to define the concept of dimension. This is a natural number that describes the number of degrees of freedom in a subspace.

Part 28 - Conservation of Dimension
The dimension has a nice property: it is conserved, in some sense, under linear maps.

Part 29 - Identity and Inverses
In the next videos, let us discuss some more concrete objects again. First, we look at matrices and define a special one: the identity matrix.

Part 30 - Injectivity, Surjectivity for Square Matrices
We recall the important notions for maps: injectivity and surjectivity. These concepts also apply to linear maps and, therefore, can be transferred to matrices as well. Especially for square matrices, we find a very nice connection:

Part 31 - Inverses of Linear Maps are Linear
Let us quickly prove the important fact that, for bijective linear maps, the inverse map is always linear as well.

Part 32 - Transposition for Matrices
In the next video, we define a matrix operation: the transpose.

Part 33 - Transpose and Inner Product
The transpose of a matrix could also be defined by using the standard inner product. This is an important relation that explains why the transpose is such a useful object.
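The relation in question: for an $ m \times n $ matrix $ A $ and vectors $ x \in \mathbb{R}^n $, $ y \in \mathbb{R}^m $,

$$ \langle A x, y \rangle = \langle x, A^T y \rangle. $$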

Part 34 - Range and Kernel of a Matrix
The following two definitions are very important for the rest of the course: the range of a matrix and the kernel of a matrix. Both are defined as subsets of a vector space, and it turns out that they are actually subspaces.
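In formulas (the notation in the video may differ slightly), for an $ m \times n $ matrix $ A $:

$$ \mathrm{Ran}(A) = \{ A x \mid x \in \mathbb{R}^n \} \subseteq \mathbb{R}^m, \qquad \mathrm{Ker}(A) = \{ x \in \mathbb{R}^n \mid A x = 0 \} \subseteq \mathbb{R}^n. $$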

Part 35 - Rank-Nullity Theorem
Let’s immediately use the definitions from above in order to formulate a key property of linear maps and matrices: the rank-nullity theorem. In addition, we will also be able to prove this fact.
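For an $ m \times n $ matrix $ A $, the theorem states that the dimensions of the range and the kernel add up to the number of columns:

$$ \dim \mathrm{Ran}(A) + \dim \mathrm{Ker}(A) = n. $$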

Part 36 - Solving Systems of Linear Equations (Introduction)
Now we go back to our motivation for doing linear algebra: we want to solve systems of linear equations. This video is an introduction to this huge topic. After getting a rough idea of how the solving process works, we will go into more detail in later videos.

Part 37 - Row Operations
What we need to solve systems of linear equations are so-called row operations. These can be described by using invertible matrices that are multiplied from the left-hand side.

Part 38 - Set of Solutions
Here, we will define the set of solutions for a system of linear equations. It’s denoted by $ \mathcal{S} $.

Part 39 - Gaussian Elimination
This is one of the most important topics in this course. It is not theoretically complicated, but the applications are everywhere. Gaussian elimination is needed whenever you want to solve a system of linear equations in an algorithmic way.
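If you like to experiment on the computer alongside the videos, here is a minimal Python sketch of the idea (forward elimination with partial pivoting, followed by back substitution). It is only an illustration for square, invertible matrices and not the exact procedure or notation used in the videos:

```python
import numpy as np

def solve_by_elimination(A, b):
    """Solve A x = b for a square, invertible matrix A via Gaussian elimination."""
    A = np.array(A, dtype=float)
    b = np.array(b, dtype=float)
    n = len(b)

    # Forward elimination: bring the system into row echelon form.
    for k in range(n - 1):
        # Partial pivoting: move the row with the largest pivot candidate up.
        p = k + np.argmax(np.abs(A[k:, k]))
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
        for i in range(k + 1, n):
            factor = A[i, k] / A[k, k]
            A[i, k:] -= factor * A[k, k:]
            b[i] -= factor * b[k]

    # Back substitution on the triangular system.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

print(solve_by_elimination([[2.0, 1.0], [1.0, -1.0]], [5.0, 1.0]))  # approx. [2. 1.]
```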

Part 40 - Row Echelon Form
Now, we are ready to define the important end result of Gaussian elimination. It generalises a triangular form we have seen in previous videos. It’s called row echelon form because its structure looks like a staircase.

Part 41 - Solvability of a System
Let’s discuss our result more abstractly now. We can always transform a system of linear equations into row echelon form. What does this tell us about the solvability of the system? It turns out that we can formulate some nice equivalent statements there.

Part 42 - Uniqueness of Solutions
The upcoming video will show in which cases we have a unique solution for a system of linear equations. This will also be the last video about Gaussian elimination for now.

Part 43 - Determinant (Overview)
In the next videos, we will talk a lot about so-called determinants. First we will motivate them:

Part 44 - Determinant in 2 Dimensions
In this video, we will see why a determinant makes sense for solving systems of linear equations and how we can calculate this determinant in two dimensions.
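For a general $ 2 \times 2 $ matrix, the determinant is the number

$$ \det \begin{pmatrix} a & b \\ c & d \end{pmatrix} = a d - b c, $$

and the corresponding system has a unique solution for every right-hand side exactly when this number is non-zero.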

Part 45 - Determinant is a Volume Measure
Here, we introduce the notion of a volume measure in arbitrary dimensions. Please note that in 2 dimensions, this coincides with the usual area function we already discussed in the last video. Hence, we already know which rules a general volume function should fulfil. It turns out that these rules already determine the volume measure.
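Writing $ \mathrm{vol} $ for such a volume function (my notation here, just for illustration; the video makes the rules precise), they are, roughly:

$$ \mathrm{vol}(I) = 1, \qquad \mathrm{vol} \ \text{is linear in each column}, \qquad \mathrm{vol}(A) = 0 \ \text{ whenever two columns of } A \text{ coincide}. $$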

Part 46 - Leibniz Formula for Determinants
From the last video we can conclude a nice formula for the volume measure and, therefore, also for the determinant.
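For reference, the Leibniz formula for an $ n \times n $ matrix $ A = (a_{ij}) $ reads

$$ \det(A) = \sum_{\sigma \in S_n} \mathrm{sgn}(\sigma) \prod_{i=1}^{n} a_{i, \sigma(i)}, $$

where the sum runs over all permutations $ \sigma $ of $ \{1, \ldots, n\} $ and $ \mathrm{sgn}(\sigma) $ denotes the sign of the permutation.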
