*Here you can find my whole video series about Multivariable Calculus in the correct order, and I also help you with some text around the videos. If you have any questions, you can contact me and ask anything. However, without further ado, let's start:*

### Introduction

**Multivariable Calculus** is a video series I started for everyone who is interested in learning how to deal with partial derivatives, directional derivatives, and total derivatives. We discuss some important theorems like Taylor’s theorem and the Implicit Function Theorem. However, let us start with a quick overview:

With this, you now know the topics that we will discuss in this series. Some important bullet points are **partial derivatives**, **directional derivatives**, **total derivatives**, and **Lagrange multipliers**. In order to describe these things, we need to generalise a lot of concepts from one variable to several variables. Now, in the next video, let us discuss **continuity**.

### Let’s get started

In the next part, we will extend the definition of **continuity** to functions of the form $ f: \mathbb{R}^n \rightarrow \mathbb{R}^m $. For this we will need a notion of distance in these higher-dimensional spaces. Therefore, we will define the so-called **Euclidean distance**.
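Since the Euclidean distance is given by an explicit formula, it is easy to compute directly. Here is a small Python sketch (my own illustration, not part of the course materials; the name `euclidean_distance` is my choice):

```python
import math

def euclidean_distance(x, y):
    """Euclidean distance between two points x, y in R^n:
    d(x, y) = sqrt((x_1 - y_1)^2 + ... + (x_n - y_n)^2)."""
    return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))

# the points (0, 0) and (3, 4) have distance 5 (a 3-4-5 triangle)
print(euclidean_distance((0, 0), (3, 4)))
```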

In the next part, we will look at some **examples**. In particular, we show that continuity can fail at a point even if some sequences suggest continuity there. Click here to find the Python code we used to plot the 3D graphs.
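A classic example of this phenomenon (my illustration here, not necessarily the one from the video) is $ f(x,y) = \frac{xy}{x^2+y^2} $ with $ f(0,0) = 0 $: approaching the origin along the coordinate axes suggests continuity, but along the diagonal the values stay at $ \tfrac{1}{2} $.

```python
def f(x, y):
    # f(0, 0) := 0, elsewhere f(x, y) = x*y / (x^2 + y^2)
    if x == 0 and y == 0:
        return 0.0
    return x * y / (x ** 2 + y ** 2)

# along the x-axis the values converge to f(0, 0) = 0 ...
axis_values = [f(t, 0.0) for t in (0.1, 0.01, 0.001)]

# ... but along the diagonal (t, t) they stay at 1/2,
# so f is not continuous at the origin
diagonal_values = [f(t, t) for t in (0.1, 0.01, 0.001)]
```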

Now, we are finally ready to talk about derivatives in several variables. We start with the notion of **partial derivatives**.
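A partial derivative is just an ordinary derivative in one coordinate while the other coordinates are frozen, so it can be approximated by a difference quotient. The following sketch (the helper name `partial_derivative` is my own) illustrates the idea:

```python
def partial_derivative(f, point, i, h=1e-6):
    # approximate the partial derivative of f with respect to the
    # i-th variable at the given point by a central difference quotient
    forward = list(point); forward[i] += h
    backward = list(point); backward[i] -= h
    return (f(*forward) - f(*backward)) / (2 * h)

g = lambda x, y: x ** 2 * y          # by hand: dg/dx = 2xy, dg/dy = x^2
dgdx = partial_derivative(g, (1.0, 2.0), 0)   # approximately 2*1*2 = 4
dgdy = partial_derivative(g, (1.0, 2.0), 1)   # approximately 1^2 = 1
```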

In the next video, we generalise the notion of a linear approximation as we know it for functions $ f: \mathbb{R} \rightarrow \mathbb{R} $ to multivariable functions $ f: \mathbb{R}^n \rightarrow \mathbb{R}^m $. This leads to the notion of the **total derivative**.
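To make "linear approximation" concrete, the following sketch (my own example map, not from the video) checks numerically that $ f(a + h) \approx f(a) + J_f(a)\,h $, where $ J_f(a) $ is the Jacobian matrix, with an error much smaller than $ \|h\| $:

```python
import math

def f(x, y):
    # example map f: R^2 -> R^2
    return (x ** 2 + y, x * y)

def jacobian_apply(h1, h2):
    # Jacobian of f is [[2x, 1], [y, x]]; at a = (1, 3) it is [[2, 1], [3, 1]]
    return (2.0 * h1 + 1.0 * h2, 3.0 * h1 + 1.0 * h2)

h = (1e-3, 1e-3)
exact = tuple(fb - fa for fb, fa in zip(f(1 + h[0], 3 + h[1]), f(1, 3)))
linear = jacobian_apply(*h)
error = math.sqrt(sum((e - l) ** 2 for e, l in zip(exact, linear)))
norm_h = math.sqrt(h[0] ** 2 + h[1] ** 2)
# error / norm_h is tiny: the error vanishes faster than |h| itself
```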

Next, we look at the difference between the terms **partially** differentiable and **totally** differentiable.

A lot of calculation rules from the one-dimensional case can be translated to this multivariable setting. In fact, the **sum rule** and the **factor rule** look exactly the same. However, the **chain rule** has to be modified a little, but it remains one of the most important rules for calculations.
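As a small illustration of the modified chain rule (my own example, with my own function names): for $ h(t) = f(x(t), y(t)) $ one gets $ h'(t) = \frac{\partial f}{\partial x}\,x'(t) + \frac{\partial f}{\partial y}\,y'(t) $, which we can verify numerically for $ f(x,y) = x^2 y $, $ x(t) = \cos t $, $ y(t) = \sin t $:

```python
import math

def h(t):
    # composition h(t) = f(cos(t), sin(t)) with f(x, y) = x^2 * y
    return math.cos(t) ** 2 * math.sin(t)

def h_prime_chain(t):
    # multivariable chain rule: h'(t) = f_x * x'(t) + f_y * y'(t)
    x, y = math.cos(t), math.sin(t)
    fx, fy = 2 * x * y, x ** 2          # partial derivatives of f
    dx, dy = -math.sin(t), math.cos(t)  # derivatives of the inner functions
    return fx * dx + fy * dy

# compare with a direct difference quotient of h at t = 0.5
eps = 1e-6
numeric = (h(0.5 + eps) - h(0.5 - eps)) / (2 * eps)
```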

In the next video, we will introduce the so-called **gradient** for functions $ f: \mathbb{R}^n \rightarrow \mathbb{R} $.
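As a short reminder of the standard definition: the gradient simply collects all first-order partial derivatives into one vector,

$$ \nabla f(x) = \begin{pmatrix} \frac{\partial f}{\partial x_1}(x) \\ \vdots \\ \frac{\partial f}{\partial x_n}(x) \end{pmatrix}. $$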

Let’s continue the discussion from the last video by visualising the gradient for functions in $ \mathbb{R}^2 $. This is a **geometric view of the gradient**, which can be very helpful.

Now, we extend our inventory of different derivatives. We define the **directional derivative** for functions $ f: \mathbb{R}^n \rightarrow \mathbb{R} $ and vectors $ \mathbf{v} \in \mathbb{R}^n$.
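The directional derivative can also be checked numerically. This sketch (function names are my own) approximates $ D_{\mathbf{v}} f $ by a difference quotient along $ \mathbf{v} $; for totally differentiable $ f $ this agrees with the formula $ \nabla f \cdot \mathbf{v} $:

```python
def directional_derivative(f, point, v, h=1e-6):
    # central difference quotient of f at the point along the direction v
    fwd = [p + h * vi for p, vi in zip(point, v)]
    bwd = [p - h * vi for p, vi in zip(point, v)]
    return (f(*fwd) - f(*bwd)) / (2 * h)

g = lambda x, y: x ** 2 + 3 * y   # gradient at (1, 2) is (2, 3)

# v = (1, 0) recovers the partial derivative in x: 2
d1 = directional_derivative(g, (1.0, 2.0), (1.0, 0.0))

# for v = (0.6, 0.8): gradient . v = 2*0.6 + 3*0.8 = 3.6
d2 = directional_derivative(g, (1.0, 2.0), (0.6, 0.8))
```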

You will also hear that the gradient gives the **direction of the fastest increase**. In the following video, we will prove this important fact.
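Before the proof, you can convince yourself numerically (a sketch with my own example function): scanning all unit directions, the largest directional derivative occurs in the direction of the normalised gradient.

```python
import math

def f(x, y):
    # example function; its gradient at (1, 2) is (2x, 1) = (2, 1)
    return x ** 2 + y

def dir_deriv(v, h=1e-6):
    # central difference quotient of f at (1, 2) along the unit vector v
    return (f(1 + h * v[0], 2 + h * v[1])
            - f(1 - h * v[0], 2 - h * v[1])) / (2 * h)

# scan unit vectors in 360 directions and pick the steepest one
angles = [k * 2 * math.pi / 360 for k in range(360)]
best = max(((math.cos(a), math.sin(a)) for a in angles), key=dir_deriv)

# best is (up to the angular grid) the normalised gradient (2, 1)/sqrt(5)
```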

In the next video, we finally introduce **higher-order partial derivatives**.

After defining the second-order derivatives, we can now show that the order of calculating the partial derivatives does not matter under some mild assumptions. This is also known as **Schwarz’s Theorem**, named after the German mathematician Hermann Amandus Schwarz.
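For a smooth example function, you can see Schwarz's Theorem at work numerically (my own illustration; the helper names are my choice): differentiating first in $ x $ and then in $ y $ gives the same result as the other order.

```python
def f(x, y):
    # smooth example function; by hand: f_xy = f_yx = 6*x^2*y + 1
    return x ** 3 * y ** 2 + x * y

def partial_x(g, x, y, h=1e-4):
    # central difference quotient in the x-variable
    return (g(x + h, y) - g(x - h, y)) / (2 * h)

def partial_y(g, x, y, h=1e-4):
    # central difference quotient in the y-variable
    return (g(x, y + h) - g(x, y - h)) / (2 * h)

# first x, then y -- and the other way around -- at the point (1, 2)
f_xy = partial_y(lambda x, y: partial_x(f, x, y), 1.0, 2.0)
f_yx = partial_x(lambda x, y: partial_y(f, x, y), 1.0, 2.0)
# both approximate 6*1*2 + 1 = 13, as Schwarz's Theorem predicts
```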