# Multivariable Calculus

Here you can find my whole video series about Multivariable Calculus in the correct order, together with some accompanying text around the videos. If you have any questions, feel free to contact me and ask anything. Without further ado, let’s start:

#### Part 1 - Introduction

Multivariable Calculus is a video series I started for everyone who is interested in learning how to deal with partial derivatives, directional derivatives, and total derivatives. We discuss some important theorems like Taylor’s theorem and the Implicit Function Theorem. But first, let us start with a quick overview:

###### Content of the video:

00:00 Intro
00:39 Prerequisites
02:15 Applications of the course
02:58 Content of the course
04:20 Credits

With this overview you now know the topics that we will discuss in this series. Some important bullet points are partial derivatives, directional derivatives, total derivatives, and Lagrange multipliers. In order to describe these things, we need to generalise a lot of concepts from one variable to several variables. In the next video, let us discuss continuity.

#### Part 2 - Continuity

In the next part, we will extend the definition of continuity to functions of the form $f: \mathbb{R}^n \rightarrow \mathbb{R}^m$. For this we will need a notion of distance in these higher-dimensional spaces. Therefore, we will define the so-called Euclidean distance.

###### Content of the video:

0:00 Intro
0:25 Continuous Functions
1:45 Continuity via sequences
2:39 Measuring distance in ℝⁿ
5:31 Convergent sequences in ℝⁿ
8:20 (Non-trivial) Link between the single-variable convergence definition and the new definition
10:33 Multivariable continuity
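To accompany the definitions above, here is a small Python sketch of the Euclidean distance and of convergence in $\mathbb{R}^n$ (my own illustration, not code from the video; the helper name `euclidean_distance` is my choice):

```python
import numpy as np

# Euclidean distance in R^n: d(x, y) = sqrt(sum_i (x_i - y_i)^2)
def euclidean_distance(x, y):
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return float(np.sqrt(np.sum((x - y) ** 2)))

# A sequence in R^2 converging to (0, 0): x_k = (1/k, 1/k).
# Convergence in R^n means: the Euclidean distance to the limit tends to 0.
distances = [euclidean_distance((1 / k, 1 / k), (0, 0)) for k in range(1, 6)]
```

For instance, `euclidean_distance((3, 0), (0, 4))` recovers the familiar value 5 from the Pythagorean theorem, and the list `distances` is strictly decreasing towards 0.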

#### Part 3 - Examples of Continuous Functions

In the next part, we will look at some examples. Mostly, we show that continuity can fail at a point even if some sequences suggest continuity there. Click on the button to find the Python code we used to plot the 3D graphs.
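To illustrate this phenomenon in Python, here is the standard example $f(x,y) = \frac{xy}{x^2 + y^2}$ with $f(0,0) := 0$ (my own sketch; the video may use different examples). Along the coordinate axes the function values tend to $0$, but along the diagonal they are constantly $\tfrac{1}{2}$, so $f$ is not continuous at the origin:

```python
# Classic example: f(x, y) = x*y / (x^2 + y^2) with f(0, 0) := 0.
def f(x, y):
    if x == 0 and y == 0:
        return 0.0
    return x * y / (x ** 2 + y ** 2)

# Along the x-axis, the values along the sequence (1/k, 0) are all 0,
# matching f(0, 0) = 0 and suggesting continuity ...
axis_values = [f(1 / k, 0) for k in range(1, 6)]

# ... but along the diagonal sequence (1/k, 1/k) the values are
# constantly 1/2, so f is NOT continuous at the origin.
diag_values = [f(1 / k, 1 / k) for k in range(1, 6)]
```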

#### Part 5 - Total Derivative

In the next video, we generalise the notion of a linear approximation as we know it for functions $f: \mathbb{R} \rightarrow \mathbb{R}$ to multivariable functions $f: \mathbb{R}^n \rightarrow \mathbb{R}^m$. This leads to the notion of the total derivative.
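One can check this linear-approximation property numerically. Below is a small Python sketch (my own example function, not from the video): for $f: \mathbb{R}^2 \rightarrow \mathbb{R}^2$ with Jacobian matrix $J$ at a point $a$, the approximation error $\lVert f(a+h) - f(a) - Jh \rVert$ vanishes faster than $\lVert h \rVert$:

```python
import numpy as np

# Example f: R^2 -> R^2, f(x, y) = (x^2 * y, sin(x) + y)
def f(v):
    x, y = v
    return np.array([x ** 2 * y, np.sin(x) + y])

# The Jacobian matrix (representing the total derivative) at (x, y):
def jacobian(v):
    x, y = v
    return np.array([[2 * x * y, x ** 2],
                     [np.cos(x), 1.0]])

a = np.array([1.0, 2.0])
J = jacobian(a)

# Total differentiability means f(a + h) = f(a) + J @ h + o(|h|):
errors = []
for t in [1e-2, 1e-3, 1e-4]:
    h = t * np.array([1.0, -1.0])
    errors.append(float(np.linalg.norm(f(a + h) - f(a) - J @ h)))
# The errors shrink much faster than |h| does.
```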

#### Part 6 - Partially vs. Totally Differentiable Functions

Next, we look at the difference between the terms partially differentiable and totally differentiable.

#### Part 7 - Chain, Sum and Factor rule

A lot of calculation rules from the one-dimensional case can be translated to the multivariable setting. In fact, the sum rule and the factor rule look exactly the same. The chain rule, however, has to be modified a little, but it remains one of the most important rules in calculations.
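The modified chain rule can be verified numerically. In this Python sketch (my own example), for $g: \mathbb{R} \rightarrow \mathbb{R}^2$ and $f: \mathbb{R}^2 \rightarrow \mathbb{R}$ one has $(f \circ g)'(t) = \nabla f(g(t)) \cdot g'(t)$, which we compare against a finite difference:

```python
import numpy as np

# Inner function g: R -> R^2 (a curve) and its derivative:
def g(t):
    return np.array([np.cos(t), np.sin(t)])

def g_prime(t):
    return np.array([-np.sin(t), np.cos(t)])

# Outer function f: R^2 -> R and its gradient:
def f(v):
    x, y = v
    return x ** 2 + 3 * y

def grad_f(v):
    x, y = v
    return np.array([2 * x, 3.0])

t0 = 0.7
# Multivariable chain rule: (f o g)'(t0) = grad f(g(t0)) . g'(t0)
chain_rule_value = grad_f(g(t0)) @ g_prime(t0)

# Compare with a central finite difference of f o g:
h = 1e-6
numeric = (f(g(t0 + h)) - f(g(t0 - h))) / (2 * h)
# The two values agree up to numerical error.
```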

In the next video, we will introduce the so-called gradient for functions $f: \mathbb{R}^n \rightarrow \mathbb{R}$.

#### Part 9 - Geometric Picture for the Gradient

Let’s continue the discussion from the last video by visualising the gradient for functions on $\mathbb{R}^2$. This geometric view of the gradient can be very helpful.

#### Part 10 - Directional Derivative

Now, we extend our inventory of different derivatives. We define the directional derivative for functions $f: \mathbb{R}^n \rightarrow \mathbb{R}$ and vectors $\mathbf{v} \in \mathbb{R}^n$.
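For totally differentiable functions, the directional derivative can be computed from the gradient as $D_{\mathbf{v}} f(a) = \nabla f(a) \cdot \mathbf{v}$. Here is a small Python check of this identity against the limit definition (my own example function):

```python
import numpy as np

# Example f: R^2 -> R with a known gradient:
def f(v):
    x, y = v
    return x ** 2 * y + y ** 3

def grad_f(v):
    x, y = v
    return np.array([2 * x * y, x ** 2 + 3 * y ** 2])

a = np.array([1.0, 2.0])
v = np.array([3.0, 4.0]) / 5.0  # a unit vector as direction

# Directional derivative via the limit definition (central difference):
h = 1e-6
numeric = (f(a + h * v) - f(a - h * v)) / (2 * h)

# For totally differentiable f: D_v f(a) = grad f(a) . v
exact = grad_f(a) @ v
```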

#### Part 11 - Gradient is Fastest Increase

You will also hear that the gradient gives the direction of the fastest increase. In the following video, we will prove this important fact.
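Numerically, this fact can be made visible by sampling many unit directions and comparing the directional derivatives $\nabla f(a) \cdot \mathbf{v}$ (a Python sketch with my own example function, not the proof from the video):

```python
import numpy as np

# Gradient of f(x, y) = x^2 + 2*y^2:
def grad_f(v):
    x, y = v
    return np.array([2 * x, 4 * y])

a = np.array([1.0, 1.0])
g = grad_f(a)

# For a unit vector v, the directional derivative is grad f(a) . v.
# Sample many unit directions on the circle:
angles = np.linspace(0, 2 * np.pi, 3600, endpoint=False)
directions = np.stack([np.cos(angles), np.sin(angles)], axis=1)
values = directions @ g

# The maximum is attained (approximately) in the gradient direction,
# and the maximal value is the norm of the gradient.
best = directions[np.argmax(values)]
```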

#### Part 12 - Second Order Partial Derivatives

In the next video, we finally introduce higher-order partial derivatives.

#### Part 13 - Schwarz’s Theorem

After defining the second-order derivatives, we now can show that the order for calculating the partial derivatives does not matter under some mild assumptions. This is also known as Schwarz’s Theorem, named after the German mathematician Hermann Amandus Schwarz.
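As a quick numerical illustration (my own example; the video gives the actual proof), for a twice continuously differentiable function the two mixed partials agree:

```python
import numpy as np

# f(x, y) = exp(x*y) + x^3 * y^2 is twice continuously differentiable,
# so Schwarz's theorem applies. Its first-order partials are:
def df_dx(x, y):
    return y * np.exp(x * y) + 3 * x ** 2 * y ** 2

def df_dy(x, y):
    return x * np.exp(x * y) + 2 * x ** 3 * y

x, y, h = 0.8, -0.5, 1e-6

# d/dy (df/dx) via a central difference ...
f_xy = (df_dx(x, y + h) - df_dx(x, y - h)) / (2 * h)

# ... and d/dx (df/dy):
f_yx = (df_dy(x + h, y) - df_dy(x - h, y)) / (2 * h)

# Schwarz's theorem: f_xy == f_yx (up to numerical error).
```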

#### Part 14 - Vector Fields and Potential Functions

As an application of Schwarz’s theorem, we can look at so-called potential functions for vector fields. It turns out that there is a necessary condition for the existence of such functions.
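In $\mathbb{R}^2$, this necessary condition reads $\frac{\partial F_1}{\partial y} = \frac{\partial F_2}{\partial x}$, which follows from Schwarz’s theorem applied to a potential function. A small Python sketch with my own examples:

```python
# F(x, y) = (2*x*y, x^2) has the potential phi(x, y) = x^2 * y,
# since grad(phi) = (2*x*y, x^2). By Schwarz's theorem, a necessary
# condition for a potential to exist is dF1/dy == dF2/dx.
def F1(x, y): return 2 * x * y
def F2(x, y): return x ** 2

# G(x, y) = (-y, x) violates the condition:
# dG1/dy = -1 but dG2/dx = 1, so no potential function exists.
def G1(x, y): return -y
def G2(x, y): return x

# Check the condition numerically via central differences:
def mixed_condition(P, Q, x, y, h=1e-6):
    dP_dy = (P(x, y + h) - P(x, y - h)) / (2 * h)
    dQ_dx = (Q(x + h, y) - Q(x - h, y)) / (2 * h)
    return dP_dy, dQ_dx
```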

#### Part 15 - Multi-Index Notation

For later use in a lot of formulas, we will need a more compact notation for partial derivatives. A very useful possibility is the so-called multi-index notation. There one just uses tuples of non-negative integers together with some convenient interpretations of symbols concerning multi-indices.
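The standard conventions for a multi-index $\alpha = (\alpha_1, \dots, \alpha_n)$ are easy to spell out in Python (my own sketch; the function names are my choices):

```python
from math import factorial

# A multi-index is a tuple alpha = (a_1, ..., a_n) of non-negative integers.

def order(alpha):
    """The order |alpha| = a_1 + ... + a_n."""
    return sum(alpha)

def mi_factorial(alpha):
    """The multi-index factorial alpha! = a_1! * ... * a_n!."""
    prod = 1
    for a in alpha:
        prod *= factorial(a)
    return prod

def mi_power(x, alpha):
    """The monomial x^alpha = x_1^a_1 * ... * x_n^a_n."""
    prod = 1.0
    for xi, a in zip(x, alpha):
        prod *= xi ** a
    return prod
```

For example, for $\alpha = (2, 0, 1)$ we get $|\alpha| = 3$ and $\alpha! = 2$, and for $x = (2, 3)$, $\alpha = (1, 2)$ we get $x^\alpha = 2 \cdot 3^2 = 18$.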

#### Part 16 - Taylor’s Theorem

As a generalisation of the one-dimensional case, we can form polynomial approximations of differentiable functions. This is known as Taylor’s theorem.

#### Part 17 - Taylor’s Theorem - Examples

Now, we can also look at some examples for Taylor polynomials. We also discuss the second-order Taylor polynomial in more detail. In order to do this, we will introduce the so-called Hessian matrix, which includes the second-order partial derivatives.
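As an illustration of the second-order Taylor polynomial $T_2(a+h) = f(a) + \nabla f(a) \cdot h + \tfrac{1}{2} h^T H_f(a)\, h$ with the Hessian matrix $H_f$, here is a Python sketch with my own example function (not necessarily one from the video):

```python
import numpy as np

# Example f(x, y) = exp(x) * cos(y) with gradient and Hessian matrix:
def f(v):
    x, y = v
    return np.exp(x) * np.cos(y)

def grad(v):
    x, y = v
    return np.array([np.exp(x) * np.cos(y), -np.exp(x) * np.sin(y)])

def hessian(v):
    x, y = v
    return np.array([[np.exp(x) * np.cos(y), -np.exp(x) * np.sin(y)],
                     [-np.exp(x) * np.sin(y), -np.exp(x) * np.cos(y)]])

a = np.array([0.0, 0.0])

# Second-order Taylor polynomial around a:
# T2(a + h) = f(a) + grad f(a) . h + 0.5 * h^T H(a) h
def T2(h):
    return f(a) + grad(a) @ h + 0.5 * h @ hessian(a) @ h

h = np.array([0.1, -0.2])
error = abs(f(a + h) - T2(h))  # small: of third order in |h|
```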

#### Part 18 - Local Extrema

Similarly to the definition of local extrema for functions $f : \mathbb{R} \rightarrow \mathbb{R}$, we can define them for functions $f : \mathbb{R}^n \rightarrow \mathbb{R}$ as well. It turns out that necessary and sufficient conditions also look the same. We just have to find the correct substitutions and translations to do it.

#### Part 19 - Examples for Local Extrema

Let’s make everything more concrete and look at some particular examples.

#### Part 20 - Sylvester’s Criterion

If you know some linear algebra, Sylvester’s Criterion is the way to go when checking for local extrema.

###### Content of the video:

00:00 Intro
00:54 Assumptions of Sylvester’s Criterion
02:07 Sylvester’s Criterion for positive definite matrices
03:57 Sylvester’s Criterion for negative definite matrices
04:45 Proof for diagonal matrices
06:34 Example calculation
08:57 Credits
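The criterion from the video translates directly into code: a symmetric matrix is positive definite if and only if all its leading principal minors are positive, and negative definite if and only if they alternate in sign starting with a negative one. A small Python sketch (my own helper names):

```python
import numpy as np

def leading_minors(A):
    """The determinants of the upper-left k x k submatrices of A."""
    A = np.asarray(A, dtype=float)
    return [float(np.linalg.det(A[:k, :k])) for k in range(1, A.shape[0] + 1)]

def is_positive_definite(A):
    # Sylvester: all leading principal minors are positive.
    return all(m > 0 for m in leading_minors(A))

def is_negative_definite(A):
    # Sylvester: the minors alternate in sign, starting negative: -, +, -, ...
    return all((m > 0) if k % 2 == 0 else (m < 0)
               for k, m in enumerate(leading_minors(A), start=1))
```

For example, $\begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ has minors $2, 3$ and is positive definite, while $\begin{pmatrix} -2 & 1 \\ 1 & -2 \end{pmatrix}$ has minors $-2, 3$ and is negative definite.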