Jorge Valle

Jorge Valle is a front end developer with a particular passion for, and expertise in, JavaScript and user interfaces. Lately, he's also been diving into machine learning.

Calculating the Jacobian of a multivariable function

In an effort to better understand the mathematics behind deep learning algorithms, I have been studying the Jacobian matrix and its role in multivariable calculus.

The Jacobian of a multivariable function is the matrix of its partial derivatives: each row corresponds to one output of the function, and each entry in that row is the partial derivative of that output with respect to one of the input variables. For a scalar-valued function like the one below, the Jacobian has a single row.

To calculate the Jacobian, then, we differentiate the function with respect to each variable in turn. Let's take the following function as an example.

$$ f(x, y, z) = x^{3} - 2y + \sin({z}) $$
Figure 1: A multivariable function.
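
Before differentiating, here is a minimal sketch of this function in Python, just to make the inputs and output concrete. The function name `f` and the sample point are my own arbitrary choices, not part of the derivation.

```python
import math

# f(x, y, z) = x^3 - 2y + sin(z), evaluated numerically at a point.
def f(x, y, z):
    return x ** 3 - 2 * y + math.sin(z)

# Arbitrary sample point: f(1, 2, 0) = 1 - 4 + 0 = -3.
print(f(1.0, 2.0, 0.0))  # -3.0
```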

We want to find its Jacobian.

First, we differentiate with respect to $x$. The $-2y$ and $\sin(z)$ terms are constant with respect to $x$ and vanish; for the $x^{3}$ term, the power rule tells us to bring the exponent down as a coefficient and reduce it by $1$.

$$ \frac{\partial f}{\partial x} = 3x^{2} $$
Figure 2: Differentiating with respect to x.

Next, we differentiate with respect to $y$. The $x^{3}$ and $\sin(z)$ terms are constant with respect to $y$ and vanish, while $-2y$ differentiates to the constant $-2$.

$$ \frac{\partial f}{\partial y} = -2 $$
Figure 3: Differentiating with respect to y.

Lastly, we differentiate with respect to $z$. The $x^{3}$ and $-2y$ terms vanish, and $\sin(z)$ differentiates to $\cos(z)$.

$$ \frac{\partial f}{\partial z} = \cos({z}) $$
Figure 4: Differentiating with respect to z.
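
As a sanity check on the three partial derivatives above, a symbolic computation with SymPy should reproduce them. This is just a verification sketch using SymPy's `diff`, not part of the original working.

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
f = x**3 - 2*y + sp.sin(z)

# Differentiate with respect to each variable in turn.
print(sp.diff(f, x))  # 3*x**2
print(sp.diff(f, y))  # -2
print(sp.diff(f, z))  # cos(z)
```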

Once we have all the partial derivatives, we assemble them into the Jacobian, one partial derivative per entry. Because our function is scalar-valued, we end up with a single row, a $1 \times 3$ matrix, so we can also refer to it as the Jacobian row vector of our original multivariable function.

$$ J = \begin{bmatrix} 3x^{2} & -2 & \cos(z) \end{bmatrix} $$
Figure 5: Building the Jacobian from all partial derivatives.
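
SymPy can also assemble the whole Jacobian in one step with `Matrix.jacobian`. The sketch below reuses the symbols from the previous snippet and evaluates the result at an arbitrary point of my own choosing.

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
F = sp.Matrix([x**3 - 2*y + sp.sin(z)])

# Jacobian of the scalar-valued function with respect to (x, y, z).
J = F.jacobian([x, y, z])
print(J)  # Matrix([[3*x**2, -2, cos(z)]])

# Numeric evaluation at an arbitrary point, e.g. (x, y, z) = (1, 0, 0).
print(J.subs({x: 1, y: 0, z: 0}))  # Matrix([[3, -2, 1]])
```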

Books, patrons and coffee in Caffè San Marco, Trieste