Suppose

$f(x) = g(x)^T h(x)$.

That is, $f$ is a scalar function of a vector $x$, given by taking the inner product of the two vector-valued functions $g$ and $h$. Now, we would like the gradient of $f$, i.e. $\nabla f(x)$. What is it?

I frequently need to find such derivatives, and I have never been able to find any reference for rules to calculate them. (Though such rules surely exist somewhere!) Today, Alap and I derived a couple simple rules. The first answers the above question.

**Rule 1.**

If $f(x) = g(x)^T h(x)$, then

$\nabla f(x) = \nabla g(x)^T h(x) + \nabla h(x)^T g(x)$.

Here, $\nabla g(x)$ is the Jacobian, i.e.

$(\nabla g(x))_{ij} = \partial g_i(x) / \partial x_j$.

This rule is a generalization of the calculus 101 product rule, where $f(x) = g(x) h(x)$ (everything scalar), and $f'(x) = g'(x) h(x) + h'(x) g(x)$.
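Rule 1 is easy to sanity-check numerically. Below is a sketch using NumPy and central finite differences; the particular functions `g` and `h` are arbitrary choices, not anything from the derivation above.

```python
import numpy as np

def g(x):
    """Example vector-valued function, R^3 -> R^2 (arbitrary choice)."""
    return np.array([x[0] * x[1], np.sin(x[2])])

def h(x):
    """Another example vector-valued function, R^3 -> R^2."""
    return np.array([x[0] + x[2], x[1] ** 2])

def jacobian(fun, x, eps=1e-6):
    """Central-difference Jacobian: J[i, j] = d fun_i / d x_j."""
    J = np.zeros((fun(x).size, x.size))
    for j in range(x.size):
        d = np.zeros_like(x)
        d[j] = eps
        J[:, j] = (fun(x + d) - fun(x - d)) / (2 * eps)
    return J

x = np.array([0.3, -1.2, 0.8])

# Rule 1: for f(x) = g(x)^T h(x), grad f = (Jac g)^T h + (Jac h)^T g.
analytic = jacobian(g, x).T @ h(x) + jacobian(h, x).T @ g(x)
numeric = jacobian(lambda y: np.array([g(y) @ h(y)]), x).ravel()
print(np.allclose(analytic, numeric, atol=1e-5))  # True
```

The two sides agree to finite-difference accuracy at any test point, which is cheap reassurance that no transpose is misplaced.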

A different rule concerning exponentials is

**Rule 2.**

If $f(x) = \exp(g(x))$, where $\exp$ is applied element-wise, then

$\nabla f(x) = (\exp(g(x)) \mathbf{1}^T) \odot \nabla g(x)$.

Here, $\odot$ is the element-wise product. The strange product with $\mathbf{1}^T$ can be understood as "copying" the first vector. That is, $v \mathbf{1}^T$ is a matrix where each column consists of $v$. (The number of columns must be understood from context.)
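Rule 2 can be checked the same way; `np.outer(v, np.ones(n))` builds the "copied" matrix $v \mathbf{1}^T$ explicitly. As before, `g` is just an arbitrary example function.

```python
import numpy as np

def g(x):
    """Example vector-valued function, R^3 -> R^2 (arbitrary choice)."""
    return np.array([x[0] * x[1], np.sin(x[2])])

def jacobian(fun, x, eps=1e-6):
    """Central-difference Jacobian: J[i, j] = d fun_i / d x_j."""
    J = np.zeros((fun(x).size, x.size))
    for j in range(x.size):
        d = np.zeros_like(x)
        d[j] = eps
        J[:, j] = (fun(x + d) - fun(x - d)) / (2 * eps)
    return J

x = np.array([0.3, -1.2, 0.8])

# Rule 2: Jacobian of exp(g(x)) is (exp(g) 1^T) ⊙ (Jac g).
# np.outer(v, ones) is the matrix whose every column is v.
analytic = np.outer(np.exp(g(x)), np.ones(x.size)) * jacobian(g, x)
numeric = jacobian(lambda y: np.exp(g(y)), x)
print(np.allclose(analytic, numeric, atol=1e-5))  # True
```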

**Rule 3.**

If $f(x) = s(g(x))$, where $s$ is a scalar function applied element-wise, then

$\nabla f(x) = (s'(g(x)) \mathbf{1}^T) \odot \nabla g(x)$.
Surely there exists a large set of rules like this somewhere, but I have not been able to find them, despite needing things like this for years now. What I would really like is to use a computer algebra system, such as Maple, Mathematica, or Sage, to do this, but that doesn't seem possible at present. (It is suggested in that thread that Mathematica is up to the job, but I have tried it out and found that not to be true.)
