The Arithmetic Mean – Geometric Mean inequality says that if you have nonnegative numbers $x_1, x_2, \ldots, x_n$, then
$$\sqrt[n]{x_1 x_2 \cdots x_n} \le \frac{x_1 + x_2 + \cdots + x_n}{n}.$$
Another way to say this is that, given a set of nonnegative numbers, their geometric mean is always less than or equal to their arithmetic mean.
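As a quick sanity check, here is a small Python snippet (my own illustration, with an arbitrary choice of numbers) that compares the two means:

```python
import math

xs = [1.0, 4.0, 9.0, 16.0]    # any nonnegative numbers work; these are arbitrary
n = len(xs)

geometric_mean = math.prod(xs) ** (1.0 / n)   # nth root of the product
arithmetic_mean = sum(xs) / n                 # the usual average

print(geometric_mean, "<=", arithmetic_mean)  # about 4.90 <= 7.5
assert geometric_mean <= arithmetic_mean
```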
Before we look at some examples, let’s think about what this means geometrically. If you raise both sides to the $n$th power, you get
$$x_1 x_2 \cdots x_n \le \left(\frac{x_1 + x_2 + \cdots + x_n}{n}\right)^n.$$
You can think of both sides of this as the volume of an $n$-dimensional box. And this statement says that given side lengths with a fixed sum, to maximize the volume of the box, you should make each side length the same. That common side length is $\frac{x_1 + x_2 + \cdots + x_n}{n}$.
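To see the box picture numerically, here is a minimal sketch comparing a box with unequal sides to a cube whose sides have the same total sum (the particular side lengths are just example values):

```python
import math

# Two boxes whose side lengths have the same sum (12): one with unequal sides,
# one a cube. The cube has the larger volume, as the inequality predicts.
unequal_sides = [1.0, 4.0, 7.0]
cube_sides = [4.0, 4.0, 4.0]

print(math.prod(unequal_sides))  # 28.0
print(math.prod(cube_sides))     # 64.0
```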
Now let’s look at some examples where we can apply this theorem.
If $A$ is an $n \times n$ positive semidefinite matrix (meaning that its eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$ are nonnegative), then
$$\sqrt[n]{\lambda_1 \lambda_2 \cdots \lambda_n} \le \frac{\lambda_1 + \lambda_2 + \cdots + \lambda_n}{n}.$$
The product of the eigenvalues of a matrix is $\det(A)$, and the sum of the eigenvalues is $\operatorname{tr}(A)$. That means we can write this inequality as
$$\sqrt[n]{\det(A)} \le \frac{\operatorname{tr}(A)}{n}.$$
We can raise both sides to the $n$th power to get a bound on $\det(A)$. We get that, for a positive semidefinite matrix $A$,
$$\det(A) \le \left(\frac{\operatorname{tr}(A)}{n}\right)^n.$$
This is nice because the determinant is really expensive to calculate, but the trace is very easy to calculate as the sum of the diagonal entries of $A$. So this gives an upper bound on $\det(A)$ that is easy to obtain.
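Here is a hedged numerical check of this bound, using NumPy and a randomly generated positive semidefinite matrix of the form $BB^T$ (any matrix of that form works):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
B = rng.standard_normal((n, n))
A = B @ B.T                      # B @ B.T is always positive semidefinite

det_A = np.linalg.det(A)
trace_bound = (np.trace(A) / n) ** n   # the (tr(A)/n)^n bound from AM-GM

print(det_A, "<=", trace_bound)
assert det_A <= trace_bound
```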
Let’s consider some positive semidefinite matrices that arise in applications and see what this inequality can tell us.
Let $F: \mathbb{R}^n \to \mathbb{R}^n$ be a nonlinear function. Then the Jacobian of this nonlinear transformation is
$$J_F = \left[\frac{\partial F_i}{\partial x_j}\right]_{i,j=1}^n.$$
If the Jacobian matrix is positive semidefinite, we can apply the inequality from earlier. It will tell us that
$$\det(J_F) \le \left(\frac{\operatorname{tr}(J_F)}{n}\right)^n.$$
The trace on the right-hand side can be rewritten as $\operatorname{tr}(J_F) = \sum_{i=1}^n \frac{\partial F_i}{\partial x_i}$, but this is the same thing as $\nabla \cdot F$, the divergence of the vector field $F$. So we get that
$$\det(J_F) \le \left(\frac{\nabla \cdot F}{n}\right)^n.$$
This is interesting because the determinant of a linear transformation essentially tells you the volume of the box spanned by the images of the basis vectors, and this inequality says that this quantity is bounded above by a function of $\nabla \cdot F$. The divergence $\nabla \cdot F$ tells you whether $F$ behaves like a source or a sink at a point, so this inequality relates the volume scaling of the linearization of $F$ to the extent to which $F$ behaves like a source or a sink at that point.
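As an illustration, here is a small sketch that checks this bound at a point for a toy vector field $F(x, y) = (x + x^3,\ y + y^3)$, chosen only because its Jacobian is diagonal and therefore positive semidefinite:

```python
import numpy as np

def F_jacobian(x, y):
    # F(x, y) = (x + x**3, y + y**3) has this diagonal (hence positive
    # semidefinite) Jacobian: dF1/dx = 1 + 3x^2, dF2/dy = 1 + 3y^2,
    # and the off-diagonal partial derivatives are zero.
    return np.diag([1 + 3 * x**2, 1 + 3 * y**2])

x, y = 1.0, 0.5
J = F_jacobian(x, y)
n = J.shape[0]

det_J = np.linalg.det(J)       # volume scaling of the linearization at (x, y)
divergence = np.trace(J)       # div F = trace of the Jacobian

print(det_J, "<=", (divergence / n) ** n)
assert det_J <= (divergence / n) ** n
```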
Now, suppose $f: \mathbb{R}^n \to \mathbb{R}$ is a convex function. Then the Hessian matrix $H_f$ given by
$$H_f = \left[\frac{\partial^2 f}{\partial x_i \, \partial x_j}\right]_{i,j=1}^n$$
is positive semidefinite. Therefore
$$\det(H_f) \le \left(\frac{\operatorname{tr}(H_f)}{n}\right)^n.$$
As in the previous example, we can simplify the numerator. The trace of $H_f$ is $\sum_{i=1}^n \frac{\partial^2 f}{\partial x_i^2}$, but this is just $\Delta f$, the Laplacian of $f$. So this shows that for a convex function $f$ (whose Hessian is positive semidefinite) with Hessian matrix $H_f$,
$$\det(H_f) \le \left(\frac{\Delta f}{n}\right)^n.$$
If you compute $\det(H_f)$ at a critical point of the function $f$ (a place where $\nabla f = 0$), you get the Gaussian curvature of the graph of $f$ at that point. That means that this inequality gives an upper bound on the Gaussian curvature at a critical point in terms of $\Delta f$.
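For a concrete check, here is a short sketch using the convex function $f(x, y) = x^2 + 2y^2$ (my own example), whose Hessian is constant, so the bound and the Gaussian curvature at the critical point $(0, 0)$ can be compared directly:

```python
import numpy as np

H = np.array([[2.0, 0.0],
              [0.0, 4.0]])              # Hessian of f(x, y) = x**2 + 2*y**2 (constant)
n = H.shape[0]

gaussian_curvature = np.linalg.det(H)   # = 8.0; at the critical point (0, 0) this is
                                        # the Gaussian curvature of the graph of f
laplacian = np.trace(H)                 # = 6.0; the Laplacian of f

print(gaussian_curvature, "<=", (laplacian / n) ** n)   # 8.0 <= 9.0
assert gaussian_curvature <= (laplacian / n) ** n
```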
Hopefully this gives some insight into how the Arithmetic Mean – Geometric Mean Inequality comes up in different areas of math!