Orthogonal Complements and Rank-Nullity

In the last post, we talked about what the orthogonal complement of a set is and we showed why $Row(A)\perp Nul(A)$. This was kind of cool because, for me, it’s hard to have a geometric understanding of the row space of a matrix based on how matrix multiplication is usually defined ($Col(A^T)$ is kind of hard to picture), but it’s not that hard for me to have a geometric understanding of the null space of a matrix. The fact that they are orthogonal means that if you understand what it means to be in the null space, then being in the row space just means being orthogonal to everything in the null space.
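If you want to see this orthogonality numerically, here’s a quick NumPy sketch (the matrix here is my own made-up example, not one from the post): we get a basis for $Nul(A)$ from the SVD and check that every row of $A$ is orthogonal to it.

```python
import numpy as np

# A made-up example matrix to check Row(A) ⊥ Nul(A).
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# Basis for Nul(A) from the SVD: the right singular vectors whose
# singular values are (numerically) zero span the null space.
U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:]          # rows of Vt past the rank span Nul(A)

# Every row of A should be orthogonal to every null-space basis vector,
# so all these dot products should be (numerically) zero.
dots = A @ null_basis.T
print(np.allclose(dots, 0))    # True
```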

Now, let’s talk about how to connect the fact that $Row(A)\perp Nul(A)$ to the Rank-Nullity Theorem. As a reminder, the Rank-Nullity Theorem says that for an $m\times n$ matrix $A$, $dim(Col(A))+dim(Nul(A))=n$.
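As a quick numerical sanity check of the theorem (again with a matrix I made up for illustration), we can compute the rank with NumPy and count null-space basis vectors from the SVD, then check that the two add up to $n$:

```python
import numpy as np

# A made-up 3x4 matrix (third row = first row + second row), so n = 4.
A = np.array([[1., 0., 2., 0.],
              [0., 1., 3., 0.],
              [1., 1., 5., 0.]])
m, n = A.shape

rank = np.linalg.matrix_rank(A)             # dim(Col(A))

# Count null-space basis vectors: right singular vectors whose
# singular values are (numerically) zero.
_, s, Vt = np.linalg.svd(A)
nullity = Vt.shape[0] - np.sum(s > 1e-10)   # dim(Nul(A))

print(rank, nullity, rank + nullity == n)   # 2 2 True
```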

The first fact that we need before we can start to understand why this is true is that for an $m\times n$ matrix $A$, $dim(Row(A))=dim(Col(A))$. This should seem really surprising at first. The row space of $A$ lives in $\mathbb{R}^n$ and the column space of $A$ lives in $\mathbb{R}^m$! So, the row and column spaces have the same dimension, but they don’t even live in the same ambient space! That’s pretty crazy, if you ask me.

Now, let’s look at an example to see why this makes sense. Consider the matrix $A = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 4 & 6 \end{pmatrix}$. This represents a transformation from $\mathbb{R}^3$ to $\mathbb{R}^2$, and for an input $\begin{pmatrix}a_1\\a_2\\a_3\end{pmatrix}$, the output is $(a_1+2a_2+3a_3)\begin{pmatrix}1\\2\end{pmatrix}$, since every column of $A$ is a multiple of $\begin{pmatrix}1\\2\end{pmatrix}$. In this case, the output lives in $\mathbb{R}^2$, but the set of outputs is only $1$-dimensional (since a single vector spans it).

Now, let’s consider the matrix $A^T=\begin{pmatrix}1&2\\2&4\\3&6\end{pmatrix}$. This represents a transformation from $\mathbb{R}^2$ to $\mathbb{R}^3$, and for an input $\begin{pmatrix}b_1\\b_2\end{pmatrix}$, the output is $(b_1+2b_2)\begin{pmatrix}1\\2\\3\end{pmatrix}$, since every column of $A^T$ is a multiple of $\begin{pmatrix}1\\2\\3\end{pmatrix}$. In this case, the output lives in $\mathbb{R}^3$, but the set of outputs is only $1$-dimensional (since a single vector spans it).

So, even though the transformations given by $A$ and $A^T$ don’t map into the same space, the same number of vectors can still be used to represent their outputs! That’s so cool!
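We can confirm this for the example above with a one-line NumPy check (just a sanity check, since we already worked it out by hand): the rank of $A$ and the rank of $A^T$ agree.

```python
import numpy as np

# The matrix from the example: both A and A.T have rank 1.
A = np.array([[1, 2, 3],
              [2, 4, 6]])

print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(A.T))  # 1 1
```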

Now, let’s connect this to the Rank-Nullity Theorem. The Rank-Nullity Theorem gives a connection between the dimension of the column space of a matrix and the dimension of its null space. We just worked through an example showing why the dimension of the column space of a matrix is the same as the dimension of its row space. So, if $A$ is an $m\times n$ matrix, let’s rewrite the Rank-Nullity Theorem as $dim(Row(A)) + dim(Nul(A)) = n$. This has a nice geometric interpretation: any deficiency in the size of the row space is made up for by the size of the null space. It’s amazing how nicely they work together.
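To tie the restated theorem back to the example matrix from earlier: $dim(Row(A)) = 1$ and $n = 3$, so the null space must be $2$-dimensional. A short NumPy sketch confirms it, using the SVD to count null-space basis vectors:

```python
import numpy as np

# The example matrix: dim(Row(A)) = 1 and n = 3, so dim(Nul(A)) should be 2.
A = np.array([[1., 2., 3.],
              [2., 4., 6.]])

_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))    # dim(Row(A)) = dim(Col(A))
null_basis = Vt[rank:]           # orthonormal basis for Nul(A)

print(rank, null_basis.shape[0])  # 1 2
```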

I hope this made you appreciate what it really means to be a matrix just a little bit more!