A key concept is that there is a third fundamentally important subspace associated to every morphism $T \colon V \to W$ of finite-dimensional inner product spaces. Unlike $\ker(T)$ and $\operatorname{im}(T)$, this subspace is geometric rather than algebraic in nature: it is defined as the set
$$M(T) = \{ v \in V : \|Tv\| = \|T\|\,\|v\| \}$$
of vectors in $V$ which saturate the operator norm inequality $\|Tv\| \leq \|T\|\,\|v\|$ for $T$. It is not obvious that this actually is a vector space, and the following result is both crucial and nontrivial.
Theorem 5.1. $M(T)$ is a nonzero subspace of $V$.

The fact that $M(T)$ contains a unit vector follows from the extreme value theorem: the continuous function $v \mapsto \|Tv\|$ attains its maximum value $\|T\|$ somewhere on the unit sphere of $V$, which is compact. We found a way to use the parallelogram law to show that $M(T)$ is a subspace of $V$.
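To make Theorem 5.1 concrete, here is a small numerical sketch (not from the notes; it uses NumPy, and the diagonal matrix `A` is a made-up example whose operator norm is attained on a two-dimensional subspace). Sums and scalar multiples of saturating vectors still saturate, consistent with the theorem.

```python
import numpy as np

# Hypothetical example matrix: operator norm is 3, attained on span(e1, e2).
A = np.diag([3.0, 3.0, 1.0])
op_norm = np.linalg.norm(A, 2)  # largest singular value

def saturates(v, tol=1e-12):
    """Check whether v lies in M(A), i.e. ||Av|| = ||A|| * ||v||."""
    return abs(np.linalg.norm(A @ v) - op_norm * np.linalg.norm(v)) < tol

e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])
e3 = np.array([0.0, 0.0, 1.0])

print(saturates(e1), saturates(e2))        # True True: both saturate
print(saturates(e1 + 2 * e2))              # True: linear combinations saturate too
print(saturates(e3), saturates(e1 + e3))   # False False: vectors outside M(A)
```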
The essence of the singular value decomposition is understanding the relationship between the three fundamental subspaces $\ker(T)$, $\operatorname{im}(T)$, and $M(T)$, a geometric endeavor which goes beyond the algebraic relationship between kernel and image given by the rank-nullity theorem. In order to succeed in this endeavor, one needs the following fact.
Orthogonality Lemma. Two nonzero vectors $v$ and $w$ are orthogonal if and only if $\|v + tw\| \geq \|v\|$ for all scalars $t$.
Proof: Geometrically, the Orthogonality Lemma says that $v$ and $w$ are orthogonal if and only if the point on the affine line $\{ v + tw : t \text{ a scalar} \}$ closest to the origin is $v$ itself. In fact, this characterization remains correct even in the infinite-dimensional setting.
Suppose first that $v$ and $w$ are orthogonal. Then, for any scalar $t$, we have
$$\|v + tw\|^2 = \|v\|^2 + |t|^2 \|w\|^2 \geq \|v\|^2$$
by the Pythagorean theorem.
Conversely, suppose that the nonnegative function $t \mapsto \|v + tw\|^2$ is minimized by taking $t = 0$, the minimum being $\|v\|^2$. Suppose $\langle v, w \rangle \neq 0$ and look at $t_0 = -\langle v, w \rangle / \|w\|^2$. We then compute
$$\|v + t_0 w\|^2 = \|v\|^2 - \frac{|\langle v, w \rangle|^2}{\|w\|^2} < \|v\|^2,$$
a contradiction.
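The lemma can be sanity-checked numerically (a sketch with made-up vectors): the minimizer of $t \mapsto \|v + tw\|$ is $t_0 = -\langle v, w\rangle/\|w\|^2$, which is $0$ exactly when $v \perp w$.

```python
import numpy as np

def min_norm_on_line(v, w):
    """Minimize ||v + t*w|| over real t; the minimizer is t0 = -<v,w>/||w||^2."""
    t0 = -np.dot(v, w) / np.dot(w, w)
    return t0, np.linalg.norm(v + t0 * w)

v = np.array([1.0, 1.0, 0.0])
w_orth = np.array([0.0, 0.0, 2.0])   # orthogonal to v
w_skew = np.array([1.0, 0.0, 1.0])   # not orthogonal to v

t1, m1 = min_norm_on_line(v, w_orth)
print(abs(t1), m1)    # minimizer is t = 0, minimum is ||v|| = sqrt(2)
t2, m2 = min_norm_on_line(v, w_skew)
print(t2 != 0.0, m2 < np.linalg.norm(v))  # True True: some t beats t = 0
```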
Armed with the Orthogonality Lemma, we turn to an examination of the geometric relationships between the three fundamental subspaces of a linear transformation. The first such relationship is the nice fact that maximally stretched vectors are orthogonal to squashed ones.
Theorem 5.2. Assuming $T \neq 0$, we have $M(T) \perp \ker(T)$.
Proof: Since $M(T)$ is a nonzero subspace of $V$, we can choose a nonzero vector $v \in M(T)$. If the kernel of $T$ is trivial, the statement we want to prove is obvious. If not, we can choose a nonzero vector $w \in \ker(T)$. Now use the Orthogonality Lemma on the affine line $v + tw$: since $Tw = 0$, we have
$$\|T\|\,\|v\| = \|Tv\| = \|T(v + tw)\| \leq \|T\|\,\|v + tw\|$$
for every scalar $t$, so $\|v + tw\| \geq \|v\|$ and hence $v$ and $w$ are orthogonal.
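Theorem 5.2 can be observed numerically (a sketch; the rank-deficient matrix `A` is an arbitrary made-up example, and power iteration on $A^\top A$ stands in for finding a maximally stretched vector): the maximally stretched direction comes out orthogonal to the kernel.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical example: a random 4x4 matrix of rank 2, so ker(A) is nontrivial.
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 4))

def top_direction(A, iters=1000):
    """Power iteration on A^T A: converges to a maximally stretched unit vector."""
    x = rng.standard_normal(A.shape[1])
    G = A.T @ A
    for _ in range(iters):
        x = G @ x
        x /= np.linalg.norm(x)
    return x

v = top_direction(A)
kernel = np.linalg.svd(A)[2][2:]   # rows spanning ker(A) (zero singular values)
print(np.max(np.abs(kernel @ v)))  # essentially 0: v is orthogonal to ker(A)
# Once the iteration has converged, ||Av|| also matches the operator norm:
print(np.linalg.norm(A @ v), np.linalg.norm(A, 2))
```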
Now let us see how Theorem 5.2 refines what the rank-nullity theorem tells us. We have the orthogonal decompositions $V = \ker(T) \oplus \ker(T)^\perp$ and $W = \operatorname{im}(T) \oplus \operatorname{im}(T)^\perp$.
Rank-nullity says that the restriction of $T$ to a morphism $\ker(T)^\perp \to \operatorname{im}(T)$ is an isomorphism. Theorem 5.2 tells us that sitting inside $\ker(T)^\perp$ we have the space $M(T)$ of maximally stretched vectors. The best case scenario is that $M(T)$ exhausts $\ker(T)^\perp$. Then, we learn that the isomorphism $\ker(T)^\perp \to \operatorname{im}(T)$ is almost an isometric isomorphism: it is $\|T\|$ times the isometric isomorphism $\ker(T)^\perp \to \operatorname{im}(T)$ given by $v \mapsto Tv / \|T\|$.
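The best case can be checked on a toy example (a sketch; the matrix `T` below is a made-up scalar multiple of a rotation, so every vector is maximally stretched and $M(T) = V = \ker(T)^\perp$): the map $v \mapsto Tv/\|T\|$ preserves norms.

```python
import numpy as np

# Hypothetical "best case": T = 2 * (rotation), so M(T) is all of ker(T)^perp = V.
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
T = 2.0 * Q
op_norm = np.linalg.norm(T, 2)   # = 2.0

rng = np.random.default_rng(1)
v = rng.standard_normal(2)

image = (T @ v) / op_norm        # the claimed isometric isomorphism v -> Tv/||T||
print(np.isclose(np.linalg.norm(image), np.linalg.norm(v)))            # True: norms preserved
print(np.isclose(np.linalg.norm(T @ v), op_norm * np.linalg.norm(v)))  # True: v lies in M(T)
```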
If we are not so lucky, then we have $\ker(T)^\perp = M(T) \oplus N$, where $N$, the orthogonal complement of $M(T)$ inside $\ker(T)^\perp$, is a nontrivial subspace of $V$. Now we have to try again with the restriction of $T$ to $N$: this is the start of an iterative process which terminates after finitely many steps, the output being the Singular Value Decomposition, which is adequately treated in the main notes. If you understand this process, you understand everything about linear algebra that you need to know for Math 202A.
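The iterative process can be sketched in code (an illustration, not the construction in the main notes; for simplicity it peels off one maximally stretched direction at a time rather than the whole maximal subspace): repeatedly find the operator norm of the restriction, record it, and pass to the orthogonal complement. The recorded norms are exactly the singular values.

```python
import numpy as np

def singular_values_by_peeling(A, tol=1e-12):
    """Iteratively peel off maximally stretched directions of A.

    At each step, find a unit vector maximizing ||Ax|| inside the current
    subspace, record sigma = ||Ax|| (the operator norm of the restriction),
    and restrict to the orthogonal complement of that vector.
    """
    basis = np.eye(A.shape[1])       # orthonormal basis of the current subspace
    sigmas = []
    while basis.shape[1] > 0:
        M = A @ basis                # A restricted to the current subspace
        # Top eigenvector of M^T M gives the maximizer of ||Mx|| over unit x.
        eigvals, eigvecs = np.linalg.eigh(M.T @ M)
        x = eigvecs[:, -1]
        sigma = np.sqrt(max(eigvals[-1], 0.0))
        if sigma < tol:              # everything left is squashed to zero
            break
        sigmas.append(sigma)
        # New subspace: orthogonal complement of x inside the current one,
        # via a QR factorization whose first column is x.
        k = basis.shape[1]
        Qfull, _ = np.linalg.qr(np.column_stack([x, np.eye(k)]))
        basis = basis @ Qfull[:, 1:k]
    return np.array(sigmas)

A = np.array([[3.0, 1.0], [0.0, 2.0], [1.0, 0.0]])
print(singular_values_by_peeling(A))
print(np.linalg.svd(A, compute_uv=False))  # the two lists should agree
```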