Let $A \colon V \to W$ be a morphism with left and right singular blocks $L_1, \dots, L_k \subseteq V$ and $R_1, \dots, R_k \subseteq W$, and let $\sigma_1 > \dots > \sigma_k > 0$ be the corresponding singular values: the restriction of $A$ to $L_j$ has the form
$$A|_{L_j} = \sigma_j U_j,$$
where $U_j \colon L_j \to R_j$ is an isometric isomorphism. As a decomposition of $A$ itself, we have
$$A = \sum_{j=1}^{k} \sigma_j U_j P_j,$$
where $P_j$ is the orthogonal projection of $V$ onto the left singular block $L_j$. We have seen previously that $\{U_1 P_1, \dots, U_k P_k\}$ is an orthogonal set in $\operatorname{Hom}(V, W)$ with respect to the Frobenius scalar product: we have
$$\langle U_i P_i, U_j P_j \rangle_F = \delta_{ij} \dim L_j.$$
In particular, we have the Frobenius norm
$$\|A\|_F^2 = \sum_{j=1}^{k} \sigma_j^2 \dim L_j.$$
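As a numerical sanity check (a hypothetical example, not part of the notes), we can verify the Frobenius norm formula with numpy. Note that numpy reports singular values repeated with multiplicity, so the sum $\sum_j \sigma_j^2 \dim L_j$ becomes a plain sum of squares:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))  # a generic 4x3 real matrix

# numpy returns the singular values repeated with multiplicity
# (the "line" version of the SVD discussed below)
sigma = np.linalg.svd(A, compute_uv=False)

# Frobenius norm squared equals the sum of squared singular values
frob_sq = np.linalg.norm(A, "fro") ** 2
assert np.isclose(frob_sq, np.sum(sigma**2))
```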
Previously, we wrote this in terms of repeated singular values, meaning that we decomposed each $L_j$ and $R_j$ into a direct sum of orthogonal lines instead of incorporating the multiplicity factor $\dim L_j$. The two ways of thinking about the SVD, blocks and lines, are both correct, but we have to keep in mind that the block decomposition is unique, whereas the line decomposition is only unique when the number $k$ of singular values is equal to the rank of $A$, i.e. when every singular block is a line.
In particular, if we want to use the SVD to define the adjoint, we should use the block version.
Definition 22.1. The adjoint of $A \colon V \to W$ is the morphism $A^* \colon W \to V$ defined by
$$A^* = \sum_{j=1}^{k} \sigma_j U_j^{-1} Q_j,$$
where $Q_j$ is the orthogonal projection of $W$ onto the right singular block $R_j$.
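For real matrices this notion of adjoint coincides with the familiar transpose: the adjoint keeps the singular values and swaps the roles of the two orthonormal systems. A quick numpy illustration (the matrix and its size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))

# numpy's svd returns A = W @ diag(s) @ Vt (line version of the SVD)
W, s, Vt = np.linalg.svd(A, full_matrices=False)

# Build the adjoint from the SVD data: same singular values,
# with the two orthonormal systems swapped.
A_star = Vt.T @ np.diag(s) @ W.T

assert np.allclose(A_star, A.T)
```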
A particularly simple (but important) situation is when $A$ comes to us as an isometric isomorphism. Then there is a single singular value, $\sigma_1 = 1$, the left and right singular blocks of $A$ are $L_1 = V$ and $R_1 = W$, the singular value decomposition of $A$ is
$$A = U_1 P_V = U_1,$$
and
$$A^* = U_1^{-1} Q_W = A^{-1}.$$
Thus the adjoint of an isometric isomorphism is its inverse. It is clear from Definition 22.1 that the adjoint is an involution,
$$(A^*)^* = A.$$
Geometrically, this just says that swapping left and right singular blocks twice is the same as not swapping them at all.
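The isometric isomorphism case can be checked numerically: for an orthogonal matrix the adjoint (here, the transpose) is the inverse. A small hedged example, using a rotation of the plane:

```python
import numpy as np

# A rotation of the plane is an isometric isomorphism;
# its matrix with respect to an orthonormal basis is orthogonal.
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# For an isometric isomorphism, the adjoint is the inverse.
assert np.allclose(Q.T @ Q, np.eye(2))
assert np.allclose(Q.T, np.linalg.inv(Q))
```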
Now we consider the case where the source and target spaces of $A$ coincide, meaning that $W = V$ and $A \colon V \to V$. In this setting there are two further coincidences that could potentially occur: the left and right singular blocks of $A$ could coincide,
$$L_j = R_j \quad \text{for all } 1 \leq j \leq k,$$
or $A$ could coincide with its adjoint,
$$A = A^*.$$
In the first case we say that $A$ is normal, and in the second we say that $A$ is selfadjoint. A priori, it is not clear what the relationship between normality and selfadjointness might be. Of course, we already know the answer from prolonged exposure to matrix algebra.
Theorem 22.2. Selfadjoint implies normal.
I would like to give a geometric proof of the fact that selfadjointness, $A = A^*$, implies normality, $L_j = R_j$. This amounts to showing that
$$\sum_{j=1}^{k} \sigma_j U_j P_j = \sum_{j=1}^{k} \sigma_j U_j^{-1} Q_j$$
forces $L_j = R_j$ in $V$, for each $1 \leq j \leq k$.
I have not yet succeeded in finding such an argument, and while I plan to try again on my own time, you may be getting tired of me constantly turning things upside down and inside out and generally treating linear algebra results we can look up as if they were open research problems. If so, you may have given up on Math 202A, which is fine with me provided you are working on the mathematics that matters to you. I mean this sincerely, and I will extend the following provision: if you have fallen behind on 202A homework sets because you are composing your mathematical opus, you can turn in that work for homework credit in 202A.
Look up the definition of a normal matrix in any textbook and you’ll get an algebraic characterization of normality which makes Theorem 22.2 obvious.
Theorem 22.3. $A$ is normal if and only if
$$A A^* = A^* A.$$
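Note that the converse of Theorem 22.2 fails: a rotation of the plane commutes with its adjoint but does not equal it. A small numerical illustration of the characterization in Theorem 22.3 (the specific matrix is an arbitrary choice):

```python
import numpy as np

# A rotation by 90 degrees: normal, but not selfadjoint.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

assert np.allclose(R @ R.T, R.T @ R)  # normal: A A* = A* A
assert not np.allclose(R, R.T)        # but not selfadjoint
```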
Problem 22.1. Prove that selfadjoint implies normal. You have two options: either find the direct geometric argument I failed to produce, or prove the standard result Theorem 22.3 and deduce the claimed implication from it.
Now, on to the spectral theorem. Assume $A$ is selfadjoint. Then, because $A$ is normal, its left and right singular blocks coincide. The SVD tells us that the restriction of $A$ to the singular block $L_j$ is
$$A|_{L_j} = \sigma_j U_j,$$
where $U_j$ is an isometric automorphism of $L_j$, aka a unitary operator on $L_j$. The same reasoning applied to $A^*$ gives
$$A^*|_{L_j} = \sigma_j U_j^{-1}.$$
Since $A = A^*$, we have
$$\sigma_j U_j = \sigma_j U_j^{-1},$$
and since $\sigma_j > 0$, we can cancel it to get
$$U_j = U_j^{-1}.$$
In summary, the SVD is reducing the study of selfadjoint operators to the study of unitary selfadjoint operators: we have shown that $A$ being selfadjoint means that
$$A = \sum_{j=1}^{k} \sigma_j U_j P_j,$$
where each $U_j$ is both selfadjoint and unitary. So let us step back and consider the highly constrained problem of analyzing a selfadjoint unitary operator $U$ on a space $L$. Since $U$ is unitary,
$$U^* = U^{-1},$$
and since $U$ is selfadjoint,
$$U^* = U.$$
Combining these gives
$$U = U^{-1},$$
and we conclude that a selfadjoint unitary operator $U$ satisfies
$$U^2 = 1_L,$$
where $1_L$ is the identity operator on $L$. Two obvious solutions to this equation are $U = 1_L$ and $U = -1_L$, and in fact all solutions are mixtures of these two. More precisely, let
$$L^+ = \operatorname{Ker}(U - 1_L) \quad \text{and} \quad L^- = \operatorname{Ker}(U + 1_L).$$
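A concrete selfadjoint unitary operator is a reflection of the plane across a line: it fixes the line and negates its orthogonal complement. A hedged numpy check (the particular line is an arbitrary choice):

```python
import numpy as np

# Reflection across the line spanned by a unit vector v:
# U fixes v and negates everything orthogonal to v.
v = np.array([[1.0], [2.0]]) / np.sqrt(5.0)
U = 2 * v @ v.T - np.eye(2)

assert np.allclose(U, U.T)              # selfadjoint
assert np.allclose(U.T @ U, np.eye(2))  # unitary
assert np.allclose(U @ U, np.eye(2))    # hence U^2 = 1
```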
Theorem 22.4. We have the orthogonal decomposition
$$L = L^+ \oplus L^-.$$
Note that it is *not* necessarily true that both $L^+$ and $L^-$ are nonzero subspaces, as the examples $U = 1_L$ and $U = -1_L$ already show.
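The sense in which every solution is a mixture of $\pm 1_L$ can be made concrete: the orthogonal projections onto the two pieces are $(1_L + U)/2$ and $(1_L - U)/2$, and $U$ is their difference. A hedged numerical illustration (the diagonal operator is an arbitrary choice):

```python
import numpy as np

# A selfadjoint unitary on R^3: +1 on a plane, -1 on a line.
U = np.diag([1.0, 1.0, -1.0])

# Orthogonal projections onto L^+ and L^-.
P_plus = (np.eye(3) + U) / 2
P_minus = (np.eye(3) - U) / 2

assert np.allclose(P_plus + P_minus, np.eye(3))
assert np.allclose(P_plus @ P_plus, P_plus)  # idempotent
assert np.allclose(U, P_plus - P_minus)      # U is a mixture of +1 and -1
```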
Now return to the general situation, where we know that $A$ is selfadjoint, but we are not assuming it is also unitary. What we do know is that
$$A = \sum_{j=1}^{k} \sigma_j U_j P_j,$$
where the restriction of $A$ to $L_j$ is
$$A|_{L_j} = \sigma_j U_j,$$
with $U_j$ selfadjoint and unitary. This gives us the decomposition
$$L_j = L_j^+ \oplus L_j^-,$$
where $A$ acts in $L_j^+$ as multiplication by $\sigma_j$ and acts in $L_j^-$ as multiplication by $-\sigma_j$. In words, the condition that
$$A = A^*$$
allows us to refine its SVD by decomposing each singular block $L_j$ into a positive part $L_j^+$ and a negative part $L_j^-$, and since $L_j$ is not the zero space, at least one of $L_j^+, L_j^-$ is nonzero.
Theorem 22.5. (Spectral Theorem) Given $A \colon V \to V$ selfadjoint, there exists an orthogonal decomposition
$$V = V_1 \oplus \dots \oplus V_m \oplus \operatorname{Ker} A,$$
where $V_1, \dots, V_m$ are nonzero subspaces such that
$$A|_{V_i} = \tau_i 1_{V_i},$$
where $\tau_1, \dots, \tau_m$ are nonzero real numbers. Each $V_i$ is a signed singular block of $A$, and each $\tau_i$ is a signed singular value of $A$.
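The spectral theorem can be checked numerically: for a real symmetric matrix, an eigendecomposition recovers the operator, and the singular values are the absolute values of the eigenvalues (the signed singular values). A hedged numpy sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2  # a generic selfadjoint (symmetric) matrix

# eigh gives the spectral decomposition A = E diag(tau) E^T, E orthogonal
tau, E = np.linalg.eigh(A)
assert np.allclose(E @ np.diag(tau) @ E.T, A)

# The singular values are the absolute values of the eigenvalues.
sigma = np.linalg.svd(A, compute_uv=False)
assert np.allclose(np.sort(np.abs(tau))[::-1], sigma)
```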