Before leaving the world of linear algebra for multilinear algebra (i.e. tensors), let us reformulate the spectral theorem entirely “upstairs” in $\mathrm{End}(V)$ (as in Lecture 1, $V$ is a complex vector space of dimension $n$ equipped with a scalar product $\langle \cdot, \cdot \rangle$).

**Theorem 1:** If $A \in \mathrm{End}(V)$ is selfadjoint, then there is a positive integer $k$, real numbers $\lambda_1 > \lambda_2 > \dots > \lambda_k$, and selfadjoint operators $P_1, \dots, P_k \in \mathrm{End}(V)$ satisfying

$$P_i^2 = P_i, \qquad 1 \leq i \leq k,$$

and

$$P_i P_j = 0, \qquad i \neq j,$$

such that

$$A = \lambda_1 P_1 + \lambda_2 P_2 + \dots + \lambda_k P_k.$$
The operators $P_1, \dots, P_k$ above are called **orthogonal idempotents**. The “idempotent” part is $P_i^2 = P_i$, and this term is applied more generally in ring theory to refer to anything equal to its own square. The “orthogonal” part is $P_i P_j = 0$, the zero operator in $\mathrm{End}(V)$, when $i \neq j$. This is a useful representation of $A$, because orthogonal idempotence immediately implies that

$$A^m = \lambda_1^m P_1 + \lambda_2^m P_2 + \dots + \lambda_k^m P_k, \qquad m \in \mathbb{N}.$$

Given your Math 202A background, you have realized immediately that Theorem 1 is equivalent to the eigenvalue/eigenvector form of the Spectral Theorem discussed in Lecture 1, because the set of selfadjoint idempotents acting in $V$ is in canonical bijection with the lattice of subspaces of $V$, since these are exactly the operators which act by orthogonal projection onto a subspace: for any $v \in V$, the corresponding $P$ maps $v$ to the nearest point of the subspace $W = P(V)$. The correspondence between selfadjoint idempotents (which are often simply called orthogonal projections) and subspaces even holds in the infinite-dimensional setting, provided $V$ is complete with respect to the norm $\|v\| = \sqrt{\langle v, v \rangle}$, i.e. is a Hilbert space, and provided we restrict to *closed* subspaces. Thus the Grassmannian of $k$-dimensional subspaces of $V$ is for all practical purposes equivalent to the set of selfadjoint idempotents/orthogonal projections of rank $k$. Theorem 1 above is obtained from the eigenvalue/eigenvector form of the Spectral Theorem by replacing the eigenspaces of $A$ with the orthogonal projections onto those spaces, and conversely Theorem 1 gives the eigenvalue/eigenvector form of the Spectral Theorem with $\lambda_1 > \dots > \lambda_k$ being the distinct eigenvalues of $A$ whose corresponding eigenspaces are the images $P_1(V), \dots, P_k(V)$ of the orthogonal idempotents.
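As a concrete sanity check on Theorem 1, the following Python sketch (using NumPy; the example matrix and variable names are my own, not from the lecture) builds the orthogonal idempotents from an eigendecomposition and verifies idempotence, orthogonality, and the power formula $A^m = \sum_i \lambda_i^m P_i$.

```python
import numpy as np

# An arbitrary selfadjoint (real symmetric) example matrix.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

# Orthonormal eigenbasis: eigh is designed for selfadjoint matrices.
eigvals, U = np.linalg.eigh(A)

# One orthogonal idempotent P_i per distinct eigenvalue lambda_i.
spectral = []
for lam in np.unique(np.round(eigvals, 10)):
    cols = U[:, np.isclose(eigvals, lam)]
    spectral.append((lam, cols @ cols.conj().T))  # projection onto eigenspace

# Idempotence P_i^2 = P_i and orthogonality P_i P_j = 0 for i != j.
for a, (_, P) in enumerate(spectral):
    assert np.allclose(P @ P, P)
    for _, Q in spectral[a + 1:]:
        assert np.allclose(P @ Q, 0)

# The power formula A^m = sum_i lambda_i^m P_i.
m = 4
assert np.allclose(np.linalg.matrix_power(A, m),
                   sum(lam ** m * P for lam, P in spectral))
```

The rounding step groups numerically equal eigenvalues so that repeated eigenvalues contribute a single higher-rank projection rather than several rank-one ones.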

Let us close out this brief recap of eigenvectors and eigenvalues (i.e. Math 202A material) by recalling the algebraic criterion for diagonalizability. An operator $A \in \mathrm{End}(V)$ is said to be **normal** if it commutes with its adjoint, $AA^* = A^*A$. Obviously, selfadjoint operators are normal, $AA^* = AA = A^*A$, and so are unitary operators $U$ since $UU^* = I = U^*U$.

**Theorem 2:** Given $A \in \mathrm{End}(V)$, there exists an orthonormal basis $e_1, \dots, e_n$ of $V$ such that $Ae_i = \lambda_i e_i$ for each $1 \leq i \leq n$, where $\lambda_1, \dots, \lambda_n \in \mathbb{C}$ are scalars, if and only if $A$ is normal. Equivalently, $A$ is a $\mathbb{C}$-linear combination of pairwise orthogonal selfadjoint idempotents if and only if $AA^* = A^*A$.
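Theorem 2 can be illustrated numerically. In the sketch below (NumPy; the rotation matrix is an arbitrary example I chose), a unitary operator which is not selfadjoint is checked to be normal, and is diagonalized over $\mathbb{C}$ by an orthonormal eigenbasis with non-real eigenvalues $e^{\pm i\theta}$.

```python
import numpy as np

# A plane rotation: unitary but not selfadjoint, hence normal.
theta = 0.7
Urot = np.array([[np.cos(theta), -np.sin(theta)],
                 [np.sin(theta),  np.cos(theta)]])

# Normality: U commutes with its adjoint.
assert np.allclose(Urot @ Urot.T.conj(), Urot.T.conj() @ Urot)

# Diagonalizable over C with eigenvalues e^{i theta}, e^{-i theta}.
eigvals, V = np.linalg.eig(Urot)
assert np.allclose(sorted(np.angle(eigvals)), [-theta, theta])

# A normal matrix with distinct eigenvalues has automatically
# orthogonal eigenvectors; eig normalizes each column, so V is unitary.
assert np.allclose(V.conj().T @ V, np.eye(2))
```

Note that the rotation has no real eigenvalues at all, so diagonalization here genuinely requires working over $\mathbb{C}$.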

After revisiting this essential piece of 202A technology, my plan was to jump into multilinear algebra — tensor products, exterior products, symmetric products, and their applications to eigenvalue inequalities — and work up to the construction of Schur functors, which interpolate between these using the representation theory of the symmetric groups. This would be an application of the material you learned in Math 202B, with the goal of enhancing the material you learned in Math 202A. However, I have since changed my mind. After reviewing Professor Sam’s 202B lecture notes (which I am posting to Piazza for your reference), and speaking briefly with Steven, I have realized that you already have some exposure to this material. Even though there is much left to be done in this direction, namely starting from Section 6 in Steven’s notes and working up to a proof of Theorem 6.4.1 via Schur-Weyl duality, I have ultimately decided to leave this for the second half of the course, and begin instead with some very natural and appealing applications of symmetric group representations which we can sink our teeth into immediately. Also, this will give you exposure to material which cannot be looked up in textbooks, or at least cannot be found presented in the way we’ll do it.

**Definition 1:** For each $n \in \mathbb{N}$, the **symmetric group** $\mathrm{S}(n)$ of degree $n$ consists of the set of bijective functions

$$\sigma \colon \{1, \dots, n\} \longrightarrow \{1, \dots, n\},$$

i.e. automorphisms of $\{1, \dots, n\}$, with the group law being composition of functions. Elements of $\mathrm{S}(n)$ are called **permutations**.
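Since we will be computing with permutations throughout, here is a minimal Python encoding (my own conventions, not from the lecture): a permutation $\sigma \in \mathrm{S}(n)$ is stored as the tuple $(\sigma(1), \dots, \sigma(n))$, and the group law is composition of functions.

```python
from itertools import permutations

# A permutation sigma of {1, ..., n} is stored as a tuple p with p[i-1] = sigma(i).
def compose(p, q):
    """Composition of functions: (p o q)(i) = p(q(i))."""
    return tuple(p[q[i] - 1] for i in range(len(q)))

def symmetric_group(n):
    """All n! permutations of {1, ..., n} as tuples."""
    return list(permutations(range(1, n + 1)))

identity = (1, 2, 3)
swap = (2, 1, 3)                          # the transposition (1 2)
assert compose(swap, swap) == identity    # transpositions are involutions
assert len(symmetric_group(4)) == 24      # |S(4)| = 4!
```

The later code sketches in this lecture all assume this tuple encoding.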

As you know from Math 202B (and likely much before), any permutation can be factored into a product of disjoint cyclic permutations, and this factorization is unique up to the order of the factors. Our starting point will be a rather different unique factorization theorem for permutations. To explain, let us consider $\mathrm{S}(n)$ from the perspective of geometric group theory. Let $T \subset \mathrm{S}(n)$ denote the set of transpositions, i.e. $T$ is the conjugacy class of $2$-cycles,

$$T = \{(i\ j) \colon 1 \leq i < j \leq n\}.$$
As you are also no doubt aware, $T$ generates $\mathrm{S}(n)$, which means that any permutation $\sigma \in \mathrm{S}(n)$ can be written as a product of transpositions,

$$\sigma = \tau_1 \tau_2 \cdots \tau_r, \qquad \tau_1, \dots, \tau_r \in T.$$
This representation is very far from being unique, and indeed the question of counting the number of factorizations of a given permutation into a given number of transpositions is a classical problem in algebraic combinatorics known as the Hurwitz enumeration problem; despite its elementary formulation, this question has deep ties to enumerative algebraic geometry and mathematical physics, some of which we will hopefully be able to discuss. Nevertheless, for each $\sigma \in \mathrm{S}(n)$ there is a minimal value of $r$ for which such a factorization exists, and this is called the **word norm** of $\sigma$ and denoted $|\sigma|_T$, where the subscript reminds us that this quantity depends on the choice of generating set $T$. We will omit this subscript in order to lighten the notation, and if we switch to a different generating set inducing a different word norm on $\mathrm{S}(n)$, we will say so loud and clear. The following is elementary combinatorics, but you should take the time to prove it.

**Proposition 1:** The length $r$ of any factorization of a permutation $\sigma \in \mathrm{S}(n)$ into transpositions satisfies $r = |\sigma| + 2k$ for some $k \in \mathbb{Z}_{\geq 0}$. That is, all factorizations of $\sigma$ into transpositions have the same parity.
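Proposition 1 can be brute-force checked for small $n$. The sketch below (my own Python, assuming the tuple encoding $(\sigma(1), \dots, \sigma(n))$ and computing parity via inversion counts) multiplies out every product of at most four transpositions in $\mathrm{S}(4)$ and confirms that the length determines the parity.

```python
from itertools import product

n = 4
identity = tuple(range(1, n + 1))

def compose(p, q):
    """(p o q)(i) = p(q(i)), permutations stored as tuples."""
    return tuple(p[q[i] - 1] for i in range(n))

def parity(p):
    """Parity of a permutation = number of inversions mod 2."""
    return sum(p[a] > p[b] for a in range(n) for b in range(a + 1, n)) % 2

# All transpositions (i j) with 1 <= i < j <= n.
transpositions = []
for i in range(1, n + 1):
    for j in range(i + 1, n + 1):
        t = list(identity)
        t[i - 1], t[j - 1] = j, i
        transpositions.append(tuple(t))

# Every product of r transpositions has parity r mod 2, so all
# factorizations of a fixed permutation have lengths of equal parity.
for r in range(5):
    for taus in product(transpositions, repeat=r):
        sigma = identity
        for t in taus:
            sigma = compose(sigma, t)
        assert parity(sigma) == r % 2
```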

Somewhat less elementary but no less important is the fact that there is a tight connection between the word norm of a permutation and the number of factors in its unique decomposition into disjoint cycles.

**Proposition 2 (join-cut relation):** We have

$$|\sigma| = n - c(\sigma),$$

where $c(\sigma)$ denotes the number of factors in the disjoint cycle decomposition of $\sigma$, fixed points included.

The key idea in the proof of Proposition 2 is expressed in its name: multiplying a permutation by a transposition will either join two of its cycles into one, or cut one of its cycles into two. Thus,

$$c(\sigma\tau) = c(\sigma) \pm 1, \qquad \sigma \in \mathrm{S}(n),\ \tau \in T.$$
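Both the join-cut relation and the $\pm 1$ cycle-count behavior can be verified computationally for small $n$. The sketch below (my own Python; the word norm is obtained as a breadth-first-search distance, anticipating the Cayley graph point of view taken next) checks them over all of $\mathrm{S}(4)$.

```python
from collections import deque
from itertools import permutations

n = 4
identity = tuple(range(1, n + 1))

def compose(p, q):
    return tuple(p[q[i] - 1] for i in range(n))

def cycles(p):
    """Number of factors in the disjoint cycle decomposition, fixed points included."""
    seen, c = set(), 0
    for i in range(1, n + 1):
        if i not in seen:
            c += 1
            while i not in seen:
                seen.add(i)
                i = p[i - 1]
    return c

transpositions = []
for i in range(1, n + 1):
    for j in range(i + 1, n + 1):
        t = list(identity)
        t[i - 1], t[j - 1] = j, i
        transpositions.append(tuple(t))

# Word norm |sigma| computed as graph distance from the identity by BFS.
dist = {identity: 0}
queue = deque([identity])
while queue:
    p = queue.popleft()
    for t in transpositions:
        q = compose(p, t)
        if q not in dist:
            dist[q] = dist[p] + 1
            queue.append(q)

# Join-cut: |sigma| = n - c(sigma), and a transposition joins or cuts,
# changing the cycle count by exactly 1.
for p in permutations(range(1, n + 1)):
    assert dist[p] == n - cycles(p)
    for t in transpositions:
        assert abs(cycles(compose(p, t)) - cycles(p)) == 1
```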
The word norm makes the symmetric group $\mathrm{S}(n)$ into a metric space.

**Proposition 3:** The function

$$d \colon \mathrm{S}(n) \times \mathrm{S}(n) \longrightarrow \mathbb{Z}_{\geq 0}$$

defined by $d(\sigma, \pi) = |\sigma^{-1}\pi|$ is a metric.

We may thus for example speak of the open ball of radius $r \geq 0$ centered at a given permutation $\pi \in \mathrm{S}(n)$,

$$B(\pi, r) = \{\sigma \in \mathrm{S}(n) \colon d(\pi, \sigma) < r\},$$

and its boundary, the sphere

$$\partial B(\pi, r) = \{\sigma \in \mathrm{S}(n) \colon d(\pi, \sigma) = r\}.$$
The metric $d$ is nothing but the graph theory distance (or geodesic distance) on the Cayley graph $\Gamma$ of the symmetric group $\mathrm{S}(n)$ as generated by the conjugacy class $T$ of transpositions. The vertex set of this graph is $\mathrm{S}(n)$ itself, and two permutations $\sigma, \pi$ are joined by an edge if and only if there is $\tau \in T$ such that $\pi = \sigma\tau$. Note that this is an undirected graph because $\tau^2 = \iota$ is the identity permutation for every $\tau \in T$, so that $\pi = \sigma\tau$ is equivalent to $\sigma = \pi\tau$. Note also that $\Gamma$ is a regular graph of degree $|T| = \binom{n}{2}$. Another feature of $\Gamma$ is that it is a graded graph: it decomposes into a disjoint union of independent sets or “levels,”

$$\mathrm{S}(n) = L_0 \sqcup L_1 \sqcup \dots \sqcup L_{n-1},$$
with $L_r$ being the sphere of radius $r$ centered at the identity permutation $\iota$. By the join-cut relation, $L_r$ may also be described as the set of permutations whose disjoint cycle decomposition contains $n - r$ factors.
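The level sizes $|L_r|$ are thus unsigned Stirling numbers of the first kind, which count permutations by their number of cycles. A short Python tally (my own sketch, assuming the tuple encoding of permutations) confirms this for $n = 4$.

```python
from collections import Counter
from itertools import permutations

n = 4

def cycles(p):
    """Number of disjoint cycles of p, fixed points included."""
    seen, c = set(), 0
    for i in range(1, n + 1):
        if i not in seen:
            c += 1
            while i not in seen:
                seen.add(i)
                i = p[i - 1]
    return c

# Level of a permutation = n - (number of cycles); tally the level sizes.
levels = Counter(n - cycles(p) for p in permutations(range(1, n + 1)))

# |L_0|, ..., |L_3| for S(4): unsigned Stirling numbers of the first kind.
assert dict(levels) == {0: 1, 1: 6, 2: 11, 3: 6}
```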

The Hurwitz enumeration problem has an equivalent formulation in terms of the Cayley graph: it asks for the number of $r$-step walks from $\iota$ to $\sigma$ on $\Gamma$. This description also points to a refinement of the Hurwitz problem. Let us introduce the edge-labeling of $\Gamma$ in which each edge corresponding to the transposition $(i\ j)$, $i < j$, is marked by $j$, the larger of the two numbers it interchanges. We will call this the Biane-Stanley edge labeling of the symmetric group, after Philippe Biane and Richard Stanley, in whose work it appears implicitly.
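The walk formulation lends itself to dynamic programming: stepping the distribution of walk endpoints once per transposition multiplies out the counts. The sketch below (my own Python, not from the lecture) counts $2$-step walks on the Cayley graph of $\mathrm{S}(3)$ and recovers the $3 = 3^{3-2}$ factorizations of a $3$-cycle into $2$ transpositions, an instance of Dénes' formula $n^{n-2}$ for $n$-cycles.

```python
from collections import Counter

n = 3
identity = tuple(range(1, n + 1))

def compose(p, q):
    return tuple(p[q[i] - 1] for i in range(n))

transpositions = []
for i in range(1, n + 1):
    for j in range(i + 1, n + 1):
        t = list(identity)
        t[i - 1], t[j - 1] = j, i
        transpositions.append(tuple(t))

# walks[sigma] = number of r-step walks from the identity to sigma on the
# Cayley graph, i.e. the number of factorizations of sigma into r transpositions.
r = 2
walks = Counter({identity: 1})
for _ in range(r):
    step = Counter()
    for p, ways in walks.items():
        for t in transpositions:
            step[compose(p, t)] += ways
    walks = step

# The 3-cycle (1 2 3), stored as the tuple (2, 3, 1), has exactly 3
# factorizations into 2 transpositions.
assert walks[(2, 3, 1)] == 3
```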

Given an $r$-step walk on $\Gamma$, we define its **signature** to be the corresponding sequence of edge labels. We say that a walk is **strictly monotone** if its signature is a strictly increasing sequence. Thus, an $r$-step strictly monotone walk from $\iota$ to $\sigma$ is the same thing as a strictly monotone factorization of $\sigma$ into transpositions, i.e. a factorization

$$\sigma = (i_1\ j_1)(i_2\ j_2) \cdots (i_r\ j_r), \qquad i_k < j_k,$$

such that

$$j_1 < j_2 < \dots < j_r.$$
**Theorem (unique monotone factorization):** Every permutation $\sigma \in \mathrm{S}(n)$ admits a unique strictly monotone factorization, and the length of this factorization is equal to $|\sigma|$. That is,

$$\#\{\text{strictly monotone walks } \iota \to \sigma \text{ of length } r\} = \delta_{r, |\sigma|}.$$

*Proof:* The proof is by induction on $n$.

In the base case $n = 2$ we have $\mathrm{S}(2) = \{\iota, \tau\}$ with $\tau = (1\ 2)$, so that $|\iota| = 0$ and $|\tau| = 1$. The complete answer to the Hurwitz enumeration problem for $\mathrm{S}(2)$ is thus given by

$$\#\{r\text{-step walks } \iota \to \iota\} = \begin{cases} 1, & r \text{ even} \\ 0, & r \text{ odd} \end{cases}$$

and

$$\#\{r\text{-step walks } \iota \to \tau\} = \begin{cases} 0, & r \text{ even} \\ 1, & r \text{ odd.} \end{cases}$$

This corresponds to the fact that if $|\sigma|$ and $r$ have the same parity, then we have the unique length $r$ factorization

$$\sigma = \underbrace{\tau \tau \cdots \tau}_{r \text{ factors}},$$

but no length $r$ factorization exists if $|\sigma|$ and $r$ have opposite parity. Imposing the strictly monotone constraint whittles this down to

$$\#\{\text{strictly monotone walks } \iota \to \iota \text{ of length } r\} = \delta_{r,0}$$

and

$$\#\{\text{strictly monotone walks } \iota \to \tau \text{ of length } r\} = \delta_{r,1},$$

since every edge of the Cayley graph of $\mathrm{S}(2)$ carries the label $2$, so that a strictly monotone walk can take at most one step.

For the inductive step, there are two cases to consider. Let $n \geq 3$ and let $\sigma \in \mathrm{S}(n)$ be any permutation. If $n$ is a fixed point of $\sigma$, i.e. $\sigma(n) = n$, then $\sigma$ may be regarded as a permutation in $\mathrm{S}(n-1)$, and hence has a unique strictly monotone factorization by the induction hypothesis. If on the other hand $\sigma$ is one of the

$$n! - (n-1)!$$

permutations whose support contains $n$ (meaning $\sigma(n) \neq n$), then $\sigma$ is of the form

$$\sigma = \sigma' (i\ n)$$

for a unique pair $\sigma' \in \mathrm{S}(n-1)$ and $i \in \{1, \dots, n-1\}$, namely $i = \sigma^{-1}(n)$ and $\sigma' = \sigma (i\ n)$. By the induction hypothesis, $\sigma'$ admits a unique strictly monotone factorization, its length being equal to $|\sigma'| = |\sigma| - 1$, and since every transposition moving $n$ carries the label $n$, appending the factor $(i\ n)$ yields the unique strictly monotone factorization of $\sigma$; the result follows.

Q.E.D.
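The inductive proof is effectively an algorithm: repeatedly peel off the factor with the largest label. The following Python sketch (my own implementation and helper names, not from the lecture) computes the strictly monotone factorization of a permutation and checks the theorem's claims over all of $\mathrm{S}(4)$.

```python
from itertools import permutations

def compose(p, q):
    """(p o q)(i) = p(q(i)), permutations stored as tuples."""
    return tuple(p[q[i] - 1] for i in range(len(q)))

def transposition(i, j, n):
    """The transposition (i j) in S(n) as a tuple."""
    t = list(range(1, n + 1))
    t[i - 1], t[j - 1] = j, i
    return tuple(t)

def cycles(p):
    """Number of disjoint cycles of p, fixed points included."""
    seen, c = set(), 0
    for i in range(1, len(p) + 1):
        if i not in seen:
            c += 1
            while i not in seen:
                seen.add(i)
                i = p[i - 1]
    return c

def monotone_factorization(p):
    """Strictly monotone factorization of p, following the inductive proof:
    peel off the factor with the largest label at each step."""
    p, n, factors = list(p), len(p), []
    for k in range(n, 1, -1):
        if p[k - 1] != k:              # k is in the support of p
            i = p.index(k) + 1         # i = p^{-1}(k)
            factors.append((i, k))
            p[i - 1], p[k - 1] = p[k - 1], p[i - 1]   # p <- p * (i k), now fixes k
    return factors[::-1]               # labels now strictly increase

n = 4
for p in permutations(range(1, n + 1)):
    facs = monotone_factorization(p)
    assert len(facs) == n - cycles(p)                            # length = |p|
    assert [j for _, j in facs] == sorted({j for _, j in facs})  # strictly monotone
    prod = tuple(range(1, n + 1))
    for i, j in facs:
        prod = compose(prod, transposition(i, j, n))
    assert prod == p                                             # product recovers p
```

Uniqueness is not checked here, but for small $n$ it can be confirmed by enumerating all strictly increasing label sequences directly.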

Graphically, the unique monotone factorization theorem gives a presentation of the symmetric group as a starlike tree, which we call the Jucys tree after the Lithuanian physicist Algimantas Adolfas Jucys (not to be confused with his father, the Lithuanian physicist Adolfas Jucys), in whose work it appears implicitly. We will embed this construction in the group algebra $\mathbb{C}\mathrm{S}(n)$ in Lecture 3, and start using it in conjunction with linear representations of the group algebra.