We are at the beginning of the theory of harmonic analysis on finite abelian groups, starting with the concrete example of the rank $N$ hypercube group $\mathrm{B}(N)$, which we have chosen to define as the permutation group generated by the disjoint transpositions

$$\tau_1 = (1\ 2),\ \tau_2 = (3\ 4),\ \dots,\ \tau_N = (2N-1\ \ 2N).$$
Typically, analysts think of harmonic analysis on $\mathrm{B}(N)$ (or any other group) as the study of the Hilbert space of functions

$$L^2(\mathrm{B}(N)) = \{ f \colon \mathrm{B}(N) \to \mathbb{C} \},$$
equipped with the scalar product

$$\langle f_1, f_2 \rangle = \sum_{\pi \in \mathrm{B}(N)} \overline{f_1(\pi)}\, f_2(\pi),$$
so that one has a canonical orthonormal basis of $L^2(\mathrm{B}(N))$ consisting of the indicator functions $\delta_\pi$, $\pi \in \mathrm{B}(N)$. This space has the structure of an algebra when equipped with the convolution product of functions,

$$(f_1 * f_2)(\pi) = \sum_{\sigma \in \mathrm{B}(N)} f_1(\sigma)\, f_2(\sigma^{-1}\pi),$$
which is exactly what one gets from bilinear extension of the group law in $\mathrm{B}(N)$ lifted to the group element basis:

$$\delta_\sigma * \delta_\rho = \delta_{\sigma\rho}.$$
In particular, the convolution product on $L^2(\mathrm{B}(N))$ is quite different from the pointwise product. This is the right point of view to take when one is dealing with non-discrete groups (e.g. Lie groups), but for finite groups there is another way of writing things out which is often simpler and cleaner, and favored by algebraists.
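As a concrete sanity check, here is a minimal Python sketch of the convolution product. It models $\mathrm{B}(3)$ as bit tuples under coordinatewise XOR, an isomorphic stand-in (chosen here for illustration) for the permutation realization in the text, and verifies that indicator functions convolve like group elements while their pointwise product behaves quite differently.

```python
from itertools import product

# Model the rank-3 hypercube group B(3) as bit tuples under coordinatewise
# XOR -- an isomorphic copy of the permutation realization in the text.
N = 3
G = list(product((0, 1), repeat=N))

def mul(s, r):
    """Group law: coordinatewise XOR; every element is an involution."""
    return tuple(x ^ y for x, y in zip(s, r))

def delta(s):
    """Indicator function of the group element s."""
    return {pi: 1 if pi == s else 0 for pi in G}

def convolve(f1, f2):
    """(f1 * f2)(pi) = sum over sigma of f1(sigma) f2(sigma^{-1} pi);
    here sigma^{-1} = sigma since every element is an involution."""
    return {pi: sum(f1[s] * f2[mul(s, pi)] for s in G) for pi in G}

s, r = (1, 0, 0), (0, 1, 0)

# Bilinear extension of the group law: delta_s * delta_r = delta_{sr} ...
assert convolve(delta(s), delta(r)) == delta(mul(s, r))

# ... while the pointwise product of these two distinct indicators vanishes.
pointwise = {pi: delta(s)[pi] * delta(r)[pi] for pi in G}
assert all(v == 0 for v in pointwise.values())
```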

The algebraists' preferred Hilbert space is the group algebra $\mathbb{C}\mathrm{B}(N)$ of formal $\mathbb{C}$-linear combinations of permutations,

$$a = \sum_{\pi \in \mathrm{B}(N)} a(\pi)\, \pi, \qquad a(\pi) \in \mathbb{C}.$$
Such linear combinations are of course equivalent to complex-valued functions on $\mathrm{B}(N)$: you can think of the linear combination $a$ as the generating function of the function $\pi \mapsto a(\pi)$. From this perspective the canonical orthonormal basis of $\mathbb{C}\mathrm{B}(N)$ consists of the permutations themselves (rather than their indicator functions), which induces the equivalent scalar product,

$$\langle a, b \rangle = \sum_{\pi \in \mathrm{B}(N)} \overline{a(\pi)}\, b(\pi).$$
The convolution product on $L^2(\mathrm{B}(N))$ is likewise replaced with the equivalent multiplication in $\mathbb{C}\mathrm{B}(N)$ (written simply as juxtaposition),

$$ab = \sum_{\pi \in \mathrm{B}(N)} \Bigg( \sum_{\substack{\sigma, \rho \in \mathrm{B}(N) \\ \sigma\rho = \pi}} a(\sigma)\, b(\rho) \Bigg)\, \pi.$$
In short, $L^2(\mathrm{B}(N))$ and $\mathbb{C}\mathrm{B}(N)$ are isomorphic both as Hilbert spaces and as algebras, so the choice to use one rather than the other is a matter only of preference. In practice, it is quite useful to be comfortable with both the analytic and the algebraic way of doing things.

Yet another way of repackaging everything is the following, probably too-general, perspective. Let us zoom back out to the very beginning, where we started with an arbitrary finite graph $\mathrm{G}$. The algebraic way to understand this combinatorial object is to replace the vertex set $V$ of $\mathrm{G}$ with the algebra $\mathbb{C}\langle V \rangle$ of polynomials in noncommuting variables indexed by the vertices of $\mathrm{G}$. In other words, this is the free algebra of rank $|V|$. This algebra is graded by degree, and we have a decomposition

$$\mathbb{C}\langle V \rangle = \bigoplus_{d=0}^{\infty} \mathbb{C}\langle V \rangle_d,$$
with the $d$th summand $\mathbb{C}\langle V \rangle_d$ being the space of homogeneous polynomials of degree $d$. Observe that this vector space is isomorphic to the space of complex-valued functions on $V^d$, i.e. the space of functions on $d$-tuples of vertices, or equivalently formal linear combinations of $d$-tuples of vertices. In particular, the linear component $\mathbb{C}\langle V \rangle_1$ is isomorphic to either/both $L^2(V)$ and $\mathbb{C}V$, and the adjacency operator acts on this linear component to capture the edge structure of $\mathrm{G}$. Indeed, the free algebra $\mathbb{C}\langle V \rangle$ is isomorphic to the tensor algebra of the vector space of functions on the vertex set $V$. In the case where the vertex set of $\mathrm{G}$ is actually a group $G$, we can mod out by the group law and consider the quotient

$$\mathbb{C}G = \mathbb{C}\langle V \rangle \,\big/\, I$$
of the free algebra by the ideal $I$ generated by the relations $\sigma \cdot \rho - \sigma\rho$, $\sigma, \rho \in G$, corresponding to the group multiplication. This quotient consists solely of polynomials of degree one, so functions on the group, or equivalently linear combinations of group elements, but now with the structure of an algebra.

Coming back to reality, the fact that $\mathbb{C}\mathrm{B}(N)$ is an algebra and not just a vector space is reflected in the presence of a certain class of operators on the group algebra which are not present in the endomorphism algebra of a vector space devoid of multiplication.

**Definition 1:** Given $a \in \mathbb{C}\mathrm{B}(N)$, the corresponding **multiplication operator** $R_a \in \mathrm{End}\, \mathbb{C}\mathrm{B}(N)$ is defined by

$$R_a b = ba, \qquad b \in \mathbb{C}\mathrm{B}(N).$$
We are using the letter “R” for multiplication operators because later on we will consider the group algebras of noncommutative groups, in which left and right multiplication do not necessarily coincide, so that we typically have different left and right multiplication operators $L_a$ and $R_a$. To put this the Math 202B way, the operator $R_a$ is the image of $a$ in the right regular representation of $\mathbb{C}\mathrm{B}(N)$.

Note that multiplication operators $R_\pi$ corresponding to pure group elements $\pi \in \mathrm{B}(N)$ (as opposed to linear combinations of group elements) are unitary operators on $\mathbb{C}\mathrm{B}(N)$, since they permute the orthonormal basis of permutations.

However, since we are working with the permutation group $\mathrm{B}(N)$, in which all permutations are actually involutions, $R_\pi$ is both unitary and selfadjoint, hence has eigenvalues in $\{+1, -1\}$.
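This can be checked numerically. The sketch below (again using the illustrative XOR bit-tuple model of $\mathrm{B}(3)$, an assumption not from the text) builds the matrix of $R_\pi$ in the group element basis and confirms it is orthogonal, selfadjoint, and has spectrum in $\{+1, -1\}$.

```python
import numpy as np
from itertools import product

# XOR model of B(3), as in the earlier sketch.
N = 3
G = list(product((0, 1), repeat=N))
idx = {g: i for i, g in enumerate(G)}

def mul(s, r):
    return tuple(x ^ y for x, y in zip(s, r))

# Matrix of the multiplication operator R_pi in the group element basis:
# R_pi sends the basis vector g to the basis vector g * pi.
pi = (1, 1, 0)  # an arbitrary nonidentity element
R = np.zeros((2**N, 2**N))
for g in G:
    R[idx[mul(g, pi)], idx[g]] = 1

# R_pi permutes the orthonormal basis, hence is unitary (real orthogonal) ...
assert np.allclose(R @ R.T, np.eye(2**N))
# ... and pi^2 = id forces R_pi^2 = I, so R_pi is also selfadjoint ...
assert np.allclose(R, R.T)
# ... hence its eigenvalues all lie in {+1, -1}.
assert np.allclose(np.abs(np.linalg.eigvalsh(R)), 1)
```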

The importance of multiplication operators in our quest to understand the eigenvalues and eigenvectors of the Cayley graph of $\mathrm{B}(N)$, as generated by the transpositions $\tau_1, \dots, \tau_N$, is the following.

**Proposition 1:** The adjacency operator $A$ of the Cayley graph of $\mathrm{B}(N)$ is the multiplication operator $R_t$, where

$$t = \tau_1 + \tau_2 + \dots + \tau_N.$$
*Proof:* For any $\pi \in \mathrm{B}(N)$ we have

$$R_t\, \pi = \pi t = \pi\tau_1 + \pi\tau_2 + \dots + \pi\tau_N = A\pi,$$

since the neighbors of $\pi$ in the Cayley graph are exactly the permutations $\pi\tau_1, \dots, \pi\tau_N$.
— Q.E.D.
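Proposition 1 admits a quick numerical sanity check. In the illustrative XOR model of $\mathrm{B}(3)$ (with the generators taken to be the standard basis bit vectors, an assumption for this sketch), the matrix of $R_t$ in the group element basis coincides with the adjacency matrix of the Cayley graph.

```python
import numpy as np
from itertools import product

# XOR model of B(3) with generators the standard basis bit vectors.
N = 3
G = list(product((0, 1), repeat=N))
idx = {g: i for i, g in enumerate(G)}

def mul(s, r):
    return tuple(x ^ y for x, y in zip(s, r))

taus = [tuple(1 if j == i else 0 for j in range(N)) for i in range(N)]

# Matrix of R_t, t = tau_1 + ... + tau_N, acting on the group element basis.
R_t = np.zeros((2**N, 2**N))
for g in G:
    for tau in taus:
        R_t[idx[mul(g, tau)], idx[g]] += 1

# Adjacency matrix of the Cayley graph: pi is adjacent to each pi tau_i.
A = np.zeros((2**N, 2**N))
for g in G:
    for tau in taus:
        A[idx[g], idx[mul(g, tau)]] = 1

assert np.array_equal(R_t, A)  # Proposition 1
```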

In view of Proposition 1, the problem of understanding the eigenvalues and eigenvectors of the Cayley graph is that of understanding the eigenvalues and eigenvectors of $R_t$.

To solve it, we will work out the spectral analysis of arbitrary multiplication operators on $\mathbb{C}\mathrm{B}(N)$. The main tool in this endeavor is a second basis of the group algebra, which consists of special linear combinations $\chi = \sum_{\pi \in \mathrm{B}(N)} \chi(\pi)\, \pi$ of group elements which have the property that

$$\chi(\sigma\rho) = \chi(\sigma)\, \chi(\rho), \qquad \sigma, \rho \in \mathrm{B}(N).$$
That is, $\chi$ is the generating function of a group homomorphism

$$\chi \colon \mathrm{B}(N) \longrightarrow \mathbb{C}^\times,$$
the multiplicative group of nonzero complex numbers. I like this way of stating things; in the analytic formalism, the words “character” and “homomorphism” mean exactly the same thing, a redundancy, whereas in the algebraic formalism “character” really means “generating function of a homomorphism.”

In Lecture 7, we explicitly worked out all homomorphisms from $\mathrm{B}(N)$ to $\mathbb{C}^\times$, showing that they take values in the subgroup generated by $-1$ and are moreover parameterized by the points of $\mathrm{B}(N)$, giving a family of characters $\chi_\pi$, $\pi \in \mathrm{B}(N)$. We also showed that the characters are orthogonal,

$$\langle \chi_\pi, \chi_\sigma \rangle = 2^N\, \delta_{\pi\sigma}, \qquad \pi, \sigma \in \mathrm{B}(N).$$
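In the XOR bit-tuple model, the characters have the explicit form $\chi_\pi(\sigma) = (-1)^{\pi \cdot \sigma}$ (an assumption of this sketch, consistent with the parameterization above but not spelled out in the text), and their orthogonality is easy to verify:

```python
import numpy as np
from itertools import product

N = 3
G = list(product((0, 1), repeat=N))

def chi(pi):
    """Character parameterized by pi in the XOR model:
    chi_pi(sigma) = (-1)^(pi . sigma), returned as its vector of values."""
    return np.array([(-1) ** sum(p * s for p, s in zip(pi, sigma)) for sigma in G])

# Orthogonality: <chi_pi, chi_sigma> = 2^N if pi == sigma, and 0 otherwise.
for pi in G:
    for sigma in G:
        assert chi(pi) @ chi(sigma) == (2**N if pi == sigma else 0)
```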
In particular, for any $a \in \mathbb{C}\mathrm{B}(N)$ we have the **character expansion**

$$a = \frac{1}{2^N} \sum_{\pi \in \mathrm{B}(N)} \langle \chi_\pi, a \rangle\, \chi_\pi.$$
The character expansion of $a$ is also known as its Fourier expansion, with the numbers $\hat{a}(\pi) = \langle \chi_\pi, a \rangle$ being the **Fourier coefficients** of $a$. The main reason that character expansions are “better” than expansions in the group basis is that the Fourier coefficients of a product $ab$ are just the products of the corresponding Fourier coefficients of $a$ and $b$.
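The character expansion itself can be tested numerically. Using the same illustrative XOR model and explicit characters as before, the sketch below computes the Fourier coefficients of a random element of the group algebra and confirms that the character expansion reconstructs it.

```python
import numpy as np
from itertools import product

N = 3
G = list(product((0, 1), repeat=N))

def chi(pi):
    """chi_pi(sigma) = (-1)^(pi . sigma) in the XOR model."""
    return np.array([(-1) ** sum(p * s for p, s in zip(pi, sigma)) for sigma in G])

# A random element a of the group algebra, stored as its coefficient vector.
rng = np.random.default_rng(0)
a = rng.standard_normal(2**N)

# Fourier coefficients a_hat(pi) = <chi_pi, a>, then the character expansion.
a_hat = {pi: chi(pi) @ a for pi in G}
reconstruction = sum(a_hat[pi] * chi(pi) for pi in G) / 2**N
assert np.allclose(reconstruction, a)
```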

The underlying reason is that the characters themselves — which are generating functions of homomorphisms — multiply in an extremely simple way.

**Theorem 1:** For any $\pi, \sigma \in \mathrm{B}(N)$ we have

$$\chi_\pi \chi_\sigma = 2^N \chi_\pi$$

if $\pi = \sigma$, and

$$\chi_\pi \chi_\sigma = 0$$

if $\pi \neq \sigma$.
*Proof:* Expanding in the permutation basis, we have

$$\chi_\pi \chi_\sigma = \sum_{\alpha, \beta \in \mathrm{B}(N)} \chi_\pi(\alpha)\, \chi_\sigma(\beta)\, \alpha\beta = \sum_{\gamma \in \mathrm{B}(N)} \Bigg( \sum_{\alpha \in \mathrm{B}(N)} \chi_\pi(\alpha)\, \chi_\sigma(\alpha\gamma) \Bigg)\, \gamma = \sum_{\gamma \in \mathrm{B}(N)} \Bigg( \sum_{\alpha \in \mathrm{B}(N)} \chi_\pi(\alpha)\, \chi_\sigma(\alpha) \Bigg)\, \chi_\sigma(\gamma)\, \gamma = \langle \chi_\pi, \chi_\sigma \rangle\, \chi_\sigma,$$

where we used that $\chi_\sigma$ is a homomorphism and $\alpha^{-1} = \alpha$ for all $\alpha \in \mathrm{B}(N)$. The result now follows from character orthogonality, which is Theorem 2 in Lecture 7.

— Q.E.D.

An immediate consequence is that the Fourier coefficients of a product are the products of the Fourier coefficients of the factors.

**Corollary 1:** For any $a, b \in \mathbb{C}\mathrm{B}(N)$ we have

$$\widehat{ab}(\pi) = \hat{a}(\pi)\, \hat{b}(\pi), \qquad \pi \in \mathrm{B}(N).$$
*Proof:* On one hand, the character expansion of the product is

$$ab = \frac{1}{2^N} \sum_{\pi \in \mathrm{B}(N)} \widehat{ab}(\pi)\, \chi_\pi.$$

On the other hand, we compute the product of $a$ and $b$ via their character expansions:

$$ab = \frac{1}{2^{2N}} \sum_{\pi, \sigma \in \mathrm{B}(N)} \hat{a}(\pi)\, \hat{b}(\sigma)\, \chi_\pi \chi_\sigma = \frac{1}{2^N} \sum_{\pi \in \mathrm{B}(N)} \hat{a}(\pi)\, \hat{b}(\pi)\, \chi_\pi,$$
where we applied Theorem 1 to get the final equality. Since the characters are linearly independent, the result follows.

— Q.E.D.
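Corollary 1 can also be confirmed numerically. The sketch below (in the same illustrative XOR model, with the assumed explicit characters) multiplies two random elements of the group algebra and checks that each Fourier coefficient of the product factors.

```python
import numpy as np
from itertools import product

N = 3
G = list(product((0, 1), repeat=N))
idx = {g: i for i, g in enumerate(G)}

def mul(s, r):
    return tuple(x ^ y for x, y in zip(s, r))

def chi(pi):
    """chi_pi(sigma) = (-1)^(pi . sigma) in the XOR model."""
    return np.array([(-1) ** sum(p * s for p, s in zip(pi, sigma)) for sigma in G])

def algebra_mul(a, b):
    """Group algebra product: (ab)(pi) = sum over sigma rho = pi of a(sigma) b(rho)."""
    out = np.zeros(2**N)
    for s in G:
        for r in G:
            out[idx[mul(s, r)]] += a[idx[s]] * b[idx[r]]
    return out

rng = np.random.default_rng(1)
a = rng.standard_normal(2**N)
b = rng.standard_normal(2**N)
ab = algebra_mul(a, b)

# Corollary 1: the Fourier coefficient of ab at pi is a_hat(pi) * b_hat(pi).
for pi in G:
    assert np.isclose(chi(pi) @ ab, (chi(pi) @ a) * (chi(pi) @ b))
```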

In Lecture 9, we will apply the above to prove that an operator on $\mathbb{C}\mathrm{B}(N)$ is a multiplication operator if and only if it acts diagonally in the character basis. We will also find the corresponding eigenvalues, and hence count walks on the Cayley graph of $\mathrm{B}(N)$. In Lecture 10 we will then analyze convergence of random walk on $\mathrm{B}(N)$ to equilibrium.
