Since this is a course in applied algebra, let us begin with a real-world problem. Consider a deck of $n$ playing cards numbered $1, \dots, n$ arranged in a single row on a table. At each instant of discrete time, choose two cards independently at random and swap them. How long until you can be sure that the order of the cards is uniformly random?

As with every applied problem, in order to get started we have to think about how best to formulate the above as a precise mathematical question. This is what it means to “model” a real world problem.

First, each of the $n!$ possible configurations of cards can be uniquely encoded as a permutation of the numbers $1, \dots, n$ — just write down the labels of the cards one at a time, from left to right, and you obtain a permutation written in one-line notation. This is a good first step: we have identified the state space of the system we wish to understand, and it is $\mathrm{S}(n)$, the set of permutations of $1, \dots, n$.

Next, we need to make precise the way in which our system evolves in time, i.e. transitions from one state to another at each time instant. If the configuration of cards at time $t$ is encoded by the permutation $\sigma$, what are the possible states that our system can be in at time $t+1$? An obvious but in fact crucially important fact is that the answer to this question is not just combinatorial, but also algebraic: our configuration space is not just a set, it is a group, the symmetric group $\mathrm{S}(n)$ of rank $n$. This means that the evolution equation we seek can be represented as multiplication in this group: given that the configuration of cards at time $t$ is the permutation $\sigma$, the possible states at time $t+1$ are precisely the permutations

$$\sigma\,(i\ j), \qquad 1 \leq i < j \leq n,$$

where $(i\ j)$ denotes the transposition in $\mathrm{S}(n)$ which interchanges the numbers $i$ and $j$. This is a good time to fix our notational conventions. Permutations in $\mathrm{S}(n)$ are bijective functions $\sigma \colon \{1, \dots, n\} \to \{1, \dots, n\}$, and multiplication of permutations means composition of functions. The concatenation $\sigma\tau$ of two permutations means the following:

$$(\sigma\tau)(x) = \tau(\sigma(x)), \qquad x \in \{1, \dots, n\}.$$
Although this looks messy, the reason for taking this convention rather than the seemingly more natural one is that we are more interested in doing computations “upstairs” where permutations are elements in a group, as opposed to “downstairs” where they are functions acting on points, and it is nice to multiply strings of elements from left to right.
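Here is a minimal sketch of the two conventions in Python, with permutations of $\{0, \dots, n-1\}$ (zero-indexed for programming convenience) stored as tuples in one-line notation; the names `compose` and `concat` are ad hoc, chosen just for this illustration.

```python
# Permutations as tuples in one-line notation: p[x] is the image of the point x.

def compose(s, t):
    """Ordinary function composition s o t: apply t first, then s."""
    return tuple(s[t[x]] for x in range(len(t)))

def concat(s, t):
    """Concatenation st: apply s first, then t, reading left to right."""
    return tuple(t[s[x]] for x in range(len(s)))

s = (1, 0, 2)   # the transposition interchanging 0 and 1
t = (0, 2, 1)   # the transposition interchanging 1 and 2

# The two conventions are opposite to one another:
assert concat(s, t) == compose(t, s)
assert concat(s, t) == (2, 0, 1)   # 0 -> 1 -> 2,  1 -> 0 -> 0,  2 -> 2 -> 1
```

The second assertion traces the left-to-right reading: the point $0$ is first sent to $1$ by `s`, then to $2$ by `t`.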

It is useful to take the above one step further, and think of the symmetric group as not just an algebraic object, but a geometric one, in the manner of geometric group theory. To do this, first solve the following problem.

**Problem 1:** The set $\mathcal{T}$ of all transpositions generates $\mathrm{S}(n)$. That is, for any permutation $\sigma \in \mathrm{S}(n)$, there exists a nonnegative integer $r$ and transpositions $\tau_1, \dots, \tau_r \in \mathcal{T}$ such that

$$\sigma = \tau_1 \tau_2 \cdots \tau_r.$$
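A constructive solution to Problem 1 can be sketched in code: selection sort undoes a permutation using at most $n-1$ swaps, and recording those swaps yields a factorization into transpositions. The sketch below is zero-indexed and the helper names are ad hoc.

```python
from functools import reduce

def concat(s, t):
    """Left-to-right product st: apply s first, then t."""
    return tuple(t[s[x]] for x in range(len(s)))

def transposition(n, i, j):
    """The transposition of {0, ..., n-1} interchanging i and j."""
    t = list(range(n))
    t[i], t[j] = j, i
    return tuple(t)

def factor_into_transpositions(p):
    """Factor p as a left-to-right product of at most n-1 transpositions,
    by selection-sorting the one-line notation of p back to the identity."""
    n = len(p)
    a, taus = list(p), []
    for i in range(n):
        if a[i] != i:
            j = a.index(i)              # position currently holding the value i
            a[i], a[j] = a[j], a[i]     # one swap = one transposition
            taus.append(transposition(n, i, j))
    return taus

p = (2, 0, 3, 1)
taus = factor_into_transpositions(p)
identity = tuple(range(4))
assert reduce(concat, taus, identity) == p
assert len(taus) <= 3
```

In fact the number of swaps selection sort uses turns out to be the minimum possible, which foreshadows the word norm defined next.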

With the solution to Problem 1 under our belts, we can define the word norm on $\mathrm{S}(n)$ corresponding to the generating set $\mathcal{T}$ by

$$\|\sigma\| := \min\{ r \geq 0 \colon \sigma = \tau_1 \cdots \tau_r,\ \tau_1, \dots, \tau_r \in \mathcal{T} \}.$$

**Problem 2:** Prove that $\|\sigma^{-1}\| = \|\sigma\|$ and $\|\sigma\pi\| \leq \|\sigma\| + \|\pi\|$ for all $\sigma, \pi \in \mathrm{S}(n)$.

Once we have a norm on $\mathrm{S}(n)$, meaning a way to measure the “size” of a permutation, it is very natural to define a corresponding notion of distance between two permutations $\sigma, \pi \in \mathrm{S}(n)$ by

$$d(\sigma, \pi) := \|\sigma^{-1}\pi\|.$$

**Problem 3:** Check that the pair $(\mathrm{S}(n), d)$ satisfies the metric space axioms.
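Problems 2 and 3 can be sanity-checked by brute force on a small symmetric group. The sketch below (zero-indexed, ad hoc names) computes the word norm on $\mathrm{S}(4)$ by breadth-first search from the identity and verifies the metric axioms exhaustively; a check, of course, not a proof.

```python
from itertools import combinations, permutations

n = 4
identity = tuple(range(n))

def concat(s, t):
    return tuple(t[s[x]] for x in range(n))

def inverse(s):
    inv = [0] * n
    for x in range(n):
        inv[s[x]] = x
    return tuple(inv)

# The generating set: all transpositions of {0, 1, 2, 3}.
taus = []
for i, j in combinations(range(n), 2):
    t = list(range(n))
    t[i], t[j] = j, i
    taus.append(tuple(t))

# Breadth-first search from the identity computes the word norm exactly.
norm = {identity: 0}
frontier = [identity]
while frontier:
    new = []
    for s in frontier:
        for t in taus:
            st = concat(s, t)
            if st not in norm:
                norm[st] = norm[s] + 1
                new.append(st)
    frontier = new

def d(s, p):
    return norm[concat(inverse(s), p)]

perms = list(permutations(range(n)))
assert all(norm[inverse(s)] == norm[s] for s in perms)                # Problem 2
assert all((d(s, p) == 0) == (s == p) for s in perms for p in perms)  # positivity
assert all(d(s, p) == d(p, s) for s in perms for p in perms)          # symmetry
assert all(d(s, q) <= d(s, p) + d(p, q)
           for s in perms for p in perms for q in perms)              # triangle inequality
```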

To summarize, the state space of the system we wish to understand is not just a set, but a group, and further not just a group, but a metric space. It is nice to have a pictorial representation of the metric space $(\mathrm{S}(n), d)$, and this is afforded by the corresponding Cayley graph $\Gamma = \Gamma(\mathrm{S}(n), \mathcal{T})$. The vertex set of $\Gamma$ is $\mathrm{S}(n)$, and two vertices $\sigma, \pi$ are adjacent if and only if there exists $\tau \in \mathcal{T}$ such that $\pi = \sigma\tau$. Below is a picture of the Cayley graph of a small symmetric group as generated by the set of transpositions.

Let us note a few basic features of this graph. First, it is $\binom{n}{2}$-regular, because $|\mathcal{T}| = \binom{n}{2}$. Second, it is a graded graph, meaning that its vertex set decomposes as a disjoint union of independent sets,

$$\mathrm{S}(n) = \mathrm{S}(n)_0 \sqcup \mathrm{S}(n)_1 \sqcup \cdots \sqcup \mathrm{S}(n)_{n-1},$$

where $\mathrm{S}(n)_k$ is the set of permutations whose disjoint cycle decomposition consists of $n-k$ cycles. Third, from the geometric perspective, the $k$th level $\mathrm{S}(n)_k$ of the Cayley graph is the sphere of radius $k$ in $(\mathrm{S}(n), d)$ centered at the identity permutation,

$$\mathrm{S}(n)_k = \{ \sigma \in \mathrm{S}(n) \colon d(\mathrm{id}, \sigma) = k \}.$$

**Problem 4:** Prove that $\|\sigma\| = n - c(\sigma)$, where $c(\sigma)$ denotes the number of cycles in the disjoint cycle decomposition of $\sigma$. This justifies the sphere description of the levels above.
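Problem 4 can likewise be verified by brute force for $n = 4$: compare the breadth-first-search word norm with $n$ minus the cycle count, and count the levels, which come out as unsigned Stirling numbers of the first kind. Again a check under zero-indexed conventions, not a proof.

```python
from itertools import combinations, permutations

n = 4
identity = tuple(range(n))

def concat(s, t):
    return tuple(t[s[x]] for x in range(n))

taus = []
for i, j in combinations(range(n), 2):
    t = list(range(n))
    t[i], t[j] = j, i
    taus.append(tuple(t))

# Word norm via breadth-first search from the identity.
norm = {identity: 0}
frontier = [identity]
while frontier:
    new = []
    for s in frontier:
        for t in taus:
            st = concat(s, t)
            if st not in norm:
                norm[st] = norm[s] + 1
                new.append(st)
    frontier = new

def cycle_count(p):
    """Number of cycles in the disjoint cycle decomposition of p."""
    seen, c = set(), 0
    for x in range(n):
        if x not in seen:
            c += 1
            while x not in seen:
                seen.add(x)
                x = p[x]
    return c

assert all(norm[p] == n - cycle_count(p) for p in permutations(range(n)))
# Level sizes |S(4)_0|, ..., |S(4)_3| are the unsigned Stirling numbers 1, 6, 11, 6.
levels = [sum(1 for p in permutations(range(n)) if norm[p] == k) for k in range(n)]
assert levels == [1, 6, 11, 6]
```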

At this point we have quite a clear and detailed understanding of the system we want to analyze: the state space of the system is the Cayley graph of $\mathrm{S}(n)$ as generated by the set of transpositions, and the evolution occurring is a discrete-time random walk on this graph. Now we have to make precise what exactly we mean by “random walk” in this sentence.

The most natural thing to consider is simple random walk: at time zero, we have a particle situated at the identity permutation on $\Gamma$, and at each instant of discrete time the particle jumps to a neighboring vertex, with equal probability of selecting any neighbor. We would like to understand how many jumps must be made before the probability of observing the particle at any vertex of the graph is the same as at any other vertex. This is what it means for a deck of cards to be shuffled: every configuration is equally likely, so that a person choosing a card from the deck cannot do anything more than guess what this card is going to be. Now, there is an immediate problem: this will never happen. The reason is annoying but unavoidable: every permutation is either even or odd, meaning that for any $\sigma \in \mathrm{S}(n)$, all walks from the identity to $\sigma$ have either an even or an odd number of steps. So, at even times the particle must be situated in the alternating group $\mathrm{A}(n)$, and at odd times in its complement $\mathrm{S}(n) \setminus \mathrm{A}(n)$. But actually, this foible reminds us that we are trying to solve a real world problem in which two cards are selected independently and then swapped — it is possible that the two cards selected are actually the same, and no swap occurs.

**Problem 5:** Prove that the same-card, no-swap event occurs with probability $\frac{1}{n}$.
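This is a one-line computation (among the $n^2$ equally likely ordered pairs of card choices, exactly $n$ are equal pairs), but it can also be checked exactly with `fractions.Fraction`:

```python
from fractions import Fraction

def no_swap_probability(n):
    """Exact probability that two cards chosen independently and uniformly
    at random from a deck of n cards are the same card."""
    same_pairs = sum(1 for i in range(n) for j in range(n) if i == j)
    return Fraction(same_pairs, n * n)

assert all(no_swap_probability(n) == Fraction(1, n) for n in range(1, 13))
```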

Thus, what we really want to understand is the *lazy* random walk on $\Gamma$, where at each instant the particle either stays put, with the above probability $\frac{1}{n}$, or jumps to an adjacent vertex, with equal probability $\frac{2}{n^2}$ of jumping in any direction. Analyzing this model, Persi Diaconis and Mehrdad Shahshahani were able to prove the following landmark result in applied algebra.

**Theorem (Diaconis-Shahshahani):** A deck of $n$ cards is fully shuffled after $\frac{1}{2} n \log n$ transpositions.
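The proof is beyond us for now, but the mixing phenomenon itself can be observed exactly for small $n$ by iterating the walk's transition matrix. The sketch below (ad hoc names; the $\frac{2}{n^2}$ jump probabilities come from the two-independent-cards model above) tracks the total variation distance between the distribution of the lazy walk on $\mathrm{S}(4)$ and the uniform distribution:

```python
from itertools import combinations, permutations

n = 4
perms = list(permutations(range(n)))
index = {p: k for k, p in enumerate(perms)}

def concat(s, t):
    return tuple(t[s[x]] for x in range(n))

taus = []
for i, j in combinations(range(n), 2):
    t = list(range(n))
    t[i], t[j] = j, i
    taus.append(tuple(t))

def step(dist):
    """One lazy step: stay put with probability 1/n, otherwise multiply by a
    transposition; each transposition has probability 2/n^2 (two ordered
    card choices produce the same swap)."""
    out = [0.0] * len(perms)
    for k, mass in enumerate(dist):
        out[k] += mass / n
        for t in taus:
            out[index[concat(perms[k], t)]] += mass * 2 / n**2
    return out

dist = [0.0] * len(perms)
dist[index[tuple(range(n))]] = 1.0        # start at the identity permutation
uniform = 1.0 / len(perms)

tv = []
for _ in range(20):
    dist = step(dist)
    tv.append(0.5 * sum(abs(q - uniform) for q in dist))

assert all(a >= b for a, b in zip(tv, tv[1:]))   # distance is non-increasing
assert tv[-1] < 1e-3                             # essentially shuffled
```

For $n = 4$ the threshold $\frac{1}{2} n \log n \approx 2.8$ is too small to exhibit a sharp cutoff, but the exponential decay of the distance to uniformity is already visible.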

Although the Diaconis-Shahshahani result is ostensibly a theorem in probability theory, one may legitimately refer to it as a theorem of applied algebra: its proof uses everything you learned about the representation theory of the symmetric groups in Math 202B. Moreover, as far as the author knows, there is no way to obtain this result *without* using representation theory, i.e. by purely probabilistic reasoning.

In this course, we will use simple stochastic processes like card shuffling as motivating questions whose solution leads to the development of harmonic analysis on finite groups. This is a fascinating application of algebra, and taking this route leads not just through the character theory of finite groups, but all the way to Gelfand pairs, spherical functions, and other related algebraic constructions. Even though you already know the representation theory of $\mathrm{S}(n)$, so that in principle we could start with the Diaconis-Shahshahani theorem, we will instead begin with the analysis of random walks on much simpler groups, such as the discrete circle $\mathbb{Z}/n\mathbb{Z}$ and the hypercube $(\mathbb{Z}/2\mathbb{Z})^n$. This motivates the development of character theory for finite abelian groups, where full-blown representation theory isn't particularly meaningful, and sets the stage nicely for the harmonic analysis interpretation of the representation theory of general finite groups.
