In this lecture I wanted to connect our (admittedly meandering) discussion so far to the standard approach to limit theorems in random matrix theory. Let $X_N$ be an $N \times N$ random Hermitian matrix with characteristic function

$$\mathbb{E}\, e^{\mathrm{i} \operatorname{Tr}(AX_N)} = e^{-\frac{1}{2} \operatorname{Tr} A^2}, \qquad A \in \mathrm{H}(N).$$

This means precisely that the distribution of $X_N$ in the space $\mathrm{H}(N)$ of $N \times N$ Hermitian matrices equipped with the Hilbert-Schmidt scalar product $\langle A, B \rangle = \operatorname{Tr}(AB)$ is the standard Gaussian measure on this Euclidean space.
Definition 14.1. The Gaussian Unitary Ensemble is the sequence

$$\mathrm{GUE} = (X_N)_{N=1}^{\infty}.$$

The word “unitary” here means that the distribution of $X_N$ is unitarily invariant. Let $B_{N1} \geq \dots \geq B_{NN}$ be the eigenvalues of $X_N$.
Theorem 14.1 (Wigner). For each $k \in \mathbb{N}$, the limit

$$m_k = \lim_{N \to \infty} \frac{1}{N}\, \mathbb{E}\, p_k\!\left(\frac{B_{N1}}{\sqrt{N}}, \dots, \frac{B_{NN}}{\sqrt{N}}\right)$$

exists and is given by

$$m_k = \begin{cases} \mathrm{Cat}_{k/2}, & k \text{ even} \\ 0, & k \text{ odd}. \end{cases}$$

In Theorem 14.1, $p_k$ is the power sum symmetric polynomial of degree $k$, and

$$\mathrm{Cat}_{k/2} = \frac{1}{\frac{k}{2}+1} \binom{k}{k/2}$$

is the $\frac{k}{2}$th Catalan number. There are two things we should discuss: what the result means, and how to prove it.
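Theorem 14.1 is easy to probe numerically. The following pure-Python sketch (my illustration, not part of the lecture; the matrix size, trial count, and the normalization $\mathbb{E}|X_N(i,j)|^2 = 1$ for off-diagonal entries are choices made here) samples small GUE matrices and averages the scaled trace moments, using the fact, discussed below, that $p_k$ of the eigenvalues equals $\operatorname{Tr} X_N^k$; the averages should be close to $0, 1, 0, 2$ for $k = 1, 2, 3, 4$.

```python
import random
import math

def gue_sample(n, rng):
    """Sample an n x n GUE matrix: real standard Gaussians on the diagonal,
    standard complex Gaussians (E|X(i,j)|^2 = 1) above it, conjugates below."""
    x = [[0j] * n for _ in range(n)]
    for i in range(n):
        x[i][i] = complex(rng.gauss(0, 1))
        for j in range(i + 1, n):
            z = complex(rng.gauss(0, 1), rng.gauss(0, 1)) / math.sqrt(2)
            x[i][j] = z
            x[j][i] = z.conjugate()
    return x

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][l] * b[l][j] for l in range(n)) for j in range(n)]
            for i in range(n)]

def scaled_trace_moments(x, kmax):
    """Return [(1/N) Tr (X/sqrt(N))^k for k = 1, ..., kmax]."""
    n = len(x)
    y = [[e / math.sqrt(n) for e in row] for row in x]
    out, p = [], y
    for _ in range(kmax):
        out.append(sum(p[i][i] for i in range(n)).real / n)
        p = matmul(p, y)
    return out

rng = random.Random(0)
n, trials, kmax = 20, 100, 4
sums = [0.0] * kmax
for _ in range(trials):
    for idx, v in enumerate(scaled_trace_moments(gue_sample(n, rng), kmax)):
        sums[idx] += v
avgs = {idx + 1: s / trials for idx, s in enumerate(sums)}
print(avgs)  # roughly 0, 1, 0, 2 for k = 1, 2, 3, 4
```

Even at $N = 20$ the averages sit close to the limits, since the fluctuations of these linear eigenvalue statistics are of order $1/N$.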
Concerning the meaning of Theorem 14.1, the random variable

$$\frac{1}{N}\, p_k\!\left(\frac{B_{N1}}{\sqrt{N}}, \dots, \frac{B_{NN}}{\sqrt{N}}\right)$$

is the degree $k$ moment of the empirical eigenvalue distribution of $\frac{1}{\sqrt{N}} X_N$, which is the random discrete probability measure

$$\mu_N = \frac{1}{N} \sum_{i=1}^{N} \delta_{B_{Ni}/\sqrt{N}}$$

on $\mathbb{R}$ which places equal mass at each eigenvalue. Thus, Theorem 14.1 is saying that each moment of the empirical eigenvalue distribution of the scaled random matrix $\frac{1}{\sqrt{N}} X_N$ converges in expectation to an explicit limit. It is therefore natural to ask whether $(m_k)_{k=1}^{\infty}$ is the moment sequence of a probability measure on $\mathbb{R}$.
To answer this, one can recognize that the exponential generating function

$$\sum_{k=0}^{\infty} m_k \frac{z^k}{k!} = \sum_{k=0}^{\infty} \mathrm{Cat}_k \frac{z^{2k}}{(2k)!} = \sum_{k=0}^{\infty} \frac{z^{2k}}{k!\,(k+1)!}$$

is essentially a Bessel function. Then, a classical integral representation for Bessel functions allows one to determine that $(m_k)_{k=0}^{\infty}$ is the moment sequence of the probability measure

$$\mu(\mathrm{d}x) = \frac{1}{2\pi} \sqrt{4 - x^2}\, \mathbf{1}_{[-2,2]}(x)\, \mathrm{d}x$$

on $\mathbb{R}$, which is supported on $[-2,2]$ with density $\frac{1}{2\pi}\sqrt{4 - x^2}$. This is the Wigner semicircle distribution. You can find the details of this derivation here, towards the end of Lecture One.
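One can also confirm numerically that the even moments of the semicircle density are the Catalan numbers. A small pure-Python check (my illustration; the substitution $x = 2\sin t$, which removes the square-root singularity at the endpoints, and the step count are implementation choices):

```python
import math

def catalan(m):
    """Cat_m = (1/(m+1)) * binomial(2m, m)."""
    return math.comb(2 * m, m) // (m + 1)

def semicircle_moment(k, steps=50000):
    """k-th moment of the density (1/2pi) sqrt(4 - x^2) on [-2, 2].
    Substituting x = 2 sin(t) turns the integrand into
    (2/pi) (2 sin t)^k cos^2 t on [-pi/2, pi/2]; use the midpoint rule."""
    h = math.pi / steps
    total = 0.0
    for i in range(steps):
        t = -math.pi / 2 + (i + 0.5) * h
        total += (2.0 * math.sin(t)) ** k * math.cos(t) ** 2
    return total * h * 2.0 / math.pi

for k in range(7):
    print(k, round(semicircle_moment(k), 6))
```

The even moments come out as $1, 1, 2, 5, \dots$ and the odd moments vanish by symmetry, matching the limits in Theorem 14.1.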
Now we discuss how one can prove Theorem 14.1. The proof uses the fact that power sums in the eigenvalues of a matrix are also traces of powers of that matrix. This means that we have

$$p_k(B_{N1}, \dots, B_{NN}) = \operatorname{Tr} X_N^k,$$

the point being that the right hand side is also a polynomial in the elements of the Gaussian random matrix $X_N$. Indeed, by matrix algebra we have

$$\operatorname{Tr} X_N^k = \sum_{i} X_N(i(1), i(2))\, X_N(i(2), i(3)) \cdots X_N(i(k), i(1)),$$

where the sum is over all functions $i \colon \{1, \dots, k\} \to \{1, \dots, N\}$.
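This expansion of the trace as a sum over functions $i$ is easy to confirm on a small example. The following Python sketch (illustrative; the particular $3 \times 3$ complex matrix is arbitrary) compares both sides:

```python
import itertools

# An arbitrary small complex matrix; the identity holds for any square matrix.
M = [[1 + 2j, 0.5 - 1j, 2 + 0j],
     [-1 + 1j, 3 + 0j, 0 + 1j],
     [0 - 2j, 1 + 1j, -2 + 0.5j]]

def trace_power(m, k):
    """Tr m^k via repeated matrix multiplication."""
    n = len(m)
    p = m
    for _ in range(k - 1):
        p = [[sum(p[i][l] * m[l][j] for l in range(n)) for j in range(n)]
             for i in range(n)]
    return sum(p[i][i] for i in range(n))

def trace_power_sum(m, k):
    """Tr m^k as a sum over all functions i: {1,...,k} -> {1,...,n} of
    m[i(1)][i(2)] m[i(2)][i(3)] ... m[i(k)][i(1)]."""
    n = len(m)
    total = 0j
    for i in itertools.product(range(n), repeat=k):
        prod = 1 + 0j
        for a in range(k):
            prod *= m[i[a]][i[(a + 1) % k]]
        total += prod
    return total

for k in (2, 3, 4):
    print(k, trace_power(M, k), trace_power_sum(M, k))
```

The two computations agree up to floating-point rounding for each $k$.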
Using the fact that the matrix elements of $X_N$ are Gaussian and independent (up to selfadjointness), the following can be established.

Theorem 14.2 (Harer-Zagier). For each $k \in \mathbb{N}$, we have

$$\mathbb{E} \operatorname{Tr} X_N^{2k} = \sum_{g \geq 0} \varepsilon_g(k)\, N^{k+1-2g},$$

where $\varepsilon_g(k)$ is the number of ways to glue the edges of a $2k$-gon in pairs so as to produce a compact orientable surface of genus $g$.
The proof is a beautiful piece of mathematics and a nice treatment can be found here. Theorem 14.2 implies Theorem 14.1 because the number of ways to glue a sphere from a polygon with $2k$ sides is

$$\varepsilon_0(k) = \mathrm{Cat}_k,$$

and only the genus zero term survives in the $N \to \infty$ limit.
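The gluing counts $\varepsilon_g(k)$ can be computed by brute force for small $k$: a gluing is a fixed-point-free involution $\sigma$ of the $2k$ edges, the vertices of the glued surface correspond to the cycles of $\gamma\sigma$ where $\gamma$ is the boundary rotation of the polygon, and Euler's formula $V - E + F = 2 - 2g$ with $E = k$, $F = 1$ yields the genus. A Python sketch of this computation (my illustration, not from the lecture):

```python
import collections

def pairings(elements):
    """Yield all perfect matchings of a list, as tuples of pairs."""
    if not elements:
        yield ()
        return
    first, rest = elements[0], elements[1:]
    for idx in range(len(rest)):
        pair = (first, rest[idx])
        remaining = rest[:idx] + rest[idx + 1:]
        for sub in pairings(remaining):
            yield (pair,) + sub

def genus_counts(k):
    """Count gluings of a 2k-gon by the genus of the resulting surface.
    Vertices of the glued surface are the cycles of gamma∘sigma, where
    gamma is the rotation a -> a + 1 (mod 2k); then g = (k + 1 - V) / 2."""
    counts = collections.Counter()
    for matching in pairings(list(range(2 * k))):
        sigma = {}
        for a, b in matching:
            sigma[a], sigma[b] = b, a
        perm = {a: (sigma[a] + 1) % (2 * k) for a in range(2 * k)}
        seen, cycles = set(), 0
        for start in range(2 * k):
            if start not in seen:
                cycles += 1
                a = start
                while a not in seen:
                    seen.add(a)
                    a = perm[a]
        counts[(k + 1 - cycles) // 2] += 1
    return dict(counts)

for k in (1, 2, 3):
    print(k, genus_counts(k))
```

For $k = 2$ this gives $\varepsilon_0(2) = 2$ and $\varepsilon_1(2) = 1$, reproducing $\mathbb{E} \operatorname{Tr} X_N^4 = 2N^3 + N$ via Theorem 14.2, and for each $k$ the genus zero count is $\mathrm{Cat}_k$.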
There is something unsatisfying about the above sequence of ideas. We know that the entire distribution of $X_N$ is completely determined by the joint distribution of its diagonal matrix elements

$$A_{N1}, \dots, A_{NN},$$

which are simply iid standard Gaussians. In fact, at this point in the course we know much more: the spectral analysis of unitarily invariant random Hermitian matrices is equivalent to the analysis of pairs

$$A = (A_{N1}, \dots, A_{NN})_{N=1}^{\infty}$$

and

$$B = (B_{N1}, \dots, B_{NN})_{N=1}^{\infty}$$

of triangular arrays of real random variables related by

$$A_{Ni} = \sum_{j=1}^{N} |U_N(i,j)|^2\, B_{Nj}, \qquad 1 \leq i \leq N,$$

where $U_N$ is a uniformly random $N \times N$ unitary matrix. Initially,

$$A_{N1}, \dots, A_{NN}$$

were viewed as the diagonal matrix elements of a random Hermitian matrix with unitarily invariant distribution and

$$B_{N1}, \dots, B_{NN}$$

were the eigenvalues of this selfadjoint matrix. But, we can in fact forget about this original setup entirely, and simply be probabilists thinking about two randomly coupled triangular arrays of real random variables. The question is then how to recover Theorem 14.1 simply from the given fact that our $A$-array has rows of iid standard Gaussians.
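The coupling between the two arrays is easy to simulate. The sketch below (my illustration; the Gram-Schmidt sampler and the particular deterministic $B$-row are choices made here, not from the lecture) builds a Haar-random unitary, forms the matrix of squared moduli $|U_N(i,j)|^2$, and pushes a $B$-row through it. Since that matrix is doubly stochastic, the coupling preserves the total: $\sum_i A_{Ni} = \sum_j B_{Nj}$.

```python
import math
import random

def haar_unitary(n, rng):
    """Haar-random n x n unitary: Gram-Schmidt (i.e. QR) applied to a
    matrix of iid standard complex Gaussians."""
    g = [[complex(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(n)]
         for _ in range(n)]
    cols = []
    for c in range(n):
        v = [g[r][c] for r in range(n)]
        for u in cols:
            coef = sum(u[r].conjugate() * v[r] for r in range(n))
            v = [v[r] - coef * u[r] for r in range(n)]
        norm = math.sqrt(sum(abs(z) ** 2 for z in v))
        cols.append([z / norm for z in v])
    return [[cols[j][i] for j in range(n)] for i in range(n)]  # U[i][j]

def couple(b_row, u):
    """A_{Ni} = sum_j |U(i,j)|^2 B_{Nj}."""
    n = len(b_row)
    return [sum(abs(u[i][j]) ** 2 * b_row[j] for j in range(n))
            for i in range(n)]

rng = random.Random(1)
n = 40
U = haar_unitary(n, rng)
B = [math.sqrt(n) * (2 * j / (n - 1) - 1) for j in range(n)]  # a deterministic B-row
A = couple(B, U)
print(sum(A), sum(B))  # equal up to rounding: the coupling is doubly stochastic
```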
This point of view is why the paper of Olshanski-Vershik is so relevant for our purposes, and so far ahead of its time: they are proving inverse theorems of this kind. More precisely, they are considering the case where the $B$-array is a given deterministic data set, and proving limit theorems about the corresponding $A$-array. A special case of their main result is the following “inverse Wigner theorem.”
Theorem 14.3 (Olshanski-Vershik). Suppose that the limit

$$m_k = \lim_{N \to \infty} \frac{1}{N}\, p_k\!\left(\frac{B_{N1}}{\sqrt{N}}, \dots, \frac{B_{NN}}{\sqrt{N}}\right)$$

exists for each $k \in \mathbb{N}$. Then, for any fixed $n \in \mathbb{N}$, the random variables

$$A_{N1}, \dots, A_{Nn}$$

converge in distribution to an $n$-tuple

$$(\xi_1, \dots, \xi_n)$$

of iid centered Gaussians with variance $m_2$.
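The one-dimensional case can be probed numerically with very little machinery: only the first row of $U_N$ enters $A_{N1}$, and that row is simply a uniformly random unit vector. The following Monte Carlo sketch (my illustration; the uniform-grid $B$-row, whose scaled empirical distribution tends to the uniform distribution on $[-1,1]$ with $m_2 = 1/3$, and all numeric choices are mine) checks that $A_{N1}$ is approximately centered with variance close to $m_2$.

```python
import math
import random

def haar_row(n, rng):
    """First row of a Haar-random n x n unitary: a uniformly random unit
    vector, i.e. a normalized vector of iid standard complex Gaussians."""
    v = [complex(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(n)]
    norm = math.sqrt(sum(abs(z) ** 2 for z in v))
    return [z / norm for z in v]

rng = random.Random(2)
n, trials = 200, 400
# deterministic B-row on the sqrt(n) scale; its scaled second moment is near 1/3
B = [math.sqrt(n) * (2 * j / (n - 1) - 1) for j in range(n)]
m2 = sum((b / math.sqrt(n)) ** 2 for b in B) / n

samples = []
for _ in range(trials):
    row = haar_row(n, rng)
    samples.append(sum(abs(row[j]) ** 2 * B[j] for j in range(n)))

mean = sum(samples) / trials
var = sum((s - mean) ** 2 for s in samples) / trials
print(round(mean, 3), round(var, 3), round(m2, 3))
```

The empirical mean is near $0$ and the empirical variance is near $m_2$, as Theorem 14.3 predicts for the case $n = 1$.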
We have proved the one-dimensional version of this Gaussian limit theorem (the case $n = 1$), and we will finish the proof of the multivariate version next time (the case $n \geq 2$). In our approach to these inverse results of Olshanski-Vershik we are using a combinatorial result which is analogous to Theorem 14.2, but arguably even more miraculous: a universal genus expansion for joint cumulants of the $A$-array in our theory of unistochastically coupled triangular arrays.