Math 202C: Lecture 8

As in Lecture 7, let \mathrm{Par}_n denote the set of nonnegative integer vectors \mu=(\mu_1,\dots,\mu_n) with weakly decreasing coordinates. Then, each \mu \in \mathrm{Par}_n can be viewed as a partition of the number

|\mu|=\mu_1+\dots+\mu_n

having at most n parts. Set

\mathrm{Par}_n^d = \{\mu \in \mathrm{Par}_n \colon |\mu|=d\}.

We know that the monomial symmetric polynomials

\{m_\mu(x_1,\dots,x_n) \colon \mu \in \mathrm{Par}_n^d\}

form a basis of the degree d component \mathcal{S}_n^d of the algebra \mathcal{S}_n=\mathbb{C}[x_1,\dots,x_n]^{S_n} of symmetric polynomials in n commuting selfadjoint variables x_1,\dots,x_n.

The fundamental theorem of symmetric polynomials gives a second basis of \mathcal{S}_n^d constructed from the elementary symmetric polynomials

e_r(x_1,\dots,x_n) = \sum\limits_{\substack{I \subseteq [n] \\ |I|=r}} \prod\limits_{i \in I} x_i.
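The defining sum over r-element subsets is easy to enumerate directly. Here is a minimal sketch in Python (the function name e_monomials is ours, not from the lecture), listing one monomial per subset I \subseteq [n] with |I|=r:

```python
from itertools import combinations
from math import comb

def e_monomials(r, n):
    """The monomials of e_r(x_1,...,x_n), one per r-element subset I of [n],
    each recorded as the sorted tuple of indices i in I."""
    return list(combinations(range(1, n + 1), r))

# e_2(x_1,x_2,x_3) = x_1 x_2 + x_1 x_3 + x_2 x_3:
print(e_monomials(2, 3))                      # [(1, 2), (1, 3), (2, 3)]
# e_r(x_1,...,x_n) has exactly binomial(n,r) monomials:
print(len(e_monomials(3, 5)) == comb(5, 3))   # True
```

In particular, for r > n the list is empty, which is the combinatorial source of the vanishing condition discussed below.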

Let \mathrm{Par}^d denote the set of all partitions of d, and observe that we may think of this as the set of all nonnegative integer vectors \lambda=(\lambda_1,\dots,\lambda_d) with weakly decreasing coordinates whose total sum |\lambda| equals d. We extend the definition of the elementary symmetric polynomial multiplicatively by defining

e_\lambda(x_1,\dots,x_n)=\prod\limits_{i=1}^d e_{\lambda_i}(x_1,\dots,x_n),

where by convention e_0(x_1,\dots,x_n)=1. Then, in order for e_\lambda(x_1,\dots,x_n) not to be the zero polynomial, it is necessary and sufficient that \lambda_1 \leq n, which is equivalent to the condition \lambda^\prime \in \mathrm{Par}_n. The fundamental theorem of symmetric polynomials says that

\{e_\lambda(x_1,\dots,x_n) \colon \lambda^\prime \in \mathrm{Par}_n^d\}

is a basis of \mathcal{S}_n^d. Equivalently, the claim is that

\{e_{\lambda^\prime}(x_1,\dots,x_n) \colon \lambda\in \mathrm{Par}_n^d\}

is a basis of \mathcal{S}_n^d.
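The vanishing criterion \lambda_1 \leq n can be checked mechanically for small cases. The sketch below (the helper names e_poly, mul, e_lam are ours) represents a polynomial as a dict from exponent vectors to coefficients, so the zero polynomial is the empty dict:

```python
from itertools import combinations

def e_poly(r, n):
    """e_r(x_1,...,x_n) as a dict: exponent vector of length n -> coefficient."""
    p = {}
    for I in combinations(range(n), r):
        exp = tuple(1 if i in I else 0 for i in range(n))
        p[exp] = p.get(exp, 0) + 1
    return p

def mul(p, q, n):
    """Multiply two polynomials stored as exponent-vector dicts."""
    out = {}
    for a, ca in p.items():
        for b, cb in q.items():
            s = tuple(a[i] + b[i] for i in range(n))
            out[s] = out.get(s, 0) + ca * cb
    return out

def e_lam(lam, n):
    """e_lambda = product of the e_{lambda_i}; the empty product is 1."""
    p = {tuple([0] * n): 1}
    for part in lam:
        p = mul(p, e_poly(part, n), n)
    return p

# lambda_1 = 3 > n = 2 forces e_3(x_1,x_2) = 0, hence e_(3,1) = 0:
print(e_lam((3, 1), 2))   # {} (the zero polynomial)
# while lambda_1 <= n gives a nonzero polynomial:
print(e_lam((2, 1), 2))   # {(2, 1): 1, (1, 2): 1}, i.e. x_1^2 x_2 + x_1 x_2^2
```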

In Lecture 7, we assembled the main ingredients needed to establish this. First, we showed that for any \lambda \in \mathrm{Par}^d we have

e_\lambda(x_1,\dots,x_n) = \sum\limits_{\mu \in \mathrm{Par}_n^d} M_{\lambda\mu}m_\mu(x_1,\dots,x_n),

where M_{\lambda\mu} is the number of d \times n matrices with entries in \{0,1\} whose row sums are encoded by \lambda=(\lambda_1,\dots,\lambda_d) and whose column sums are encoded by \mu=(\mu_1,\dots,\mu_n). Note that this is consistent with what we have seen: if \lambda_1>n, then there cannot be a \{0,1\}-matrix with n columns whose first row sums to \lambda_1, and all the coefficients M_{\lambda\mu} are zero. Second (this is a homework problem), we have M_{\lambda\mu}=0 unless \mu \leq \lambda^\prime in dominance order on \mathrm{Par}_n^d, and moreover M_{\lambda\lambda^\prime}=1.
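The matrix description of M_{\lambda\mu} can be verified by brute force for small \lambda and \mu. A sketch (the function name count_matrices is ours) that enumerates candidate \{0,1\}-matrices row by row:

```python
from itertools import combinations, product

def count_matrices(lam, mu, d, n):
    """M_{lam,mu}: the number of d-by-n {0,1}-matrices whose i-th row sums
    to lam_i and whose j-th column sums to mu_j (both padded with zeros)."""
    lam = tuple(lam) + (0,) * (d - len(lam))
    mu = tuple(mu) + (0,) * (n - len(mu))
    count = 0
    # choose, for each row, which columns carry a 1
    for rows in product(*[list(combinations(range(n), r)) for r in lam]):
        colsum = [0] * n
        for row in rows:
            for j in row:
                colsum[j] += 1
        count += (tuple(colsum) == mu)
    return count

# lambda = (2,1): M_{lambda,(1,1,1)} = 3, the coefficient of m_(1,1,1)
# in e_(2,1)(x_1,x_2,x_3) = e_2 e_1 = ... + 3 x_1 x_2 x_3:
print(count_matrices((2, 1), (1, 1, 1), 2, 3))  # 3
# and M_{lambda,lambda'} = 1, with lambda' = (2,1):
print(count_matrices((2, 1), (2, 1), 2, 3))     # 1
```

Note that for \lambda_1 > n the inner combinations list is empty, so the count is 0, consistent with the vanishing of e_\lambda.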

Now let us complete the proof of the fundamental theorem. From the above, for any \lambda such that \lambda^\prime \in \mathrm{Par}_n^d we have

e_\lambda = \sum\limits_{\mu \leq \lambda'}M_{\lambda\mu}m_\mu,

where we are suppressing the variables x_1,\dots,x_n to lighten the notation. Now let \nu \in \mathrm{Par}_n^d be such that \lambda^\prime = \nu. Since transposition of Young diagrams is an involution, this is equivalent to \lambda=\nu^\prime, and the above becomes

e_{\nu^\prime}=\sum\limits_{\mu \leq \nu}M_{\nu^\prime \mu}m_\mu.

Now in the above I am simply going to replace the letter \nu with \lambda, because this helps me keep things straight in my mind (this has no mathematical content; it’s purely psychological). Making this replacement, the above becomes

e_{\lambda^\prime}=\sum\limits_{\mu \leq \lambda}M_{\lambda^\prime \mu}m_\mu.

Now, splitting off the term of the sum corresponding to \mu=\lambda and using M_{\lambda^\prime\lambda}=1, we have

e_{\lambda^\prime}=m_\lambda+\sum\limits_{\mu < \lambda}M_{\lambda^\prime \mu}m_\mu

where the sum on the right hand side is now over partitions \mu which are strictly less than \lambda in dominance order on \mathrm{Par}_n^d.
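This unitriangular shape can be observed computationally. Below is a sketch (helper names e_poly, mul, monomial_coeffs, dominated_by are ours) that expands e_{\lambda^\prime} into monomial symmetric polynomials for \lambda=(3,1) in two variables, where \lambda^\prime=(2,1,1):

```python
from itertools import combinations
from collections import Counter

def e_poly(r, n):
    """e_r(x_1,...,x_n) as a Counter: exponent vector -> coefficient."""
    p = Counter()
    for I in combinations(range(n), r):
        p[tuple(1 if i in I else 0 for i in range(n))] += 1
    return p

def mul(p, q, n):
    """Multiply two polynomials stored as exponent-vector Counters."""
    out = Counter()
    for a, ca in p.items():
        for b, cb in q.items():
            out[tuple(a[i] + b[i] for i in range(n))] += ca * cb
    return out

def monomial_coeffs(parts, n):
    """Expand the product of the e_{parts_i} in n variables and return
    {mu: coefficient of m_mu}, keying each monomial by its sorted exponents."""
    p = Counter({tuple([0] * n): 1})
    for r in parts:
        p = mul(p, e_poly(r, n), n)
    return {tuple(sorted(exp, reverse=True)): c for exp, c in p.items()}

def dominated_by(mu, lam):
    """mu <= lam in dominance order (partitions of the same number)."""
    pm = pl = 0
    for i in range(max(len(mu), len(lam))):
        pm += mu[i] if i < len(mu) else 0
        pl += lam[i] if i < len(lam) else 0
        if pm > pl:
            return False
    return True

# lambda = (3,1) in Par_2^4, lambda' = (2,1,1):
coeffs = monomial_coeffs((2, 1, 1), 2)
print(coeffs)                                          # {(3, 1): 1, (2, 2): 2}
print(coeffs[(3, 1)] == 1)                             # leading coefficient 1
print(all(dominated_by(mu, (3, 1)) for mu in coeffs))  # True
```

So here e_{(2,1,1)} = m_{(3,1)} + 2m_{(2,2)}, with (2,2) < (3,1) in dominance order, exactly the triangular shape above.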

First we show that \{e_{\lambda^\prime} \colon \lambda \in \mathrm{Par}_n^d\} spans \mathcal{S}_n^d. Suppose otherwise. Then, there must be some monomial symmetric polynomial m_\mu which cannot be represented as a linear combination of the elementary polynomials e_{\lambda^\prime}. Let us choose such an m_\mu with \mu minimal in the dominance order on \mathrm{Par}_n^d. Then, by our formula above,

m_\mu = e_{\mu^\prime} - \sum\limits_{\nu < \mu} M_{\mu^\prime \nu}m_\nu,

and by minimality of \mu, every m_\nu appearing in the summation on the right lies in the span of \{e_{\lambda^\prime} \colon \lambda \in \mathrm{Par}_n^d\}; hence so does m_\mu, which gives a contradiction.
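The spanning argument is effectively an algorithm: back-substitute along dominance order. The sketch below (all names ours) carries it out, writing m_\mu as an integer combination of the e_{\lambda^\prime}:

```python
from itertools import combinations
from collections import Counter

def e_poly(r, n):
    p = Counter()
    for I in combinations(range(n), r):
        p[tuple(1 if i in I else 0 for i in range(n))] += 1
    return p

def mul(p, q, n):
    out = Counter()
    for a, ca in p.items():
        for b, cb in q.items():
            out[tuple(a[i] + b[i] for i in range(n))] += ca * cb
    return out

def conjugate(lam):
    """The transposed partition: lam'_i = #{j : lam_j >= i}."""
    lam = [x for x in lam if x > 0]
    top = lam[0] if lam else 0
    return tuple(sum(1 for x in lam if x >= i) for i in range(1, top + 1))

def monomial_coeffs(parts, n):
    """{mu: coefficient of m_mu} in the expansion of the product of e_{parts_i}."""
    p = Counter({tuple([0] * n): 1})
    for r in parts:
        p = mul(p, e_poly(r, n), n)
    return {tuple(sorted(exp, reverse=True)): c for exp, c in p.items()}

def m_in_e(mu, n):
    """Write m_mu = sum_lam c_lam e_{lam'} by the back-substitution of the
    spanning argument; returns {lam: c_lam}."""
    expansion = monomial_coeffs(conjugate(mu), n)   # e_{mu'} = m_mu + lower
    res = Counter({mu: 1})
    for nu, c in expansion.items():
        if nu != mu:                                # nu < mu in dominance
            for lam, d in m_in_e(nu, n).items():
                res[lam] -= c * d
    return dict(res)

# m_(3,1)(x_1,x_2) = e_(2,1,1) - 2 e_(2,2):
print(m_in_e((3, 1), 2))   # {(3, 1): 1, (2, 2): -2}
```

The recursion terminates because every \nu appearing in the expansion is strictly smaller in dominance order, mirroring the minimality argument above.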

Now we show that \{e_{\lambda^\prime} \colon \lambda \in \mathrm{Par}_n^d\} is linearly independent. Indeed, suppose

\sum\limits_{\lambda \in \mathrm{Par}_n^d} c_\lambda e_{\lambda^\prime}=0

where not all coefficients c_\lambda equal zero. Choose a nonzero coefficient c_\lambda with \lambda maximal in dominance order. Then, by the triangular expansion above and the maximality of \lambda, the coefficient of m_\lambda in the monomial expansion of the left hand side is exactly c_\lambda, which forces c_\lambda=0, a contradiction.

This completes the proof that \{e_{\lambda^\prime} \colon \lambda \in \mathrm{Par}_n^d\} is a basis of the finite-dimensional Hilbert space \mathcal{S}_n^d. Since the argument was formulated for arbitrary degree d, we conclude that

\{e_{\lambda'} \colon \lambda \in \mathrm{Par}_n\}

is a basis of the infinite-dimensional algebra \mathcal{S}_n. Finally, because e_\lambda is defined multiplicatively, this is equivalent to

\mathcal{S}_n = \mathbb{C}[e_1,\dots,e_n],

showing that the algebra \mathcal{S}_n of symmetric polynomials in variables x_1,\dots,x_n is isomorphic to the algebra \mathbb{C}[e_1,\dots,e_n] of all polynomials in variables

e_1=x_1+x_2+\dots+x_n,\ \dots,\ e_n=x_1x_2 \dots x_n.
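As a numerical illustration of \mathcal{S}_n = \mathbb{C}[e_1,\dots,e_n], one can check a classical instance of rewriting a symmetric polynomial in terms of e_1,\dots,e_n, namely x_1^2+\dots+x_n^2 = e_1^2 - 2e_2 (a standard identity, used here purely as an example; the function name e_val is ours):

```python
from itertools import combinations
from math import prod
import random

def e_val(r, xs):
    """Numerical value of the elementary symmetric polynomial e_r at xs."""
    return sum(prod(c) for c in combinations(xs, r))

# Every symmetric polynomial is a polynomial in e_1,...,e_n; for instance
# the power sum x_1^2 + ... + x_n^2 equals e_1^2 - 2 e_2:
random.seed(1)
xs = [random.randint(-9, 9) for _ in range(5)]
print(sum(x * x for x in xs) == e_val(1, xs) ** 2 - 2 * e_val(2, xs))  # True
```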
