Let us now use the symmetric and antisymmetric tensor products to define two subspaces of the tensor square $V \otimes V$ which store “unevaluated” symmetric and antisymmetric tensor products of vectors from $V$. The symmetric square of $V$ is the subspace $\mathrm{Sym}^2(V) \subseteq V \otimes V$ spanned by all symmetric tensor products
$$v \vee w = \tfrac{1}{2}(v \otimes w + w \otimes v), \qquad v, w \in V.$$
Elements of $\mathrm{Sym}^2(V)$ are called symmetric tensors. Similarly, the antisymmetric square of $V$ is the subspace $\Lambda^2(V) \subseteq V \otimes V$ spanned by all antisymmetric tensor products,
$$v \wedge w = \tfrac{1}{2}(v \otimes w - w \otimes v), \qquad v, w \in V.$$
Elements of $\Lambda^2(V)$ are called antisymmetric tensors.
All of what we have said above can be generalized in a natural way to products of more than two vectors. More precisely, for any natural number $n$ we can define the $n$th tensor power $V^{\otimes n}$ of the vector space $V$ to be the new vector space spanned by all “unevaluated” products
$$v_1 \otimes v_2 \otimes \cdots \otimes v_n$$
of vectors $v_1, \dots, v_n \in V$. The only feature of such multiple unevaluated products is that they are “multilinear,” which really just means that they behave like ordinary products (sans commutativity). For example, in the case $n = 3$ this just means that we have the following three identities in the vector space $V^{\otimes 3}$: for any scalars $\alpha, \beta$,
$$(\alpha u + \beta u') \otimes v \otimes w = \alpha\, u \otimes v \otimes w + \beta\, u' \otimes v \otimes w$$
for all $u, u', v, w \in V$, and
$$u \otimes (\alpha v + \beta v') \otimes w = \alpha\, u \otimes v \otimes w + \beta\, u \otimes v' \otimes w$$
for all $u, v, v', w \in V$, and
$$u \otimes v \otimes (\alpha w + \beta w') = \alpha\, u \otimes v \otimes w + \beta\, u \otimes v \otimes w'$$
for all $u, v, w, w' \in V$. If $V$ comes with a scalar product $\langle \cdot, \cdot \rangle$, we can use this to define a scalar product on $V^{\otimes n}$ in a very simple way by declaring
$$\langle u_1 \otimes \cdots \otimes u_n,\; v_1 \otimes \cdots \otimes v_n \rangle = \langle u_1, v_1 \rangle \langle u_2, v_2 \rangle \cdots \langle u_n, v_n \rangle.$$
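As a quick numerical sanity check of this declaration (the code and variable names below are my own illustration, not part of the lecture), we can compute both sides for concrete vectors in $\mathbb{R}^2$, using the Kronecker product as a coordinate realization of the tensor product:

```python
import numpy as np

# Two pairs of vectors in R^2; np.kron realizes the tensor product in coordinates.
u1, u2 = np.array([1.0, 2.0]), np.array([0.0, 3.0])
v1, v2 = np.array([2.0, 1.0]), np.array([1.0, 1.0])

# <u1 (x) u2, v1 (x) v2> computed directly in the tensor square ...
lhs = np.kron(u1, u2) @ np.kron(v1, v2)
# ... and via the declared formula <u1, v1> <u2, v2>.
rhs = (u1 @ v1) * (u2 @ v2)
print(lhs, rhs)  # both equal 12.0
```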
Even better, we can use the scalar product so defined to construct an orthonormal basis of $V^{\otimes n}$ from a given orthonormal basis $e_1, \dots, e_d$ of $V$: such a basis is simply given by all tensor products with $n$ factors such that each factor is a vector in $\{e_1, \dots, e_d\}$. More precisely, these are the tensors
$$e_{f(1)} \otimes e_{f(2)} \otimes \cdots \otimes e_{f(n)}, \qquad f \in [d]^{[n]},$$
where $[d]^{[n]}$ is a fun notation for the set of all functions
$$f : \{1, \dots, n\} \to \{1, \dots, d\}.$$
In particular, since the cardinality of $[d]^{[n]}$ is $d^n$ (make one of $d$ choices, $n$ times), the dimension of the vector space $V^{\otimes n}$ is
$$\dim V^{\otimes n} = (\dim V)^n = d^n.$$
Example 1: If $V$ is a $2$-dimensional vector space with orthonormal basis $e_1, e_2$, then an orthonormal basis of $V \otimes V$ is given by the $2^2 = 4$ tensors
$$e_1 \otimes e_1, \quad e_1 \otimes e_2, \quad e_2 \otimes e_1, \quad e_2 \otimes e_2.$$
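The count $d^n$ is easy to verify numerically. The following sketch (my own illustration; the helper name `tensor_power_basis` is not from the lecture) enumerates the basis tensors $e_{f(1)} \otimes \cdots \otimes e_{f(n)}$ for $d = 2$, $n = 3$ and checks that they are orthonormal:

```python
import itertools
import numpy as np

def tensor_power_basis(d, n):
    """Return the d**n tensors e_{f(1)} (x) ... (x) e_{f(n)}, one for each
    function f : {1, ..., n} -> {1, ..., d}, as flat coordinate vectors."""
    e = np.eye(d)  # rows are the orthonormal basis vectors e_1, ..., e_d
    basis = []
    for f in itertools.product(range(d), repeat=n):  # all d**n functions f
        t = np.array([1.0])
        for i in f:
            t = np.kron(t, e[i])  # build the n-fold tensor product
        basis.append(t)
    return basis

basis = tensor_power_basis(2, 3)
M = np.vstack(basis)
print(len(basis))                       # 8 = 2**3 basis tensors
print(np.allclose(M @ M.T, np.eye(8)))  # True: pairwise scalar products form the identity
```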
We now define the $n$-fold symmetric and antisymmetric tensor products. These products rely on the concept of permutations.
Reading Assignment: Familiarize yourself with permutations. What is important for our purposes is that you understand how to multiply permutations, and that you understand what the sign of a permutation is. Feel free to ask questions as needed.
Definition 1: For any $n \in \mathbb{N}$ and any vectors $v_1, \dots, v_n \in V$ we define the symmetric tensor product of these vectors by
$$v_1 \vee v_2 \vee \cdots \vee v_n = \frac{1}{n!} \sum_{\sigma \in S_n} v_{\sigma(1)} \otimes v_{\sigma(2)} \otimes \cdots \otimes v_{\sigma(n)},$$
and denote by $\mathrm{Sym}^n(V)$ the subspace of $V^{\otimes n}$ spanned by all symmetric tensor products of $n$ vectors from $V$. Likewise, we define the antisymmetric tensor product of $v_1, \dots, v_n$ by
$$v_1 \wedge v_2 \wedge \cdots \wedge v_n = \frac{1}{n!} \sum_{\sigma \in S_n} \mathrm{sgn}(\sigma)\, v_{\sigma(1)} \otimes v_{\sigma(2)} \otimes \cdots \otimes v_{\sigma(n)},$$
and denote by $\Lambda^n(V)$ the subspace of $V^{\otimes n}$ spanned by all antisymmetric tensor products of $n$ vectors from $V$.
Note that, in the case $n = 2$, this definition coincides with the definitions
$$v \vee w = \tfrac{1}{2}(v \otimes w + w \otimes v), \qquad v \wedge w = \tfrac{1}{2}(v \otimes w - w \otimes v)$$
from Lecture 21.
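These formulas are straightforward to implement directly. Here is a small numerical sketch (the function names are mine, not from the lecture), which also confirms that for $n = 2$ the general definition reduces to the Lecture 21 formulas:

```python
import itertools
import math
import numpy as np

def tensor(vectors):
    """Coordinate realization of v_1 (x) ... (x) v_n via the Kronecker product."""
    t = np.array([1.0])
    for v in vectors:
        t = np.kron(t, v)
    return t

def sign(p):
    """Sign of the permutation p (given as a tuple), by counting inversions."""
    inv = sum(1 for i in range(len(p)) for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def sym(vectors):
    """v_1 v ... v v_n = (1/n!) * sum of all permuted tensor products."""
    n = len(vectors)
    return sum(tensor([vectors[i] for i in p])
               for p in itertools.permutations(range(n))) / math.factorial(n)

def wedge(vectors):
    """v_1 ^ ... ^ v_n = (1/n!) * sum of signed permuted tensor products."""
    n = len(vectors)
    return sum(sign(p) * tensor([vectors[i] for i in p])
               for p in itertools.permutations(range(n))) / math.factorial(n)

v, w = np.array([1.0, 2.0]), np.array([3.0, -1.0])
print(np.allclose(sym([v, w]), (tensor([v, w]) + tensor([w, v])) / 2))    # True
print(np.allclose(wedge([v, w]), (tensor([v, w]) - tensor([w, v])) / 2))  # True
```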
Since the symmetric and antisymmetric tensor products are defined in terms of the tensor product, they inherit multilinearity. For example, in the case $n = 3$ this means that we have the following three identities in the vector space $\mathrm{Sym}^3(V)$: for any scalars $\alpha, \beta$,
$$(\alpha u + \beta u') \vee v \vee w = \alpha\, u \vee v \vee w + \beta\, u' \vee v \vee w$$
for all $u, u', v, w \in V$, and
$$u \vee (\alpha v + \beta v') \vee w = \alpha\, u \vee v \vee w + \beta\, u \vee v' \vee w$$
for all $u, v, v', w \in V$, and
$$u \vee v \vee (\alpha w + \beta w') = \alpha\, u \vee v \vee w + \beta\, u \vee v \vee w'$$
for all $u, v, w, w' \in V$. The analogous statements hold in $\Lambda^3(V)$.
The symmetric tensor product is constructed in such a way that
$$v_{\sigma(1)} \vee v_{\sigma(2)} \vee \cdots \vee v_{\sigma(n)} = v_1 \vee v_2 \vee \cdots \vee v_n$$
for any permutation $\sigma \in S_n$, whereas the antisymmetric tensor product is constructed in such a way that
$$v_{\sigma(1)} \wedge v_{\sigma(2)} \wedge \cdots \wedge v_{\sigma(n)} = \mathrm{sgn}(\sigma)\, v_1 \wedge v_2 \wedge \cdots \wedge v_n$$
for any permutation $\sigma \in S_n$. In particular, if any two of the vectors $v_1, \dots, v_n$ are equal, then
$$v_1 \wedge v_2 \wedge \cdots \wedge v_n = 0.$$
Indeed, suppose that $v_i = v_j$ for some $i \neq j$. On one hand, by the above antisymmetry we have
$$v_1 \wedge \cdots \wedge v_i \wedge \cdots \wedge v_j \wedge \cdots \wedge v_n = -\, v_1 \wedge \cdots \wedge v_j \wedge \cdots \wedge v_i \wedge \cdots \wedge v_n,$$
since swapping the $i$th and $j$th factors is a transposition, which has sign $-1$; but on the other hand we also have
$$v_1 \wedge \cdots \wedge v_i \wedge \cdots \wedge v_j \wedge \cdots \wedge v_n = v_1 \wedge \cdots \wedge v_j \wedge \cdots \wedge v_i \wedge \cdots \wedge v_n$$
because $v_i = v_j$. This means that
$$T = -T$$
if $T = v_1 \wedge v_2 \wedge \cdots \wedge v_n$, which forces $T = 0$.
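Numerically, the cancellation is exact. In the sketch below (helper names are mine, as before, not from the lecture) a wedge product of three vectors with a repeated factor comes out as the zero tensor, while independent factors give a nonzero tensor:

```python
import itertools
import math
import numpy as np

def tensor(vs):
    t = np.array([1.0])
    for v in vs:
        t = np.kron(t, v)  # coordinate tensor product
    return t

def sign(p):
    inv = sum(1 for i in range(len(p)) for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def wedge(vs):
    n = len(vs)
    return sum(sign(p) * tensor([vs[i] for i in p])
               for p in itertools.permutations(range(n))) / math.factorial(n)

u = np.array([1.0, 2.0, 3.0])
v = np.array([0.0, 1.0, -1.0])
w = np.array([1.0, 0.0, 1.0])
# Repeating the factor u forces the signed sum to cancel term by term:
print(np.allclose(wedge([u, v, u]), 0))  # True
# With three linearly independent factors the wedge product is nonzero:
print(np.allclose(wedge([u, v, w]), 0))  # False
```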
The vector space $\mathrm{Sym}^n(V)$ is called the $n$th symmetric power of $V$, and its elements are called symmetric tensors of degree $n$. The vector space $\Lambda^n(V)$ is called the $n$th antisymmetric power of $V$, and its elements are called antisymmetric tensors of degree $n$. These vector spaces have a physical interpretation. In quantum mechanics, a $d$-dimensional vector space $V$ is viewed as the state space of a particle that can be in any one of $d$ quantum states. The space $\mathrm{Sym}^n(V)$ is then the state space of $n$ bosons, each of which may occupy one of $d$ quantum states, while $\Lambda^n(V)$ is the state space of $n$ fermions, each of which may be in any of $d$ quantum states. The vanishing of wedge products with two equal factors corresponds physically to the characteristic feature of fermions, i.e. the Pauli exclusion principle. You don’t have to know any of this — I included this perspective in order to provide some indication that the construction of these vector spaces is not just abstract nonsense.
Theorem 1: For any $u_1, \dots, u_n \in V$ and any $v_1, \dots, v_n \in V$ we have
$$\langle u_1 \vee \cdots \vee u_n,\; v_1 \vee \cdots \vee v_n \rangle = \frac{1}{n!} \operatorname{per} \big( \langle u_i, v_j \rangle \big)_{i,j=1}^n
\quad \text{and} \quad
\langle u_1 \wedge \cdots \wedge u_n,\; v_1 \wedge \cdots \wedge v_n \rangle = \frac{1}{n!} \det \big( \langle u_i, v_j \rangle \big)_{i,j=1}^n,$$
where $\operatorname{per}$ denotes the permanent of a matrix.
Since we won’t use this theorem much, we will skip the proof. However, the proof is not too difficult, and is an exercise in permutations: simply plug in the definitions of the symmetric and antisymmetric tensor products in terms of the original tensor products, expand the scalar product, and simplify.
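Although we skip the proof, the formula is easy to test numerically. The sketch below (helper functions are my own, as before) checks the antisymmetric identity $\langle u_1 \wedge u_2 \wedge u_3, v_1 \wedge v_2 \wedge v_3 \rangle = \frac{1}{3!} \det(\langle u_i, v_j \rangle)$ on random vectors:

```python
import itertools
import math
import numpy as np

def tensor(vs):
    t = np.array([1.0])
    for v in vs:
        t = np.kron(t, v)  # coordinate tensor product
    return t

def sign(p):
    inv = sum(1 for i in range(len(p)) for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def wedge(vs):
    n = len(vs)
    return sum(sign(p) * tensor([vs[i] for i in p])
               for p in itertools.permutations(range(n))) / math.factorial(n)

rng = np.random.default_rng(0)
U = rng.standard_normal((3, 4))  # rows u_1, u_2, u_3 in R^4
V = rng.standard_normal((3, 4))  # rows v_1, v_2, v_3 in R^4

lhs = wedge(list(U)) @ wedge(list(V))             # scalar product in the tensor power
rhs = np.linalg.det(U @ V.T) / math.factorial(3)  # (1/n!) det(<u_i, v_j>)
print(np.isclose(lhs, rhs))  # True
```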
Perhaps counterintuitively, the antisymmetric tensor product is more important than the symmetric tensor product in linear algebra. The next theorem explains why.
Theorem 2: For any $n \in \mathbb{N}$ and any $v_1, \dots, v_n \in V$, the set $\{v_1, \dots, v_n\}$ is linearly dependent if and only if
$$v_1 \wedge v_2 \wedge \cdots \wedge v_n = 0.$$
Proof: Suppose first that $\{v_1, \dots, v_n\}$ is a linearly dependent set of vectors in $V$. If $n = 1$, this means that $v_1 = 0$, whence $v_1 = 0$ as a tensor in $\Lambda^1(V) = V$. If $n \geq 2$ then, without loss of generality, the vector $v_n$ is a linear combination of the vectors $v_1, \dots, v_{n-1}$:
$$v_n = \alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_{n-1} v_{n-1}.$$
We then have that
$$v_1 \wedge \cdots \wedge v_{n-1} \wedge v_n = \sum_{i=1}^{n-1} \alpha_i\, v_1 \wedge \cdots \wedge v_{n-1} \wedge v_i$$
by multilinearity of the wedge product. Now observe that the $i$th term in the sum is a scalar multiple of the wedge product
$$v_1 \wedge \cdots \wedge v_i \wedge \cdots \wedge v_{n-1} \wedge v_i,$$
which contains the vector $v_i$ twice, and hence each term in the sum is the zero tensor.
Conversely, suppose $v_1, \dots, v_n$ are vectors such that
$$v_1 \wedge v_2 \wedge \cdots \wedge v_n = 0.$$
We must prove that $\{v_1, \dots, v_n\}$ is a linearly dependent set in $V$. We will prove the (equivalent) contrapositive statement: if $\{v_1, \dots, v_n\}$ is a linearly independent set in $V$, then
$$v_1 \wedge v_2 \wedge \cdots \wedge v_n \neq 0.$$
We prove this by induction on $n$. In the base case $n = 1$, we have that $\{v_1\}$ is linearly independent, so
$$v_1 \neq 0.$$
For the inductive step, we proceed as follows. Since $\{v_1, \dots, v_n\}$ is a linearly independent set, it is a basis of the subspace
$$W = \operatorname{span}\{v_1, \dots, v_n\} \subseteq V.$$
Let $\langle \cdot, \cdot \rangle_W$ denote the scalar product on $W$ defined by declaring this basis to be orthonormal. We now define a linear transformation $\varphi : \Lambda^n(W) \to \Lambda^{n-1}(W)$ by contracting against $v_n$:
$$\varphi(w_1 \wedge \cdots \wedge w_n) = \sum_{i=1}^{n} (-1)^{n-i} \langle w_i, v_n \rangle_W\, w_1 \wedge \cdots \wedge \widehat{w_i} \wedge \cdots \wedge w_n,$$
where the hat indicates that the factor $w_i$ is omitted. We then have that
$$\varphi(v_1 \wedge v_2 \wedge \cdots \wedge v_n) = v_1 \wedge v_2 \wedge \cdots \wedge v_{n-1},$$
since $\langle v_i, v_n \rangle_W = 0$ for $i < n$ and $\langle v_n, v_n \rangle_W = 1$. Now, since $\{v_1, \dots, v_n\}$ is a linearly independent set, so is the subset $\{v_1, \dots, v_{n-1}\}$. Thus, by the induction hypothesis,
$$v_1 \wedge v_2 \wedge \cdots \wedge v_{n-1} \neq 0.$$
It then follows that
$$v_1 \wedge v_2 \wedge \cdots \wedge v_n \neq 0,$$
since otherwise the linear transformation $\varphi$ would map the zero vector in $\Lambda^n(W)$ to a nonzero vector in $\Lambda^{n-1}(W)$, which is impossible.
Corollary 1: For any $v_1, \dots, v_n \in V$ we have
$$\det \big( \langle v_i, v_j \rangle \big)_{i,j=1}^n \geq 0,$$
with equality if and only if $\{v_1, \dots, v_n\}$ is linearly dependent. You can think of this as a massive generalization of the Cauchy–Schwarz inequality, which is the case $n = 2$:
$$\langle v_1, v_1 \rangle \langle v_2, v_2 \rangle - \langle v_1, v_2 \rangle^2 \geq 0.$$
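In matrix terms, the corollary says that every Gram matrix has nonnegative determinant, vanishing exactly on linearly dependent families. A quick numerical illustration (all names below are my own):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 5))  # rows: three generic (hence independent) vectors in R^5
G = A @ A.T                      # Gram matrix of the scalar products <v_i, v_j>
print(np.linalg.det(G) > 0)      # True: independent vectors give a positive determinant

B = A.copy()
B[2] = B[0] + B[1]               # force a linear dependence among the rows
H = B @ B.T
print(abs(np.linalg.det(H)) < 1e-8)  # True: dependent vectors give determinant zero
```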