Last time, we began our discussion of linear transformations, and in particular observed that the set of linear transformations from a vector space $V$ to a vector space $W$ is itself a vector space in a natural way, because there are natural ways to add and scale linear transformations which are compatible with the vector space axioms. In the case that $W = V$, linear transformations are usually referred to as “linear operators,” and the vector space $\mathcal{L}(V,V)$ of linear operators on $V$ is typically denoted $\mathrm{End}(V)$. This notation stems from the fact that a fancy name for a linear operator is endomorphism. This term derives from the Greek endon, meaning within, and is broadly used in mathematics to emphasize that one is considering a function whose range is contained within its domain. Linear operators are special in that, in addition to being able to scale and add them to one another, we can also multiply them in a natural way. Indeed, given two linear operators $A, B \in \mathrm{End}(V)$, we may define their product to be their composition, i.e. $AB := A \circ B$. Spelled out, this means that $AB$ is the linear operator defined by

$$(AB)v = A(Bv), \qquad v \in V.$$

So $\mathrm{End}(V)$ is a special type of vector space whose vectors (which are operators) can be scaled, added, and multiplied. Such vector spaces warrant their own name.
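To make this concrete in coordinates, here is a minimal NumPy sketch (my addition, not part of the lecture) in which two operators on $\mathbb{R}^2$ are represented by arbitrarily chosen matrices; it checks that the product, defined as composition, satisfies $(AB)v = A(Bv)$.

```python
import numpy as np

# Two linear operators on R^2, represented (after a choice of basis) by matrices.
A = np.array([[1.0, 2.0],
              [0.0, 1.0]])
B = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # rotation by 90 degrees

v = np.array([3.0, 4.0])

# The product AB is the composition: first apply B, then apply A.
AB = A @ B
print(np.allclose(AB @ v, A @ (B @ v)))   # True: (AB)v = A(Bv)
```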
Definition 1: An algebra is a vector space $\mathcal{A}$ together with a multiplication rule $\mathcal{A} \times \mathcal{A} \to \mathcal{A}$, $(A,B) \mapsto AB$, which is bilinear and associative.
Previously, the only algebra we had encountered was the field of scalars itself, and now we find that there are in fact many more algebras, namely all vector spaces $\mathrm{End}(V)$ for $V$ an arbitrary vector space. So, linear operators are in some sense a generalization of numbers.
However, there are some notable differences between numerical multiplication and the multiplication of operators. One of the main differences is that multiplication of linear operators is noncommutative: it is not necessarily the case that $AB = BA$.

Exercise 1: Find an example of linear operators $A, B$ such that $AB \neq BA$.
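If you want to sanity-check a candidate answer numerically, the following short NumPy sketch (an addition to these notes) compares $AB$ and $BA$ for two matrices; the particular matrices shown are placeholders, to be replaced with your own example.

```python
import numpy as np

# Candidate operators on R^2, written as matrices; replace with your own example.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0, 0.0],
              [1.0, 0.0]])

print(A @ B)                       # AB
print(B @ A)                       # BA
print(np.allclose(A @ B, B @ A))   # False: these operators do not commute
```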
Another key difference between the arithmetic of numbers and the arithmetic of operators is that division is only sometimes possible: it is not the case that all non-zero operators have a multiplicative inverse, which is defined as follows.
Definition 2: An operator $A \in \mathrm{End}(V)$ is said to be invertible if there exists an operator $B \in \mathrm{End}(V)$ such that $AB = BA = I$, where $I \in \mathrm{End}(V)$ is the identity operator defined by $Iv = v$ for all $v \in V$.
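As an illustration of Definition 2 (my addition, with an arbitrarily chosen matrix), one can verify both defining equations $AB = BA = I$ numerically, letting NumPy's `np.linalg.inv` produce the candidate operator $B$.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])   # an invertible operator on R^2 (arbitrary example)
B = np.linalg.inv(A)         # candidate inverse
I = np.eye(2)                # the identity operator on R^2

# Check both defining equations from Definition 2.
print(np.allclose(A @ B, I), np.allclose(B @ A, I))   # True True
```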
You should take a moment to compare this definition of an invertible linear operator with the definition of a vector space isomorphism from Lecture 2. You will then see that $A$ being invertible is equivalent to $A$ being an isomorphism of $V$ with itself. An isomorphism from a vector space to itself is called an automorphism, where the prefix “auto” is from the Greek word for “self.” The set of all invertible linear operators in $\mathrm{End}(V)$ is therefore often denoted $\mathrm{Aut}(V)$.
Proposition 1: If $A \in \mathrm{End}(V)$ is invertible, then there is precisely one operator $B \in \mathrm{End}(V)$ such that $AB = BA = I$.

Proof: Suppose that $B, C \in \mathrm{End}(V)$ are such that $AB = BA = I$ and $AC = CA = I$. Then we have

$$B = BI = B(AC) = (BA)C = IC = C.$$
— Q.E.D.
Thus, if $A$ is an invertible operator, then it has a unique inverse, so it is reasonable to call this “the inverse” of $A$ and denote it $A^{-1}$. You should check for yourself that $A^{-1}$ is invertible, and that its inverse is $A$, i.e. that $(A^{-1})^{-1} = A$.
Exercise 2: Find an example of a nonzero linear operator which is not invertible.
Proposition 2: The set of invertible operators is closed under multiplication: if $A, B \in \mathrm{Aut}(V)$, then $AB \in \mathrm{Aut}(V)$.

Proof: We have

$$(AB)(B^{-1}A^{-1}) = A(BB^{-1})A^{-1} = AIA^{-1} = AA^{-1} = I,$$

and similarly $(B^{-1}A^{-1})(AB) = I$, which shows that $AB$ is invertible, and that $(AB)^{-1} = B^{-1}A^{-1}$.
— Q.E.D.
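Here is a quick numerical spot-check of the conclusion $(AB)^{-1} = B^{-1}A^{-1}$ (my addition; the random matrices simply stand in for a pair of invertible operators).

```python
import numpy as np

rng = np.random.default_rng(0)

# Random matrices are invertible with probability 1; they stand in for A, B in Aut(V).
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

lhs = np.linalg.inv(A @ B)                     # (AB)^{-1}
rhs = np.linalg.inv(B) @ np.linalg.inv(A)      # B^{-1} A^{-1}
print(np.allclose(lhs, rhs))                   # True (up to floating-point error)
```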
Proposition 2 shows that the set $\mathrm{Aut}(V)$ is an example of a type of algebraic structure called a group, which roughly means a set together with a notion of multiplication in which every element has an inverse. We won’t give the precise definition of a group, since the above is the only example of a group we will see in this course. The subject of group theory is its own branch of algebra, and it has many connections to linear algebra.
All of the above may seem quite abstract, and perhaps it is. However, in the case of finite-dimensional vector spaces, linear transformations can be described very concretely as tables of numbers, i.e. as matrices. Consider the vector space $\mathcal{L}(V, W)$ of linear transformations from an $n$-dimensional vector space $V$ to an $m$-dimensional vector space $W$. Let $A \in \mathcal{L}(V, W)$ be a linear transformation, let $E = \{e_1, \dots, e_n\}$ be a basis of $V$, and let $F = \{f_1, \dots, f_m\}$ be a basis of $W$. The transformation $A$ is then uniquely determined by the finitely many vectors $Ae_1, \dots, Ae_n$. Indeed, any vector $v \in V$ may be uniquely represented as a linear combination of vectors in $E$,

$$v = x_1 e_1 + \dots + x_n e_n,$$

and we then have

$$Av = x_1 Ae_1 + \dots + x_n Ae_n.$$
Now, we may represent each of the vectors $Ae_1, \dots, Ae_n$ as a linear combination of the vectors in $F$,

$$Ae_j = a_{1j} f_1 + a_{2j} f_2 + \dots + a_{mj} f_m, \qquad 1 \leq j \leq n,$$

and we then have

$$Av = \sum_{j=1}^n x_j\, Ae_j = \sum_{j=1}^n x_j \sum_{i=1}^m a_{ij} f_i = \sum_{i=1}^m \left( \sum_{j=1}^n a_{ij} x_j \right) f_i.$$

Thus, if

$$Av = y_1 f_1 + \dots + y_m f_m$$

is the unique representation of the vector $Av$ relative to the basis $F$ of $W$, then $y_i = a_{i1} x_1 + \dots + a_{in} x_n$ for each $1 \leq i \leq m$. So, our computation shows that we have the matrix equation

$$\begin{bmatrix} y_1 \\ \vdots \\ y_m \end{bmatrix} = \begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix} \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix}.$$
Schematically, this matrix equation can be expressed as follows: for any $v \in V$ we have that

$$[Av]_F = [A]\,[v]_E,$$

where $[v]_E$ denotes the $n \times 1$ matrix whose entries are the coordinates of the vector $v$ relative to the basis $E$ of $V$, $[Av]_F$ denotes the $m \times 1$ matrix whose entries are the coordinates of the vector $Av$ relative to the basis $F$ of $W$, and $[A]$ is the $m \times n$ matrix

$$[A] = \begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & & \vdots \\ a_{m1} & \dots & a_{mn} \end{bmatrix}$$

whose $j$th column is the $m \times 1$ matrix $[Ae_j]_F$.
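To see the recipe in action, the following NumPy sketch (my addition, with arbitrarily chosen bases and transformation) builds the matrix $[A]$ column by column from the coordinate vectors $[Ae_j]_F$ and then checks the identity $[Av]_F = [A][v]_E$ on a sample vector. The helper `coords` is just a convenience for expressing a vector in a given basis by solving a linear system.

```python
import numpy as np

# A concrete sketch: V = R^3 and W = R^2, each with a (non-standard) choice of basis.
E = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])   # column j is the basis vector e_j of V
F = np.array([[1.0, 1.0],
              [0.0, 1.0]])        # column j is the basis vector f_j of W

# The linear transformation A : R^3 -> R^2, given by an arbitrary matrix in standard coordinates.
A_std = np.array([[2.0, 0.0, 1.0],
                  [1.0, 3.0, 0.0]])

def coords(basis, w):
    """Coordinates of w relative to the basis whose vectors are the columns of `basis`."""
    return np.linalg.solve(basis, w)

# Build the matrix [A]: its j-th column holds the F-coordinates of A e_j.
A_mat = np.column_stack([coords(F, A_std @ E[:, j]) for j in range(3)])

# Verify the schematic equation [Av]_F = [A] [v]_E for a sample vector v.
v = np.array([1.0, -2.0, 5.0])
lhs = coords(F, A_std @ v)       # [Av]_F
rhs = A_mat @ coords(E, v)       # [A] [v]_E
print(np.allclose(lhs, rhs))     # True
```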
What this means at the conceptual level is the following: choosing a basis $E$ in $V$ results in a vector space isomorphism $V \to \mathbb{R}^n$ defined by $v \mapsto [v]_E$, choosing a basis $F$ in $W$ results in a vector space isomorphism $W \to \mathbb{R}^m$ defined by $w \mapsto [w]_F$, and these two choices together result in a vector space isomorphism from $\mathcal{L}(V, W)$ to the space of $m \times n$ matrices defined by $A \mapsto [A]$.
Let us consider how the above works in the special case that $W = V$ and $F = E$. We are then dealing with linear operators $A \in \mathrm{End}(V)$, and the matrix representing such an operator is the square $n \times n$ matrix

$$[A] = \begin{bmatrix} a_{11} & \dots & a_{1n} \\ \vdots & & \vdots \\ a_{n1} & \dots & a_{nn} \end{bmatrix}.$$

For every $v \in V$ we have the matrix equation

$$[Av]_E = [A]\,[v]_E.$$
In this case, there is an extra consideration. Suppose we have two linear operators $A, B \in \mathrm{End}(V)$. Then, we also have their product $AB \in \mathrm{End}(V)$, and a natural issue is to determine the relationship between the matrices $[A], [B]$, and $[AB]$. Let us now work this out.
Start with a vector $v \in V$, and let

$$v = x_1 e_1 + \dots + x_n e_n$$

be its representation relative to the basis $E$ of $V$. Let

$$Ae_j = \sum_{i=1}^n a_{ij} e_i \quad \text{and} \quad Be_j = \sum_{i=1}^n b_{ij} e_i, \qquad 1 \leq j \leq n,$$

be the representations of the vectors $Ae_1, \dots, Ae_n$ and $Be_1, \dots, Be_n$ relative to the basis $E$. We then have

$$(AB)v = A(Bv) = A\left( \sum_{j=1}^n x_j \sum_{k=1}^n b_{kj} e_k \right) = \sum_{j=1}^n \sum_{k=1}^n x_j b_{kj}\, Ae_k = \sum_{i=1}^n \left( \sum_{j=1}^n \left( \sum_{k=1}^n a_{ik} b_{kj} \right) x_j \right) e_i.$$

This shows that the matrix of the product transformation $AB$ relative to the basis $E$ is given by the product of the matrices representing $A$ and $B$ in this basis, i.e.

$$[AB] = [A]\,[B].$$
So, in the case of linear operators, the isomorphism from $\mathrm{End}(V)$ to the space of $n \times n$ matrices given by $A \mapsto [A]$ is not just a vector space isomorphism, but a vector space isomorphism compatible with multiplication, i.e. an algebra isomorphism.
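As a closing numerical illustration (my addition, with an arbitrarily chosen basis and operators), the sketch below computes the matrices of two operators and of their composition in a non-standard basis, and confirms that $[AB] = [A][B]$.

```python
import numpy as np

# Operators on R^3 written in a non-standard basis E, checking that the
# matrix of a composition is the product of the matrices.
E = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])     # column j is the basis vector e_j

A_std = np.array([[1.0, 2.0, 0.0],
                  [0.0, 1.0, 1.0],
                  [3.0, 0.0, 1.0]])  # arbitrary operators, in standard coordinates
B_std = np.array([[0.0, 1.0, 0.0],
                  [1.0, 0.0, 2.0],
                  [0.0, 0.0, 1.0]])

def matrix_in_basis(T_std, E):
    """Matrix [T] whose j-th column holds the E-coordinates of T e_j."""
    return np.linalg.solve(E, T_std @ E)

lhs = matrix_in_basis(A_std @ B_std, E)                      # [AB]
rhs = matrix_in_basis(A_std, E) @ matrix_in_basis(B_std, E)  # [A][B]
print(np.allclose(lhs, rhs))                                 # True
```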