After discussing linear transformations and matrices, we are now ready to discuss dimensions rigorously. Let $\mathbb{F}$ be any field.
Given any ordered set of vectors $(v_1, \dots, v_n)$ in $\mathbb{F}^m$, we can define the matrix $A = \begin{bmatrix} v_1 & \cdots & v_n \end{bmatrix}$. It is then not too hard to observe that $T_A(c_1, \dots, c_n) = c_1 v_1 + \cdots + c_n v_n$, and $\operatorname{im}(T_A) = \operatorname{span}(v_1, \dots, v_n)$.
Lemma 1. Let $v_1, \dots, v_n \in \mathbb{F}^m$.
- If $(v_1, \dots, v_n)$ is linearly independent, then $n \leq m$.
- If $\operatorname{span}(v_1, \dots, v_n) = \mathbb{F}^m$, then $n \geq m$.

Consequently, if $(v_1, \dots, v_n)$ forms a basis for $\mathbb{F}^m$, then $n = m$.
Proof. For scalars $c_1, \dots, c_n$, consider the expression
$$T_A(c_1, \dots, c_n) = c_1 v_1 + \cdots + c_n v_n.$$
If $(v_1, \dots, v_n)$ is linearly independent, then the right-hand side equalling $0$ implies that $c_1 = \cdots = c_n = 0$. Hence, $\ker(T_A) = \{0\}$. Thus, $T_A : \mathbb{F}^n \to \mathbb{F}^m$ is injective, which implies that $n \leq m$, as required.

Suppose instead that $\operatorname{span}(v_1, \dots, v_n) = \mathbb{F}^m$. Then $\operatorname{im}(T_A) = \mathbb{F}^m$, which means that $T_A$ is surjective. Therefore, $n \geq m$, as required.
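Lemma 1 can be sanity-checked in coordinates. Below is a minimal Python sketch over the field $\mathbb{Q}$; the `rank` helper is an illustrative addition of mine, not part of the text. Any three vectors in $\mathbb{Q}^2$ must be dependent, so the corresponding matrix has rank at most $2$.

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix (list of rows) via Gaussian elimination,
    using exact rational arithmetic so there is no rounding."""
    A = [[Fraction(x) for x in row] for row in rows]
    r = 0  # number of pivots found so far
    for c in range(len(A[0]) if A else 0):
        p = next((i for i in range(r, len(A)) if A[i][c] != 0), None)
        if p is None:
            continue  # no pivot in this column
        A[r], A[p] = A[p], A[r]
        for i in range(len(A)):
            if i != r and A[i][c] != 0:
                f = A[i][c] / A[r][c]
                A[i] = [a - f * b for a, b in zip(A[i], A[r])]
        r += 1
    return r

# Three vectors in Q^2: Lemma 1 says they cannot be independent.
# Since rank(A) = rank(A^T), we may feed the vectors in as rows.
v1, v2, v3 = [1, 0], [0, 1], [2, 3]
print(rank([v1, v2, v3]))  # 2, which is < 3: (v1, v2, v3) is dependent
```

The rank being strictly less than the number of vectors is exactly the failure of injectivity of $T_A$ used in the proof.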
Lemma 2. Let $V \leq \mathbb{F}^n$ be a subspace. Then $V$ has a finite basis (i.e. a linearly independent set $(v_1, \dots, v_k)$ such that $\operatorname{span}(v_1, \dots, v_k) = V$).
Proof. If $V = \{0\}$, then take the empty set to be the required basis. Suppose $V \neq \{0\}$. Then there exists some nonzero $v_1 \in V$.
- If $\operatorname{span}(v_1) = V$, then we are done.
- Inductively, given a linearly independent set $(v_1, \dots, v_k)$, either $\operatorname{span}(v_1, \dots, v_k) = V$, or there exists $v_{k+1} \in V \setminus \operatorname{span}(v_1, \dots, v_k)$, in which case $(v_1, \dots, v_{k+1})$ is again linearly independent.

This process must end by step $k = n$, for otherwise $\mathbb{F}^n$ would contain a linearly independent set of $n + 1$ elements, which by Lemma 1 is impossible — a contradiction.
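The inductive process in this proof is effectively an algorithm. Here is a sketch of it in Python over $\mathbb{Q}$, under the assumption that $V$ is handed to us as the span of a finite list of vectors; the helpers `rank` and `greedy_basis` are my own names, not from the text. A vector is kept exactly when it lies outside the span of the vectors kept so far.

```python
from fractions import Fraction

def rank(rows):
    """Rank via Gaussian elimination over the rationals."""
    A = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(A[0]) if A else 0):
        p = next((i for i in range(r, len(A)) if A[i][c] != 0), None)
        if p is None:
            continue
        A[r], A[p] = A[p], A[r]
        for i in range(len(A)):
            if i != r and A[i][c] != 0:
                f = A[i][c] / A[r][c]
                A[i] = [a - f * b for a, b in zip(A[i], A[r])]
        r += 1
    return r

def greedy_basis(vectors):
    """Mirror the proof of Lemma 2: keep v only if v lies outside
    span(basis), detected by the rank going up when v is appended."""
    basis = []
    for v in vectors:
        if rank(basis + [v]) > rank(basis):
            basis.append(v)
    return basis

vs = [[1, 0, 0], [2, 0, 0], [0, 1, 0], [1, 1, 0]]
print(greedy_basis(vs))  # [[1, 0, 0], [0, 1, 0]]
```

The loop terminates after at most $n$ additions, exactly as the proof argues.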
Lemma 3. Let $V \leq \mathbb{F}^n$ be a subspace, and $(u_1, \dots, u_p)$ and $(w_1, \dots, w_q)$ be bases for $V$. Then $p = q$.
Proof. Define the isomorphisms $S : \mathbb{F}^p \to V$ and $T : \mathbb{F}^q \to V$ by $S(e_i) = u_i$ and $T(e_j) = w_j$. Then $T^{-1} \circ S : \mathbb{F}^p \to \mathbb{F}^q$ is an isomorphism, and thus injective; therefore, $(T^{-1}(u_1), \dots, T^{-1}(u_p))$ forms a linearly independent set of $p$ elements in $\mathbb{F}^q$, so that $p \leq q$ by Lemma 1. A symmetric argument yields $q \leq p$, so that $p = q$, as required.
Therefore, no matter which basis we use, the number of basis elements will remain the same. This number is what we define to be the dimension of a subspace of $\mathbb{F}^n$.
Definition 1. For any linearly independent set $(v_1, \dots, v_k)$, we define $\dim \operatorname{span}(v_1, \dots, v_k) = k$.
Example 1. More generally, for any subspace $V \leq \mathbb{F}^n$, if $V$ has a basis $(v_1, \dots, v_k)$, then $\dim V = k$. In particular, considering the standard basis $(e_1, \dots, e_n)$, $\dim \mathbb{F}^n = n$.
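In coordinates, Definition 1 and Example 1 become computable: the dimension of $\operatorname{span}(v_1, \dots, v_k)$ equals the rank of the matrix whose rows are the $v_i$. A small sketch over $\mathbb{Q}$, where `rank` is again an illustrative helper of mine:

```python
from fractions import Fraction

def rank(rows):
    """Rank via Gaussian elimination over the rationals."""
    A = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(A[0]) if A else 0):
        p = next((i for i in range(r, len(A)) if A[i][c] != 0), None)
        if p is None:
            continue
        A[r], A[p] = A[p], A[r]
        for i in range(len(A)):
            if i != r and A[i][c] != 0:
                f = A[i][c] / A[r][c]
                A[i] = [a - f * b for a, b in zip(A[i], A[r])]
        r += 1
    return r

# dim Q^3 = 3, witnessed by the standard basis:
print(rank([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))  # 3
# A dependent pair spans a line, so its span has dimension 1:
print(rank([[1, 2], [2, 4]]))  # 1
```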
Now, working in the special case of $\mathbb{F}^n$, let’s answer a crucial question: how does the dimension of a subspace compare with that of a subspace containing it?
Theorem 1. Given subspaces $U \leq V \leq \mathbb{F}^n$, $\dim U \leq \dim V$.
Proof. If $U = V$, then a basis for $U$ will be a basis for $V$, and hence $\dim U = \dim V$. If $U \neq V$, then there exists some vector $v \in V \setminus U$. Let $(u_1, \dots, u_k)$ be any basis for $U$. Then extend this basis, by the process in Lemma 2, to a basis $(u_1, \dots, u_k, v_1, \dots, v_\ell)$ of $V$. Then
$$\dim U = k \leq k + \ell = \dim V.$$
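Theorem 1 can be checked on a concrete pair of nested subspaces of $\mathbb{Q}^3$, again using an illustrative `rank` helper (my own, not from the text): the inclusion $U \subseteq V$ shows up as the rank not growing when the spanning sets are combined, and the dimensions respect the inclusion.

```python
from fractions import Fraction

def rank(rows):
    """Rank via Gaussian elimination over the rationals."""
    A = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(A[0]) if A else 0):
        p = next((i for i in range(r, len(A)) if A[i][c] != 0), None)
        if p is None:
            continue
        A[r], A[p] = A[p], A[r]
        for i in range(len(A)):
            if i != r and A[i][c] != 0:
                f = A[i][c] / A[r][c]
                A[i] = [a - f * b for a, b in zip(A[i], A[r])]
        r += 1
    return r

# U = span((1,1,0)) sits inside V = span((1,0,0), (0,1,0)) in Q^3.
U = [[1, 1, 0]]
V = [[1, 0, 0], [0, 1, 0]]
print(rank(V + U) == rank(V))  # True: confirms U is contained in V
print(rank(U), rank(V))        # 1 2, so dim U <= dim V
```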
This definition certainly extends beyond $\mathbb{F}^n$.
Definition 2. Let $V$ be a vector space over $\mathbb{F}$. We say that $V$ is finite-dimensional if there exists an integer $n \geq 0$ such that $V \cong \mathbb{F}^n$. In this case, we write $\dim V = n$. Otherwise, $V$ is infinite-dimensional.
Corollary 1. Let $V, W$ be vector spaces over $\mathbb{F}$. Suppose $V$ is finite-dimensional. Then $V \cong W$ if and only if $W$ is finite-dimensional and $\dim V = \dim W$.
Theorem 2. Let $V$ be a finite-dimensional vector space over $\mathbb{F}$ with $\dim V = n$. The following propositions hold:
- $V$ has a finite basis with $n$ elements.
- Any basis of $V$ must have $n$ elements.
- For any subspace $U \leq V$, $U$ is finite-dimensional and $\dim U \leq \dim V$.
Proof. Let $T : \mathbb{F}^n \to V$ be a vector space isomorphism. Since $T$ is bijective, $(T(e_1), \dots, T(e_n))$ forms a basis for $V$. For any basis $(v_1, \dots, v_k)$ of $V$, since $T^{-1}$ is bijective, $(T^{-1}(v_1), \dots, T^{-1}(v_k))$ forms a basis for $\mathbb{F}^n$, so that $k = n$ by Lemma 3.

Finally, $T^{-1}(U) \leq \mathbb{F}^n$ is a subspace and thus, by Lemma 2 and Theorem 1, has a basis $(w_1, \dots, w_k)$ of $k \leq n$ elements. Since $T$ is bijective, $(T(w_1), \dots, T(w_k))$ forms a basis for $U$ with $k$ elements. Hence, we can define the isomorphism $S : \mathbb{F}^k \to U$ by $S(e_i) = T(w_i)$, so that $U \cong \mathbb{F}^k$, and thus
$$\dim U = k \leq n = \dim V.$$
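The key move in this proof — transporting a basis through an isomorphism — can be seen concretely when $V = \mathbb{F}^n$ itself: an isomorphism is then an invertible matrix, and it carries the standard basis to another basis. A sketch over $\mathbb{Q}$, with `rank` again being my own illustrative helper:

```python
from fractions import Fraction

def rank(rows):
    """Rank via Gaussian elimination over the rationals."""
    A = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(A[0]) if A else 0):
        p = next((i for i in range(r, len(A)) if A[i][c] != 0), None)
        if p is None:
            continue
        A[r], A[p] = A[p], A[r]
        for i in range(len(A)):
            if i != r and A[i][c] != 0:
                f = A[i][c] / A[r][c]
                A[i] = [a - f * b for a, b in zip(A[i], A[r])]
        r += 1
    return r

# An isomorphism T: Q^3 -> Q^3, written as an invertible matrix.
T = [[1, 1, 0],
     [0, 1, 1],
     [0, 0, 1]]
# The images T(e_1), T(e_2), T(e_3) are the columns of T.
images = [[T[i][j] for i in range(3)] for j in range(3)]
print(rank(images))  # 3: the images again form a basis of Q^3
```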
Can we discuss infinite-dimensional vector spaces? The idea is to use our finite-dimensional intuitions to formulate infinite-dimensional definitions. We observe that in the finite-dimensional setting, the dimension of a subspace is the size of its basis $B$, i.e. if $B$ is a basis of $V$, then $\dim V = |B|$. Unfortunately, we will need more effort to formally define this notion in general, so we delay it to the setting where it becomes necessary.
For now, let’s explore isomorphisms in more detail. When are two vector spaces isomorphic to each other? It turns out that if we have a surjective linear transformation $T : V \to W$, then we can always find an isomorphism from a certain collection of subsets of $V$ to $W$. This gives us the notion of the quotient space, encoded in the first isomorphism theorem in linear algebra.
—Joel Kindiak, 1 Mar 25, 2203H