For random variables $X_1, \dots, X_n$, denote $X = (X_1, \dots, X_n)^T$ and define $\mathbb{E}[X] = (\mathbb{E}[X_1], \dots, \mathbb{E}[X_n])^T$. Define the covariance matrix by
\[ \mathrm{Cov}(X) = \mathbb{E}\big[ (X - \mathbb{E}[X]) (X - \mathbb{E}[X])^T \big]. \]
Note that $\mathrm{Cov}(X)_{ij} = \mathrm{Cov}(X_i, X_j)$.
Problem 1. For i.i.d. $Z_1, \dots, Z_n \sim N(0, 1)$, define $Z = (Z_1, \dots, Z_n)^T$. Calculate the p.d.f. $f_Z$ of $Z$ and evaluate $\mathbb{E}[Z]$ and $\mathrm{Cov}(Z)$.
Solution. For each $i$, the p.d.f. of $Z_i$ is
\[ f_{Z_i}(z_i) = \frac{1}{\sqrt{2\pi}} e^{-z_i^2 / 2}. \]
Since $Z_1, \dots, Z_n$ are independent,
\[ f_Z(z) = \prod_{i=1}^n f_{Z_i}(z_i) = \frac{1}{(2\pi)^{n/2}} e^{-\|z\|^2 / 2}. \]
Furthermore, $\mathbb{E}[Z] = 0$ and $\mathrm{Cov}(Z) = I_n$, where $I_n$ denotes the $n \times n$ identity matrix.
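As a numerical sanity check (a NumPy sketch; the seed, dimension $n = 3$, and sample size are arbitrary choices, not part of the problem), we can confirm that the joint density factorises as computed, and that the empirical mean and covariance of i.i.d. standard normal draws approach $0$ and $I_n$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# Joint p.d.f. of Z = (Z_1, ..., Z_n)^T with i.i.d. N(0, 1) entries.
def f_Z(z):
    return (2 * np.pi) ** (-len(z) / 2) * np.exp(-0.5 * z @ z)

# Product of the n one-dimensional standard normal densities.
def f_Z_product(z):
    return np.prod(np.exp(-0.5 * z ** 2) / np.sqrt(2 * np.pi))

z = rng.standard_normal(n)
factorises = np.isclose(f_Z(z), f_Z_product(z))

# Empirical mean and covariance of many draws approximate 0 and I_n.
samples = rng.standard_normal((200_000, n))
mean_err = np.abs(samples.mean(axis=0)).max()
cov_err = np.abs(np.cov(samples.T) - np.eye(n)).max()
```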
Problem 2. For any invertible matrix $A \in \mathbb{R}^{n \times n}$ and $\mu \in \mathbb{R}^n$, define $X = AZ + \mu$. Calculate the p.d.f. $f_X$ of $X$ and evaluate $\mathbb{E}[X]$ and $\mathrm{Cov}(X)$.
Solution. We first assume the case $\mu = 0$. Define the bijective and continuous map $T : \mathbb{R}^n \to \mathbb{R}^n$ by $T(z) = Az$. We observe that for any Borel set $B \subseteq \mathbb{R}^n$,
\[ \mathbb{P}(X \in B) = \mathbb{P}(Z \in T^{-1}(B)). \]
That is, the law of $X$ is the pushforward measure of the law of $Z$ under $T$. Letting $\lambda_n$ denote the $n$-dimensional Lebesgue measure, we leave it as an exercise in linear algebra to verify that
\[ \lambda_n(T(B)) = |\det A| \, \lambda_n(B). \]
By a change of variables, for any Borel set $B \subseteq \mathbb{R}^n$,
\[ \mathbb{P}(X \in B) = \int_{T^{-1}(B)} f_Z(z) \, dz = \int_B f_Z(A^{-1} x) \frac{1}{|\det A|} \, dx. \]
By the Radon–Nikodým theorem,
\[ f_X(x) = \frac{1}{|\det A|} f_Z(A^{-1} x), \]
so that
\[ f_X(x) = \frac{1}{(2\pi)^{n/2} |\det A|} \exp\left( -\frac{1}{2} x^T (A A^T)^{-1} x \right). \]
Furthermore, $\mathbb{E}[X] = A \, \mathbb{E}[Z] = 0$ and $\mathrm{Cov}(X) = A \, \mathrm{Cov}(Z) \, A^T = A A^T$. In the marginally more general case, $\mu \neq 0$. We leave it as an exercise to verify that
\[ f_X(x) = \frac{1}{(2\pi)^{n/2} |\det A|} \exp\left( -\frac{1}{2} (x - \mu)^T (A A^T)^{-1} (x - \mu) \right) \]
and $\mathbb{E}[X] = \mu$, so that finally $\mathrm{Cov}(X) = A A^T$.
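The conclusion $\mathbb{E}[X] = \mu$ and $\mathrm{Cov}(X) = AA^T$ can be checked by Monte Carlo (a NumPy sketch; the particular $A$, $\mu$, seed, and sample size below are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A = np.array([[2.0, 0.5, 0.0],
              [0.0, 1.0, 0.3],
              [0.1, 0.0, 1.5]])   # an arbitrary invertible matrix
mu = np.array([1.0, -2.0, 0.5])

# Draw many samples of X = AZ + mu, one per row.
Z = rng.standard_normal((300_000, 3))
X = Z @ A.T + mu

mean_err = np.abs(X.mean(axis=0) - mu).max()
cov_err = np.abs(np.cov(X.T) - A @ A.T).max()
```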
Definition 1. A random variable $X$ taking values in $\mathbb{R}^n$ is multivariate normal, denoted $X \sim N(\mu, \Sigma)$, if it has a p.d.f. $f_X$ given by
\[ f_X(x) = \frac{1}{(2\pi)^{n/2} \sqrt{\det \Sigma}} \exp\left( -\frac{1}{2} (x - \mu)^T \Sigma^{-1} (x - \mu) \right). \]
In this case, we say that the mean of $X$ is $\mu$, and the covariance of $X$ is $\Sigma$. For example, $Z \sim N(0, I_n)$.
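Definition 1 implicitly requires $f_X$ to integrate to $1$; a quick numerical check in two dimensions confirms this (a NumPy sketch; $\mu$, $\Sigma$, the grid range, and the spacing are ad hoc choices):

```python
import numpy as np

# An arbitrary mean and (positive-definite) covariance in dimension 2.
mu = np.array([0.5, -1.0])
Sigma = np.array([[1.0, 0.4],
                  [0.4, 0.8]])
Sigma_inv = np.linalg.inv(Sigma)
norm_const = 1.0 / ((2 * np.pi) ** (2 / 2) * np.sqrt(np.linalg.det(Sigma)))

# Evaluate the density of Definition 1 on a grid and Riemann-sum it.
h = 0.05
grid = np.arange(-8.0, 8.0, h)
X1, X2 = np.meshgrid(grid, grid)
d = np.stack([X1.ravel(), X2.ravel()], axis=1) - mu
quad = np.einsum("ij,jk,ik->i", d, Sigma_inv, d)
total = (norm_const * np.exp(-0.5 * quad)).sum() * h ** 2
```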
Problem 3. Fix $X \sim N(\mu, \Sigma)$.
- Prove that if $\Sigma$ is diagonal whose diagonal entries $\sigma_1^2, \dots, \sigma_n^2$ are positive, then $X_1, \dots, X_n$ are independent with $X_i \sim N(\mu_i, \sigma_i^2)$.
- Prove that for any invertible $B \in \mathbb{R}^{n \times n}$ and $b \in \mathbb{R}^n$, $BX + b \sim N(\mu', \Sigma')$ for constants $\mu'$ and $\Sigma'$ to be calculated.
Solution. For the first point, if $\Sigma$ is diagonal with $\Sigma_{ii} = \sigma_i^2$, then $\Sigma^{-1}$ is diagonal with $(\Sigma^{-1})_{ii} = 1/\sigma_i^2$, and
\[ \det \Sigma = \prod_{i=1}^n \sigma_i^2. \]
Some algebra then yields
\[ f_X(x) = \prod_{i=1}^n \frac{1}{\sqrt{2\pi \sigma_i^2}} \exp\left( -\frac{(x_i - \mu_i)^2}{2 \sigma_i^2} \right). \]
Therefore, $X_1, \dots, X_n$ are independent with $X_i \sim N(\mu_i, \sigma_i^2)$.
For the second point, follow the proof in Problem 2 with the necessary bookkeeping to obtain
\[ f_{BX + b}(y) = \frac{1}{(2\pi)^{n/2} \sqrt{\det(B \Sigma B^T)}} \exp\left( -\frac{1}{2} (y - B\mu - b)^T (B \Sigma B^T)^{-1} (y - B\mu - b) \right), \]
which is exactly the p.d.f. in Definition 1 with $\mu' = B\mu + b$ and $\Sigma' = B \Sigma B^T$.
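The factorisation in the first point can be checked pointwise (a NumPy sketch; the diagonal $\Sigma$, the mean $\mu$, and the evaluation point are arbitrary values chosen for illustration):

```python
import numpy as np

# Diagonal Sigma with positive entries (values chosen arbitrarily).
mu = np.array([1.0, -0.5, 2.0])
sig2 = np.array([0.5, 2.0, 1.0])   # sigma_1^2, ..., sigma_n^2
Sigma = np.diag(sig2)

# Joint density from Definition 1.
def joint_pdf(x):
    d = x - mu
    return np.exp(-0.5 * d @ np.linalg.inv(Sigma) @ d) / (
        (2 * np.pi) ** (len(x) / 2) * np.sqrt(np.linalg.det(Sigma)))

# Product of one-dimensional N(mu_i, sigma_i^2) densities.
def product_pdf(x):
    return np.prod(np.exp(-0.5 * (x - mu) ** 2 / sig2)
                   / np.sqrt(2 * np.pi * sig2))

x = np.array([0.3, 0.7, -1.2])
agrees = np.isclose(joint_pdf(x), product_pdf(x))
```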
Problem 4. Let $X \sim N(\mu, \Sigma)$. Prove that there exists an invertible matrix $A$ such that $X \stackrel{d}{=} AZ + \mu$. In particular, each $X_i$ is normally distributed.
Solution. By definition,
\[ f_X(x) = \frac{1}{(2\pi)^{n/2} \sqrt{\det \Sigma}} \exp\left( -\frac{1}{2} (x - \mu)^T \Sigma^{-1} (x - \mu) \right). \]
We first assume $\mu = 0$. If we can find some invertible matrix $A$ such that $A A^T = \Sigma$, then we can set $X \stackrel{d}{=} AZ$. To that end, we note that $\Sigma$ is symmetric since
\[ \Sigma^T = \mathbb{E}\big[ (X - \mathbb{E}[X]) (X - \mathbb{E}[X])^T \big]^T = \mathbb{E}\big[ (X - \mathbb{E}[X]) (X - \mathbb{E}[X])^T \big] = \Sigma. \]
By the spectral theorem, $\Sigma$ is orthogonally diagonalisable. That is, there exists an orthogonal matrix $U$ (i.e. $U^T U = U U^T = I_n$) and a diagonal matrix $D$ such that
\[ \Sigma = U D U^T. \]
Furthermore, the entries in the diagonal of $D$ are all positive (as $\Sigma$ is positive definite), denoted $\lambda_1, \dots, \lambda_n > 0$. Define $\sqrt{D}$ by
\[ \sqrt{D} = \mathrm{diag}\big( \sqrt{\lambda_1}, \dots, \sqrt{\lambda_n} \big). \]
Then $\sqrt{D} \sqrt{D}^T = D$, so that
\[ \big( U \sqrt{D} \big) \big( U \sqrt{D} \big)^T = U D U^T = \Sigma. \]
Now define $A = U \sqrt{D}$ and $Y = AZ$. By Problem 3, $Y$ is multivariate normal. We calculate its expectation via
\[ \mathbb{E}[Y] = A \, \mathbb{E}[Z] = 0, \]
and its covariance matrix via
\[ \mathrm{Cov}(Y) = A \, \mathrm{Cov}(Z) \, A^T = A A^T = \Sigma. \]
Therefore, $Y \sim N(0, \Sigma)$. Since $X \sim N(0, \Sigma)$, we conclude $X$ and $Y$ have the same p.d.f., so that $X \stackrel{d}{=} AZ$, as required.
For the general case, $X - \mu$ has expectation $0$ and covariance $\Sigma$, and so by the first result, there exists some invertible matrix $A$ such that
\[ X - \mu \stackrel{d}{=} AZ, \qquad \text{i.e.} \qquad X \stackrel{d}{=} AZ + \mu, \]
as required.
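The construction $A = U \sqrt{D}$ can be carried out numerically (a NumPy sketch with an arbitrary positive-definite $\Sigma$; `numpy.linalg.eigh` returns the spectral decomposition of a symmetric matrix):

```python
import numpy as np

# An arbitrary symmetric positive-definite Sigma for illustration.
Sigma = np.array([[2.0, 0.6, 0.2],
                  [0.6, 1.0, 0.3],
                  [0.2, 0.3, 1.5]])

# Spectral decomposition Sigma = U D U^T, then A = U sqrt(D).
eigvals, U = np.linalg.eigh(Sigma)
A = U @ np.diag(np.sqrt(eigvals))

positive_spectrum = bool(eigvals.min() > 0)   # lambda_i > 0
recon_err = np.abs(A @ A.T - Sigma).max()     # A A^T should equal Sigma
```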
Problem 5. Define $\bar{X} = \frac{1}{n} \sum_{i=1}^n X_i$ and for each $1 \le i \le n$, define $D_i = X_i - \bar{X}$. If $X_1, \dots, X_n \sim N(\mu, \sigma^2)$ are i.i.d., prove that $\bar{X}$ and $(D_1, \dots, D_n)$ are independent. Deduce that $\bar{X}$ and $S^2$ defined by
\[ S^2 = \frac{1}{n - 1} \sum_{i=1}^n (X_i - \bar{X})^2 \]
are independent.
Solution. For each $i$,
\[ D_i = X_i - \bar{X} = -\tfrac{1}{n} X_1 - \dots + \big( 1 - \tfrac{1}{n} \big) X_i - \dots - \tfrac{1}{n} X_n \]
is a linear function of $X$. Furthermore, define $Y = (\bar{X}, D_1, \dots, D_{n-1})^T$. Defining $M$ as the matrix with $Y = MX$, $M$ is invertible (we can recover $X_i = D_i + \bar{X}$ for $i < n$ and $X_n = n\bar{X} - X_1 - \dots - X_{n-1}$). If $X_1, \dots, X_n$ are i.i.d. normally distributed with variance $\sigma^2$, then by the converse of the first point in Problem 3 (which holds), $X$ is multivariate normal. By the second point in Problem 3, $Y = MX$ is multivariate normal.
We turn to evaluate its covariance matrix. Firstly, to shorten computations,
\[ \mathrm{Cov}(\bar{X}, X_i) = \frac{1}{n} \sum_{j=1}^n \mathrm{Cov}(X_j, X_i) = \frac{\sigma^2}{n}. \]
Furthermore,
\[ \mathrm{Var}(\bar{X}) = \frac{1}{n^2} \sum_{j=1}^n \mathrm{Var}(X_j) = \frac{\sigma^2}{n}. \]
Thus, for $1 \le i \le n - 1$,
\[ \mathrm{Cov}(\bar{X}, D_i) = \mathrm{Cov}(\bar{X}, X_i) - \mathrm{Var}(\bar{X}) = \frac{\sigma^2}{n} - \frac{\sigma^2}{n} = 0. \]
Therefore, the covariance matrix of $Y$ is block diagonal, so the quadratic form in its p.d.f. splits into a term in $\bar{x}$ alone plus a term in $(d_1, \dots, d_{n-1})$ alone; using properties involving the exponential function,
\[ f_Y(\bar{x}, d_1, \dots, d_{n-1}) = g(\bar{x}) \, h(d_1, \dots, d_{n-1}) \]
for some functions $g$ and $h$. Hence, $\bar{X}$ and $(D_1, \dots, D_{n-1})$ are independent. For the second claim, we first note that
\[ \sum_{i=1}^n D_i = \sum_{i=1}^n X_i - n \bar{X} = 0, \]
so that $D_n = -(D_1 + \dots + D_{n-1})$; in particular, $\bar{X}$ is also independent of $(D_1, \dots, D_n)$. It follows that
\[ (n - 1) S^2 = \sum_{i=1}^n D_i^2 = \sum_{i=1}^{n-1} D_i^2 + \left( \sum_{i=1}^{n-1} D_i \right)^2. \]
Since $S^2$ can be written purely in terms of $D_1, \dots, D_{n-1}$, it is independent of $\bar{X}$.
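Independence implies zero correlation, so a Monte Carlo estimate of the correlation between $\bar{X}$ and $S^2$ should be near $0$ (a NumPy sketch; this checks only uncorrelatedness, a necessary consequence of the result, and the parameters, seed, and sample sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 5, 100_000   # n observations per sample, m independent samples

X = rng.normal(loc=1.0, scale=2.0, size=(m, n))
xbar = X.mean(axis=1)        # sample mean of each row
s2 = X.var(axis=1, ddof=1)   # sample variance S^2 of each row

corr = np.corrcoef(xbar, s2)[0, 1]
```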
Problem 6. Using the definitions and notation in Problem 5, prove that for any $n \ge 2$,
\[ \sum_{i=1}^n (X_i - \mu)^2 = (n - 1) S^2 + n (\bar{X} - \mu)^2. \]
Deduce that there exist i.i.d. $W_1, \dots, W_{n-1} \sim N(0, \sigma^2)$ such that
\[ (n - 1) S^2 = \sum_{i=1}^{n-1} W_i^2. \]
Solution. We first observe that
\[ \sum_{i=1}^n (X_i - \mu)^2 = \sum_{i=1}^n \big( (X_i - \bar{X}) + (\bar{X} - \mu) \big)^2. \]
Performing algebra,
\[ \sum_{i=1}^n (X_i - \mu)^2 = \sum_{i=1}^n (X_i - \bar{X})^2 + 2 (\bar{X} - \mu) \sum_{i=1}^n (X_i - \bar{X}) + n (\bar{X} - \mu)^2 = (n - 1) S^2 + n (\bar{X} - \mu)^2, \]
since $\sum_{i=1}^n (X_i - \bar{X}) = 0$.
We now prove the final equation by induction, writing $\bar{X}_n$ and $S_n^2$ for the sample mean and sample variance of $X_1, \dots, X_n$. In the case $n = 2$, we have
\[ S_2^2 = (X_1 - \bar{X}_2)^2 + (X_2 - \bar{X}_2)^2 = \frac{(X_1 - X_2)^2}{2}. \]
Then $W_1 = \frac{X_1 - X_2}{\sqrt{2}} \sim N(0, \sigma^2)$, so that $(2 - 1) S_2^2 = W_1^2$, as required. For the induction step, suppose that the statement holds for $n$. Then, using $\bar{X}_{n+1} = \bar{X}_n + \frac{X_{n+1} - \bar{X}_n}{n+1}$ and some algebra,
\[ n S_{n+1}^2 = (n - 1) S_n^2 + \frac{n}{n + 1} (X_{n+1} - \bar{X}_n)^2. \]
By relabelling the result in Problem 5, we have that $S_n^2$ is independent of $\bar{X}_n$ and of $X_{n+1}$, hence of $X_{n+1} - \bar{X}_n$. By the induction hypothesis, there exist i.i.d. $W_1, \dots, W_{n-1} \sim N(0, \sigma^2)$ such that
\[ (n - 1) S_n^2 = \sum_{i=1}^{n-1} W_i^2. \]
Furthermore, $X_{n+1} - \bar{X}_n \sim N\big( 0, \tfrac{n+1}{n} \sigma^2 \big)$, so that
\[ W_n = \sqrt{\frac{n}{n + 1}} \, (X_{n+1} - \bar{X}_n) \sim N(0, \sigma^2) \]
is independent of $W_1, \dots, W_{n-1}$. Therefore,
\[ n S_{n+1}^2 = \sum_{i=1}^{n-1} W_i^2 + W_n^2 = \sum_{i=1}^n W_i^2, \]
as required.
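The conclusion says $(n-1) S^2 / \sigma^2$ is a sum of $n - 1$ squared standard normals, i.e. $\chi^2_{n-1}$, which has mean $n - 1$ and variance $2(n - 1)$. A Monte Carlo sketch (NumPy; $n$, $\sigma$, the location, the seed, and the sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
n, m, sigma = 6, 200_000, 1.5

X = rng.normal(loc=0.7, scale=sigma, size=(m, n))
# (n-1) S^2 / sigma^2 should be chi-squared with n-1 degrees of freedom.
q = (n - 1) * X.var(axis=1, ddof=1) / sigma ** 2

mean_q = q.mean()   # expect n - 1
var_q = q.var()     # expect 2(n - 1)
```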
—Joel Kindiak, 25 Jul 25, TBCH