A random variable is neither random nor a variable. Let me explain.
Flip a fair coin twice. Let $X$ denote the number of Heads that you obtain. What is the value of $X$? Well, it depends on each outcome in the sample space $\Omega$:

$$\Omega = \{HH, HT, TH, TT\}.$$

Denote $\mathcal{F} = 2^{\Omega}$ and the uniform probability measure on $(\Omega, \mathcal{F})$ by $\mathbb{P}$. Since $X$ denotes the number of Heads, we have

$$X(HH) = 2, \qquad X(HT) = X(TH) = 1, \qquad X(TT) = 0.$$
Notice, then, that the randomness arises in selecting the outcome $\omega \in \Omega$, not in the measurement $X(\omega)$ (though the former inevitably influences the latter). The quantity $X(\omega)$ just records the number of Heads, and each value occurs with some probability (which we will explore later on). Thus, while we call $X$ a random variable, it is neither random (though that is captured by the randomness of $\omega$), nor a variable (though that is captured by the map $X$).
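To make the outcome-versus-measurement distinction concrete, here is a minimal Python sketch of the two-coin setup (the names `omega`, `X`, and `P` are my own, not from the text): the map `X` is a fixed rule, and only the draw of an outcome is random.

```python
from itertools import product
from fractions import Fraction

# Sample space for two fair coin flips: Omega = {HH, HT, TH, TT}.
omega = ["".join(flips) for flips in product("HT", repeat=2)]

# X is a fixed measurement rule: it counts the Heads in an outcome.
X = {outcome: outcome.count("H") for outcome in omega}

# Uniform probability measure: each of the 4 outcomes has probability 1/4.
P = {outcome: Fraction(1, 4) for outcome in omega}

for outcome in omega:
    print(outcome, X[outcome], P[outcome])
```

Nothing about `X` changes from run to run; only the outcome drawn from `omega` would.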
Let $(\Omega, \mathcal{F})$ and $(S, \mathcal{S})$ be measurable spaces.

Definition 1. A map $X : \Omega \to S$ is $(\mathcal{F}, \mathcal{S})$–measurable if for any $B \in \mathcal{S}$, $X^{-1}(B) \in \mathcal{F}$. We omit the prefix $(\mathcal{F}, \mathcal{S})$ when the context is clear.
Lemma 1. Suppose $\mathcal{F} = 2^{\Omega}$. Any map $X : \Omega \to S$ is measurable, where $S$ is equipped with the $\sigma$-algebra $\mathcal{S} = 2^{S}$.

Proof. For any $B \in \mathcal{S}$, the preimage $X^{-1}(B)$ is a subset of $\Omega$, so $X^{-1}(B) \in 2^{\Omega} = \mathcal{F}$. $\blacksquare$
Lemma 2. Suppose furthermore that there exists a probability measure $\mathbb{P}$ on $(\Omega, \mathcal{F})$. Then the map $\mu : \mathcal{S} \to [0, 1]$ given by

$$\mu(B) = \mathbb{P}(X^{-1}(B))$$

is a probability measure on the measurable space $(S, \mathcal{S})$, called the push-forward measure of $\mathbb{P}$ under $X$.
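As a code sketch of Lemma 2 (reusing the two-coin example from the start of the post; all variable names here are illustrative), the push-forward measure of a point mass assignment can be computed by summing $\mathbb{P}(\{\omega\})$ over the outcomes that $X$ maps to each value:

```python
from fractions import Fraction
from collections import defaultdict

omega = ["HH", "HT", "TH", "TT"]            # two fair coin flips
P = {w: Fraction(1, 4) for w in omega}      # uniform measure on Omega
X = {w: w.count("H") for w in omega}        # X = number of Heads

# mu({x}) = P(X^{-1}({x})): add up the mass of every outcome mapped to x.
mu = defaultdict(Fraction)                  # defaultdict starts each mass at 0
for w in omega:
    mu[X[w]] += P[w]

# mu assigns mass 1/4, 1/2, 1/4 to the values 2, 1, 0 respectively,
# and the total mass is still 1, so mu is again a probability measure.
print(dict(mu))
```

Note that `mu` lives entirely on the codomain: once it is computed, `omega` itself is no longer needed, which is exactly the point of the next paragraph.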
Lemmas 1 and 2 are crucial for our purposes: they illustrate that, all things considered, the underlying sample space $\Omega$ is not nearly as relevant as the measures induced on $(S, \mathcal{S})$. Eventually, we will want to develop measures on $\mathbb{R}$ too, though that will take substantially more effort.
Equip $\mathbb{R}$ with the $\sigma$-algebra

$$\mathcal{R} = 2^{\mathbb{R}}.$$

Definition 2. Let $(\Omega, 2^{\Omega}, \mathbb{P})$ be any probability space. A discrete random variable is a map $X : \Omega \to \mathbb{R}$. Since we equipped $\Omega$ with the $\sigma$-algebra $2^{\Omega}$, $X$ is automatically measurable (Lemma 1). We call its push-forward measure $\mu$ the distribution of $X$, and for convenience denote, for $B \in \mathcal{R}$,

$$\mathbb{P}(X \in B) := \mu(B) = \mathbb{P}(X^{-1}(B)).$$

In particular, we write $\mathbb{P}(X = x)$ for $\mu(\{x\})$.
Without loss of generality, given that a discrete random variable $X$ has distribution $\mu$, we assume that $(\Omega, 2^{\Omega}, \mathbb{P}) = (\mathbb{R}, \mathcal{R}, \mu)$ and that with respect to this choice, $X(\omega) = \omega$.
Lemma 3. For any discrete random variable $X$,

$$\sum_{x \in \mathbb{R}} \mathbb{P}(X = x) = 1,$$

where only countably many terms of the sum are nonzero.
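The point masses of a discrete random variable determine every event probability by summation; here is a quick numerical check on the two-coin variable (its values and masses are hard-coded below, carried over from the earlier example):

```python
from fractions import Fraction

# Point masses of the two-coin "number of Heads" variable.
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

# The total mass is 1 ...
total = sum(pmf.values())

# ... and P(X in B) is obtained by summing point masses over B.
B = {1, 2}  # the event "at least one Head"
prob_B = sum(pmf[x] for x in B)

print(total, prob_B)  # 1 and 3/4
```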
The support of a discrete random variable $X$ is $\mathrm{supp}(X) = \{x \in \mathbb{R} : \mathbb{P}(X = x) > 0\}$. The probability mass function or p.m.f. of $X$ is defined by $p_X(x) = \mathbb{P}(X = x)$. The cumulative distribution function or c.d.f. of $X$ is defined by

$$F_X(x) = \mathbb{P}(X \le x) = \sum_{t \in \mathrm{supp}(X),\ t \le x} p_X(t).$$
To illustrate meaningful examples, let’s consider the flipping of a biased coin with parameter $p$. This means the probability of getting ‘Head’ is some fixed value $p \in [0, 1]$, which may or may not be $\frac{1}{2}$.
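Once the point masses are known, the p.m.f. and c.d.f. are straightforward to compute; a sketch for the two-coin variable (the names `pmf` and `cdf` are mine):

```python
from fractions import Fraction

# p.m.f. of the two-coin "number of Heads" variable.
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

def cdf(x):
    """c.d.f.: F(x) = P(X <= x), summing p.m.f. values at support points t <= x."""
    return sum(mass for t, mass in pmf.items() if t <= x)

# The c.d.f. of a discrete random variable is a step function:
# flat between support points, jumping by pmf[t] at each support point t.
print(cdf(-1), cdf(0), cdf(1.5), cdf(2))
```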
Example 1. Consider the usual sample space $\Omega = \{H, T\}$ equipped with the $\sigma$-algebra $2^{\Omega}$. Define the probability measure $\mathbb{P}$ by $\mathbb{P}(\{H\}) = p$, which implies that

$$\mathbb{P}(\emptyset) = 0, \qquad \mathbb{P}(\{T\}) = 1 - p, \qquad \mathbb{P}(\Omega) = 1.$$

Define the random variable $X : \Omega \to \mathbb{R}$ by $X(H) = 1$ and $X(T) = 0$. Then

$$\mathbb{P}(X = 1) = p \qquad \text{and} \qquad \mathbb{P}(X = 0) = 1 - p,$$

so that $\mathbb{P}(X = x) = 0$ for $x \notin \{0, 1\}$.
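Example 1 is easy to simulate; the sketch below draws from a biased coin with the arbitrary choice $p = 0.3$ and checks that the empirical frequency of $X = 1$ approaches $p$ (all names are illustrative):

```python
import random

p = 0.3            # arbitrary bias parameter, chosen for illustration
random.seed(0)     # fix the seed for reproducibility

def flip():
    """One flip of the biased coin: 'H' with probability p, else 'T'."""
    return "H" if random.random() < p else "T"

X = {"H": 1, "T": 0}  # the random variable of Example 1

draws = [X[flip()] for _ in range(100_000)]
print(sum(draws) / len(draws))  # empirical frequency of Heads, close to 0.3
```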
Definition 3. Let $(\Omega, 2^{\Omega}, \mathbb{P})$ be any probability space and $X$ be a discrete random variable. We say that $X$ follows a Bernoulli distribution with parameter $p \in [0, 1]$, denoted $X \sim \mathrm{Ber}(p)$, if

$$\mathbb{P}(X = 1) = p \qquad \text{and} \qquad \mathbb{P}(X = 0) = 1 - p.$$
Example 2. The random variable $X$ defined in Example 1 has the distribution $\mathrm{Ber}(p)$. The random variable $Y := 1 - X$ has the distribution $\mathrm{Ber}(1 - p)$ since

$$\mathbb{P}(Y = 1) = \mathbb{P}(X = 0) = 1 - p \qquad \text{and} \qquad \mathbb{P}(Y = 0) = \mathbb{P}(X = 1) = p.$$
Example 3. The discrete random variable $X$ is uniform on $\{a, a + 1, \ldots, b\}$ given integers $a \le b$ if

$$\mathbb{P}(X = x) = \frac{1}{b - a + 1} \qquad \text{for each } x \in \{a, a + 1, \ldots, b\}.$$
Having properly defined a discrete random variable, a natural question arises: are there ways to create new random variables from old ones? For instance, given two random variables $X$ and $Y$, what is the distribution of their sum $X + Y$? In the special case of Example 2, it is obvious that by construction, $X + Y = 1$. But what if $X$ and $Y$ don’t have any obvious connection?
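As a teaser, the pair from Example 2 can be simulated to see that its sum is degenerate: since $Y = 1 - X$ on every outcome, every draw gives $X + Y = 1$ (the simulation parameters below are arbitrary choices of mine):

```python
import random
from collections import Counter

p = 0.3          # arbitrary bias parameter
random.seed(1)   # reproducibility

counts = Counter()
for _ in range(10_000):
    x = 1 if random.random() < p else 0   # X ~ Ber(p)
    y = 1 - x                             # Y from Example 2
    counts[x + y] += 1

print(counts)  # every one of the 10,000 draws lands on X + Y = 1
```

For a pair without such a built-in connection, the distribution of the sum is genuinely harder to pin down, which is where the next post picks up.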
Next time, we explore the idea of independence of random variables and think about ways to combine them.
—Joel Kindiak, 27 Jun 25, 1300H