Functions of Random Variables#
Today we will talk about the properties of functions of random variables. The lecture will focus on discrete random variables. However, all the results remain valid for continuous random variables.
LOTUS#
Imagine we want to study some function $g(X)$ of a discrete random variable $X$. What is the expectation of $g(X)$?

If we think of $g(X)$ as a new random variable, its expectation can be computed directly from the distribution of $X$:

$$\mathbb{E}[g(X)] = \sum_x g(x)\, p_X(x).$$

So, for example, we can say that

$$\mathbb{E}[X^2] = \sum_x x^2\, p_X(x),$$

or

$$\mathbb{E}[\sin X] = \sum_x \sin(x)\, p_X(x),$$

and so forth.
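As a quick sanity check, here is a minimal sketch of LOTUS in Python. The pmf values are hypothetical, chosen only for illustration:

```python
# LOTUS: E[g(X)] = sum over x of g(x) * p_X(x), for a discrete pmf.
import math

# Hypothetical pmf: X takes values 1, 2, 3 with these probabilities.
pmf = {1: 0.2, 2: 0.5, 3: 0.3}

def lotus(g, pmf):
    """Compute E[g(X)] by summing g(x) * p_X(x) over the support."""
    return sum(g(x) * p for x, p in pmf.items())

e_x2 = lotus(lambda x: x**2, pmf)   # E[X^2] = 1*0.2 + 4*0.5 + 9*0.3 = 4.9
e_sin = lotus(math.sin, pmf)        # E[sin X]
print(e_x2, e_sin)
```

Note that we never needed the distribution of $g(X)$ itself, only the pmf of $X$.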
So what does LOTUS even stand for?
It is called the “Law of the Unconscious Statistician” (LOTUS) because it is so often used without even thinking about it.
Now let’s use LOTUS to build up some more facts.
What is $\mathbb{E}[aX]$ for a constant $a$?

Since $aX$ is a function of $X$, LOTUS gives

$$\mathbb{E}[aX] = \sum_x ax\, p_X(x) = a \sum_x x\, p_X(x) = a\,\mathbb{E}[X].$$

And what is $\mathbb{E}[X + b]$ for a constant $b$? By the same reasoning,

$$\mathbb{E}[X + b] = \sum_x (x + b)\, p_X(x) = \mathbb{E}[X] + b.$$

By combining the above observations we can conclude that the expected value of $aX + b$ is

$$\mathbb{E}[aX + b] = a\,\mathbb{E}[X] + b.$$
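The identity $\mathbb{E}[aX + b] = a\,\mathbb{E}[X] + b$ can be checked exactly on any small pmf; a minimal sketch, assuming an arbitrary illustrative pmf:

```python
# Check E[aX + b] = a*E[X] + b exactly via LOTUS on a small hypothetical pmf.
pmf = {0: 0.1, 1: 0.6, 2: 0.3}   # assumed pmf, for illustration only

def expect(g, pmf):
    """E[g(X)] by LOTUS."""
    return sum(g(x) * p for x, p in pmf.items())

a, b = 3.0, 7.0
mean_x = expect(lambda x: x, pmf)        # E[X] = 1.2
lhs = expect(lambda x: a * x + b, pmf)   # E[aX + b] computed directly
rhs = a * mean_x + b                     # a*E[X] + b
print(lhs, rhs)                          # the two agree
```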
Sums of Random Variables#
With LOTUS, we can now think about sums of random variables.
Linearity of Expectation#
Let $X$ and $Y$ be two random variables. Then

$$\mathbb{E}[X + Y] = \mathbb{E}[X] + \mathbb{E}[Y].$$

Proof.
Since $X + Y$ is a function $g(x, y) = x + y$ of the pair $(X, Y)$, the two-variable form of LOTUS gives

$$\mathbb{E}[X + Y] = \sum_x \sum_y (x + y)\, p_{X,Y}(x, y) = \sum_x \sum_y x\, p_{X,Y}(x, y) + \sum_x \sum_y y\, p_{X,Y}(x, y) = \mathbb{E}[X] + \mathbb{E}[Y].$$

In other words, linearity of expectation says that you only need to know the
marginal distributions of $X$ and $Y$ to compute $\mathbb{E}[X + Y]$.

In particular, it does not matter whether $X$ and $Y$ are independent.

Even if $X$ and $Y$ are dependent, $\mathbb{E}[X + Y] = \mathbb{E}[X] + \mathbb{E}[Y]$ still holds.

An important corollary is this: suppose we have $n$ random variables $X_1, \dots, X_n$.

Then

$$\mathbb{E}\Big[\sum_{i=1}^{n} X_i\Big] = \sum_{i=1}^{n} \mathbb{E}[X_i].$$

That is, if you have a sum of random variables, its expectation is the sum of their expectations, regardless of any dependence among them.
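The fact that dependence does not matter can be seen in an extreme case: take $Y = X$, the most dependent pair possible. A minimal simulation sketch (the die-roll setup is hypothetical):

```python
# Linearity of expectation holds even for strongly dependent variables.
# Here Y = X (a die roll), so X and Y are completely dependent.
import random

random.seed(0)
n = 200_000
xs = [random.randint(1, 6) for _ in range(n)]
ys = xs                                   # Y = X: perfect dependence

mean_sum = sum(x + y for x, y in zip(xs, ys)) / n
mean_x = sum(xs) / n
mean_y = sum(ys) / n
print(mean_sum, mean_x + mean_y)          # both close to 3.5 + 3.5 = 7.0
```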
Example 1#
Use the linearity of expectation to calculate the expected value of the Binomial distribution.
Steps to Solution
Note that a Binomial is the sum of Bernoulli trials.
Determine the expected value of a Bernoulli trial.
Find expected value of Binomial by linearity of expectation of sum of Bernoulli trials.
Solution
The expected value of a Bernoulli($p$) trial $X_i$ is

$$\mathbb{E}[X_i] = 1 \cdot p + 0 \cdot (1 - p) = p.$$

A Binomial($n$, $p$) random variable is the sum $X = X_1 + \cdots + X_n$ of $n$ Bernoulli($p$) trials, so by linearity of expectation

$$\mathbb{E}[X] = \sum_{i=1}^{n} \mathbb{E}[X_i] = np.$$
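The result $\mathbb{E}[X] = np$ can be confirmed by computing the Binomial mean directly from its pmf; a minimal sketch with hypothetical parameters $n = 10$, $p = 0.3$:

```python
# E[Binomial(n, p)] = n*p, checked exactly from the pmf (no simulation needed).
from math import comb

def binomial_mean(n, p):
    """Compute E[X] directly: sum of k * C(n, k) * p^k * (1-p)^(n-k)."""
    return sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))

n, p = 10, 0.3
print(binomial_mean(n, p), n * p)   # both equal 3.0
```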
Example 2#
Suppose two people are playing Roulette together. They each bet on red for the same first three spins. (Note that in Roulette, 18 of the 38 numbers are red.) The first player then leaves, but the second player bets on red for two more spins. How many more times is player 2 expected to win than player 1?
Crucially, note that the number of times player 2 wins is not independent from the number of times player 1 wins, because every time player 1 wins, player 2 also wins.
Solution
Let $X$ be the number of times player 1 wins and $Y$ be the number of times player 2 wins. We want $\mathbb{E}[Y - X]$.

What is the distribution of $X$? Each spin lands on red with probability $18/38$, so $X \sim \text{Binomial}(3, 18/38)$.

Similarly, $Y \sim \text{Binomial}(5, 18/38)$.

How do we calculate $\mathbb{E}[Y - X]$? By linearity of expectation, $\mathbb{E}[Y - X] = \mathbb{E}[Y] - \mathbb{E}[X]$, even though $X$ and $Y$ are dependent.

We just showed that the expected value of a Binomial is $np$, so

$$\mathbb{E}[Y - X] = 5 \cdot \frac{18}{38} - 3 \cdot \frac{18}{38} = \frac{36}{38} \approx 0.95.$$

Player 2 is expected to win about $0.95$ more times than player 1.
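A simulation sketch of this setup, in which the two players share the first three spins (exactly the dependence noted above):

```python
# Simulate the roulette example. Player 1 bets on red for 3 spins; player 2
# bets on those same 3 spins plus 2 more, so the win counts are dependent.
import random

random.seed(1)
P_RED = 18 / 38
trials = 200_000
total_diff = 0
for _ in range(trials):
    spins = [random.random() < P_RED for _ in range(5)]   # True = red
    wins1 = sum(spins[:3])    # player 1: first 3 spins only
    wins2 = sum(spins)        # player 2: all 5 spins (shares the first 3)
    total_diff += wins2 - wins1

print(total_diff / trials)    # close to 2 * 18/38, roughly 0.947
```

The dependence does not matter for the expectation: only the two extra spins contribute to the average difference.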
Variance and Covariance#
Using linearity of expectation we can prove some useful equations for calculating variance and covariance.
Consider the variance of $X$, defined as $\mathrm{Var}(X) = \mathbb{E}\big[(X - \mathbb{E}[X])^2\big]$. Expanding the square and applying linearity of expectation (remember that $\mathbb{E}[X]$ is a constant):

$$\mathrm{Var}(X) = \mathbb{E}\big[X^2 - 2X\,\mathbb{E}[X] + \mathbb{E}[X]^2\big] = \mathbb{E}[X^2] - \mathbb{E}[X]^2.$$

This formula is often the easiest way to compute a variance: it only requires computing the means of $X$ and $X^2$.

From this fact we can also conclude that:

$$\mathrm{Cov}(X, Y) = \mathbb{E}\big[(X - \mathbb{E}[X])(Y - \mathbb{E}[Y])\big] = \mathbb{E}[XY] - \mathbb{E}[X]\,\mathbb{E}[Y].$$

This is one of the most useful results of the linearity of expectation.
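Both shortcut formulas can be verified exactly on a small joint pmf; a minimal sketch (the joint pmf is hypothetical):

```python
# Check Var(X) = E[X^2] - E[X]^2 and Cov(X,Y) = E[XY] - E[X]E[Y]
# exactly, on a small hypothetical joint pmf of (X, Y).
joint = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}

def E(g):
    """E[g(X, Y)] by two-variable LOTUS over the joint pmf."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

mean_x, mean_y = E(lambda x, y: x), E(lambda x, y: y)
var_x_def = E(lambda x, y: (x - mean_x) ** 2)           # definition
var_x_short = E(lambda x, y: x * x) - mean_x ** 2       # shortcut formula
cov_def = E(lambda x, y: (x - mean_x) * (y - mean_y))   # definition
cov_short = E(lambda x, y: x * y) - mean_x * mean_y     # shortcut formula
print(var_x_def, var_x_short, cov_def, cov_short)
```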
Variance of a Sum#
Let’s keep using these facts to explore how the variance of a sum works.
Consider two random variables $X$ and $Y$.

What is $\mathrm{Var}(X + Y)$? Expanding $(X + Y)^2$ and applying linearity of expectation:

$$\mathrm{Var}(X + Y) = \mathbb{E}\big[(X + Y)^2\big] - \mathbb{E}[X + Y]^2 = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{Cov}(X, Y).$$
So we see that when adding random variables, there is a correction to the variance: if the variables are positively correlated, then the variance of their sum is greater than the sum of their variances.
The amount of this correction is twice the covariance.
There’s another important consequence of this result. The covariance of two independent random variables is $0$, so for independent $X$ and $Y$:

$$\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y).$$
When adding independent random variables, variances sum.
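A minimal exact check of the sum-of-variances identity, on a hypothetical joint pmf whose coordinates are positively correlated:

```python
# Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y), checked on a hypothetical
# joint pmf where X and Y are positively correlated.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def E(g):
    """E[g(X, Y)] by two-variable LOTUS over the joint pmf."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

var_x = E(lambda x, y: x * x) - E(lambda x, y: x) ** 2
var_y = E(lambda x, y: y * y) - E(lambda x, y: y) ** 2
cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
var_sum = E(lambda x, y: (x + y) ** 2) - E(lambda x, y: x + y) ** 2
print(var_sum, var_x + var_y + 2 * cov)   # equal, and larger than var_x + var_y
```

Because the covariance here is positive, the variance of the sum exceeds the sum of the variances, as the text describes.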
So, consider the case where we are summing $n$ independent random variables $X_1, \dots, X_n$, each with mean $\mu$ and variance $\sigma^2$.

Then the sum has mean $n\mu$ and variance $n\sigma^2$:

$$\mathbb{E}\Big[\sum_{i=1}^{n} X_i\Big] = n\mu, \qquad \mathrm{Var}\Big(\sum_{i=1}^{n} X_i\Big) = n\sigma^2.$$
Variance of $aX + b$#
Finally, if $a$ and $b$ are constants, then the shift $b$ does not change the spread of $X$ at all, while the scale factor $a$ enters squared:

$$\mathrm{Var}(aX + b) = a^2\,\mathrm{Var}(X).$$
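A standard fact here is $\mathrm{Var}(aX + b) = a^2\,\mathrm{Var}(X)$; a minimal exact check on a hypothetical pmf:

```python
# Var(aX + b) = a^2 * Var(X): the shift b drops out, the scale a is squared.
pmf = {0: 0.25, 1: 0.5, 2: 0.25}   # hypothetical pmf, for illustration

def E(g):
    """E[g(X)] by LOTUS."""
    return sum(g(x) * p for x, p in pmf.items())

a, b = 4.0, 10.0
var_x = E(lambda x: x * x) - E(lambda x: x) ** 2                        # Var(X)
var_ax_b = E(lambda x: (a * x + b) ** 2) - E(lambda x: a * x + b) ** 2  # Var(aX+b)
print(var_ax_b, a * a * var_x)   # equal; b has no effect on the variance
```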