Introductory Probability: Random Variables and Expectation

Let us begin with an example.

Example. Consider an experiment of tossing 3 fair coins. Let $X$ denote the number of heads appearing. Then $X$ takes on one of the values 0, 1, 2, 3 with respective probabilities: \begin{align*}P_r\{X=0\}&=P_r\{(T,T,T)\}=\frac{1}{8}\\P_r\{X=1\}&=P_r\{(T,T,H),(T,H,T),(H,T,T)\}=\frac{3}{8}\\P_r\{X=2\}&=P_r\{(T,H,H),(H,T,H),(H,H,T)\}=\frac{3}{8}\\P_r\{X=3\}&=P_r\{(H,H,H)\}=\frac{1}{8}\end{align*} This $X$ is an example of what is called a random variable. Random variables are real-valued functions defined on the sample space $S$, i.e. $X: S\longrightarrow\mathbb{R}$. In this example, $X$ is the function $X: S\longrightarrow\{0,1,2,3\}$ with $S=\{(T,T,T),(T,T,H),(T,H,T),(H,T,T),(T,H,H),(H,T,H),(H,H,T),(H,H,H)\}$ defined by the number of heads appearing for each outcome. In probability, we are often more interested in the values of a random variable than in the outcomes of the experiment itself.
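
As a quick sanity check, the probabilities above can be computed by brute force. The following Python sketch (my own illustration, not part of the original example) enumerates all $2^3=8$ equally likely outcomes and tallies $P_r\{X=i\}$ for each value $i$.

```python
# Illustrative sketch: enumerate the 8 equally likely outcomes of tossing
# 3 fair coins and tally P_r{X = i} for X = number of heads.
from itertools import product
from collections import Counter
from fractions import Fraction

S = list(product("HT", repeat=3))                      # the sample space, 8 outcomes
counts = Counter(outcome.count("H") for outcome in S)  # how many outcomes give i heads
probs = {i: Fraction(counts[i], len(S)) for i in sorted(counts)}
print(probs)  # {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}
```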

Remark. In more traditional mathematical notation, $P_r\{X=i\}$ is denoted by $P_r\{X^{-1}(i)\}$, which means $$P_r\{X^{-1}(i)\}=P_r\{s\in S: X(s)=i\}$$ In probability, however, $P_r\{X=i\}$ is the commonly used notation.

Here is another example.

Example. Three marbles are randomly selected from a jar containing 3 white, 3 red, and 5 black marbles. Suppose that we win \$1 for each white marble selected and lose \$1 for each red marble selected. Let $X$ denote the total winnings from the experiment. Then $X$ is a random variable taking on the values $0,\pm 1,\pm 2,\pm3$ with respective probabilities: \begin{align*}P_r\{X=0\}&=\frac{\begin{pmatrix}5\\3\end{pmatrix}+\begin{pmatrix}3\\1\end{pmatrix}\begin{pmatrix}3\\1\end{pmatrix}\begin{pmatrix}5\\1\end{pmatrix}}{\begin{pmatrix}11\\3\end{pmatrix}}=\frac{55}{165}\\P_r\{X=1\}=P_r\{X=-1\}&=\frac{\begin{pmatrix}3\\1\end{pmatrix}\begin{pmatrix}5\\2\end{pmatrix}+\begin{pmatrix}3\\2\end{pmatrix}\begin{pmatrix}3\\1\end{pmatrix}}{\begin{pmatrix}11\\3\end{pmatrix}}=\frac{39}{165}\\P_r\{X=2\}=P_r\{X=-2\}&=\frac{\begin{pmatrix}3\\2\end{pmatrix}\begin{pmatrix}5\\1\end{pmatrix}}{\begin{pmatrix}11\\3\end{pmatrix}}=\frac{15}{165}\\P_r\{X=3\}=P_r\{X=-3\}&=\frac{\begin{pmatrix}3\\3\end{pmatrix}}{\begin{pmatrix}11\\3\end{pmatrix}}=\frac{1}{165}\end{align*} The probability that we win money is $$\sum_{i=1}^3P_r\{X=i\}=\frac{55}{165}=\frac{1}{3}$$
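
These counts are also easy to verify by exhaustive enumeration. The sketch below (an illustration I am adding; the variable names are mine) lists all $\begin{pmatrix}11\\3\end{pmatrix}=165$ equally likely draws and tallies the winnings.

```python
# Illustrative sketch: enumerate all 165 equally likely 3-marble draws and
# tally the winnings X = (# white drawn) - (# red drawn).
from itertools import combinations
from collections import Counter
from fractions import Fraction

jar = ["W"] * 3 + ["R"] * 3 + ["B"] * 5
draws = list(combinations(range(len(jar)), 3))     # 165 draws of distinct marbles
value = {"W": 1, "R": -1, "B": 0}
counts = Counter(sum(value[jar[i]] for i in draw) for draw in draws)
probs = {x: Fraction(counts[x], len(draws)) for x in sorted(counts)}
print(probs[1], probs[2], probs[3])                # 13/55 1/11 1/165 (i.e. 39/165, 15/165, 1/165)
print(sum(probs[x] for x in (1, 2, 3)))            # 1/3, the probability of winning money
```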

Definition. Let $\mathrm{PMF}_X(i):=P_r\{X=i\}$. $\mathrm{PMF}_X(i)$ is called the probability mass function for random variable $X$.

Definition. Let $\mathrm{CDF}_X(i):=P_r\{X\leq i\}$. $\mathrm{CDF}_X(i)$ is called the cumulative distribution function and it describes the probability that the value of a random variable is below a specified number.

$\mathrm{CDF}_X(i)$ can be expressed as $$\mathrm{CDF}_X(i)=\sum_{j\leq i}\mathrm{PMF}_X(j)$$ i.e. it accumulates the probability assigned by the probability mass function to all values up to $i$, hence the name cumulative distribution function.
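
In code, this relation is just an accumulation over the probability mass function. Here is a minimal sketch (the helper name `cdf` and the dictionary `pmf_heads` are mine, not from the text), applied to the coin-tossing example above.

```python
# Illustrative sketch: CDF_X(i) as the accumulated mass of PMF_X(j) for j <= i.
from fractions import Fraction

def cdf(pmf, i):
    """P_r{X <= i} for a pmf given as a dict {value: probability}."""
    return sum(p for j, p in pmf.items() if j <= i)

pmf_heads = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}
print(cdf(pmf_heads, 1))   # 1/2, the probability of at most one head in three tosses
```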

Example. How likely is it that it will take no more than 3 flips for a coin to land on heads?

Solution. Let $X$ denote the number of flips needed for the coin to land on heads for the first time. Then the question asks for $P_r\{X\leq 3\}$, i.e. $\mathrm{CDF}_X(3)$. Since the first head appears on flip $j$ exactly when the first $j-1$ flips are tails and the $j$-th flip is heads, $\mathrm{PMF}_X(j)=\left(\frac{1}{2}\right)^j$. Hence \begin{align*}\mathrm{CDF}_X(3)&=\sum_{j=1}^3\mathrm{PMF}_X(j)\\&=\sum_{j=1}^3\left(\frac{1}{2}\right)^j\\&=\frac{1}{2}+\frac{1}{4}+\frac{1}{8}=\frac{7}{8}\end{align*}
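
As a quick check of this computation (an illustration, not part of the original solution): the first head appears on flip $j$ with probability $\left(\frac{1}{2}\right)^j$, and summing over $j=1,2,3$ gives $\frac{7}{8}$.

```python
# Illustrative check: P_r{X <= 3} for X = number of flips until the first head.
from fractions import Fraction

print(sum(Fraction(1, 2) ** j for j in range(1, 4)))   # 7/8
```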

Definition. The expected value of a random variable $X$, denoted by $E(X)$, is the weighted average of its possible values weighted according to their probabilities. More specifically, $$E(X)=\sum_i\mathrm{PMF}_X(i)\cdot i=\sum_i P_r\{X=i\}\cdot i$$ The expected value is also called the expectation or the mean.
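
The definition translates directly into code. Below is a minimal sketch (the helper name `expectation` is mine), applied to the PMF of the number of heads in three coin tosses from the first example.

```python
# Illustrative sketch: E(X) as the probability-weighted average of the values of X.
from fractions import Fraction

def expectation(pmf):
    """E(X) for a pmf given as a dict {value: probability}."""
    return sum(p * i for i, p in pmf.items())

pmf_heads = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}
print(expectation(pmf_heads))   # 3/2, the expected number of heads in three tosses
```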

Example. What is the expected value of a roll of a die?

Solution. The random variable $X$ takes on the values 1, 2, 3, 4, 5, 6 and each of these values has equal probability $\frac{1}{6}$. Hence, the expected value of $X$ is $$E(X)=\sum_{i=1}^6\frac{1}{6}i=\frac{1}{6}\frac{6\cdot 7}{2}=3.5$$ Here I used the formula $$1+2+3+\cdots+n=\frac{n(n+1)}{2}$$ to calculate $1+2+3+4+5+6$.

Example. If a die is rolled three times, how many distinct values are expected to appear?

Solution. Let $X$ denote the number of distinct values. There are $6^3=216$ equally likely outcomes of this experiment. Calculating $P_r\{X=1\}$ and $P_r\{X=3\}$ is straightforward: all three rolls show the same value in $\begin{pmatrix}6\\1\end{pmatrix}=6$ outcomes, and they show three different values in $\begin{pmatrix}6\\3\end{pmatrix}3!=120$ outcomes (choose the three values, then arrange them in any of the $3!$ orders). \begin{align*}P_r\{X=1\}&=\frac{6}{6^3}=\frac{1}{36}\\P_r\{X=3\}&=\frac{120}{6^3}=\frac{5}{9}\end{align*} $P_r\{X=2\}$ can then be found by $$P_r\{X=2\}=1-P_r\{X=1\}-P_r\{X=3\}=1-\frac{1}{36}-\frac{5}{9}=\frac{5}{12}$$ The expected value is then \begin{align*}E(X)&=P_r\{X=1\}\cdot 1+P_r\{X=2\}\cdot 2+P_r\{X=3\}\cdot 3\\&=\frac{1}{36}+\frac{2\cdot 5}{12}+\frac{3\cdot 5}{9}=\frac{91}{36}\approx 2.53\end{align*}
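
Since the sample space has only 216 outcomes, the figures above can be confirmed by exhaustive enumeration; here is one way to do it (an illustrative sketch, not part of the original solution).

```python
# Illustrative sketch: enumerate all 6^3 rolls and tally the number of distinct values.
from itertools import product
from collections import Counter
from fractions import Fraction

rolls = list(product(range(1, 7), repeat=3))
counts = Counter(len(set(r)) for r in rolls)
probs = {k: Fraction(counts[k], len(rolls)) for k in sorted(counts)}
print(probs)                                   # {1: Fraction(1, 36), 2: Fraction(5, 12), 3: Fraction(5, 9)}
print(sum(p * k for k, p in probs.items()))    # 91/36
```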

Example. Let $X$ denote the number of heads appearing in a sequence of 10 flips of a coin. What is $E(X)$?

Solution. For any $i=0,1,\cdots,10$, there are $\begin{pmatrix}10\\i\end{pmatrix}$ sequences of 10 flips, each of which contains $i$ heads. Each such sequence happens with probability $\left(\frac{1}{2}\right)^{10}$. Hence, \begin{align*}E(X)&=\sum_{i=0}^{10}P_r\{X=i\}\cdot i\\&=\sum_{i=1}^{10}P_r\{X=i\}\cdot i\\&=\sum_{i=1}^{10}\begin{pmatrix}10\\i\end{pmatrix}\left(\frac{1}{2}\right)^{10}\cdot i\\&=\left(\frac{1}{2}\right)^{10}\cdot 10\cdot 2^9\\&=5\end{align*} To obtain the second-to-last line, I used the identity \begin{equation}\label{eq:binom2}\sum_{k=1}^n\begin{pmatrix}n\\k\end{pmatrix}k=n2^{n-1}\end{equation} The identity \eqref{eq:binom2} can be seen as follows. \begin{align*}\sum_{k=1}^n\begin{pmatrix}n\\k\end{pmatrix}k&=\sum_{k=1}^n\frac{n!}{(k-1)!(n-k)!}\\&=n\sum_{k=1}^n\frac{(n-1)!}{(k-1)!(n-1-(k-1))!}\\&=n\sum_{k=1}^n\begin{pmatrix}n-1\\k-1\end{pmatrix}\\&=n2^{n-1}\end{align*} The last equality holds because $\sum_{j=0}^{n-1}\begin{pmatrix}n-1\\j\end{pmatrix}=2^{n-1}$, which follows from expanding $(1+1)^{n-1}$ by the binomial theorem.
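
Both the identity and the final answer are easy to verify numerically. The short check below is my own illustration, using Python's `math.comb`.

```python
# Illustrative check of sum_k k*C(n,k) = n*2^(n-1) and of E(X) = 5 for n = 10 flips.
from math import comb

n = 10
print(sum(comb(n, k) * k for k in range(1, n + 1)), n * 2 ** (n - 1))   # 5120 5120
print(sum(comb(n, i) * i for i in range(n + 1)) / 2 ** n)               # 5.0
```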

While the expected value $E(X)$ provides useful information as the weighted average of the possible values of $X$, it does not provide information on the spread of these values. The variance provides such information.

Definition. If $X$ is a random variable with expected value $E(X)$, then the variance of $X$, denoted by $\mathrm{Var}(X)$, is defined by $$\mathrm{Var}(X)=E[(X-E(X))^2]$$

To simplify our calculation, let us denote $E(X)$ by $\mu$. Then \begin{align*}\mathrm{Var}(X)&=E[(X-\mu)^2]\\&=\sum_iP_r\{X=i\}(i-\mu)^2\end{align*} This last expression shows that $\mathrm{Var}(X)$ measures how far, on average, $X$ is from its expected value. Let us continue further from the last expression above. \begin{align*}\sum_iP_r\{X=i\}(i-\mu)^2&=\sum_iP_r\{X=i\}(i^2-2\mu i+\mu^2)\\&=\sum_iP_r\{X=i\}i^2-2\mu\sum_iP_r\{X=i\}i+\mu^2\sum_iP_r\{X=i\}\\&=E(X^2)-2\mu^2+\mu^2\\&=E(X^2)-\mu^2\\&=E(X^2)-[E(X)]^2\end{align*} So we have an alternative formula for the variance \begin{equation}\label{eq:variance}\mathrm{Var}(X)=E(X^2)-[E(X)]^2\end{equation}
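
Both forms of the variance are straightforward to compute from a probability mass function; the sketch below (the helper name `variance` is mine) checks that they agree on the number of heads in three coin tosses.

```python
# Illustrative sketch: Var(X) from the definition E[(X - mu)^2] and from the
# shortcut E(X^2) - E(X)^2; the two must agree.
from fractions import Fraction

def variance(pmf):
    mu = sum(p * i for i, p in pmf.items())
    direct = sum(p * (i - mu) ** 2 for i, p in pmf.items())
    shortcut = sum(p * i ** 2 for i, p in pmf.items()) - mu ** 2
    assert direct == shortcut
    return direct

pmf_heads = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}
print(variance(pmf_heads))   # 3/4, the variance of the number of heads in three tosses
```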

Example. Calculate $\mathrm{Var}(X)$ if $X$ represents the outcome when a die is rolled.

Solution. First, \begin{align*}E(X)&=1\cdot\frac{1}{6}+2\cdot\frac{1}{6}+3\cdot\frac{1}{6}+4\cdot\frac{1}{6}+5\cdot\frac{1}{6}+6\cdot\frac{1}{6}\\&=\frac{1}{6}\frac{6\cdot 7}{2}=\frac{7}{2}\end{align*} Next, \begin{align*}E(X^2)&=1^2\cdot\frac{1}{6}+2^2\cdot\frac{1}{6}+3^2\cdot\frac{1}{6}+4^2\cdot\frac{1}{6}+5^2\cdot\frac{1}{6}+6^2\cdot\frac{1}{6}\\&=\frac{1}{6}\frac{6\cdot 7\cdot 13}{6}=\frac{91}{6}\end{align*} Here the sum $1^2+2^2+\cdots+6^2=\frac{6\cdot 7\cdot 13}{6}=91$ is obtained from the formula $$1^2+2^2+3^2+\cdots+n^2=\frac{n(n+1)(2n+1)}{6}$$ Therefore, by \eqref{eq:variance}, $$\mathrm{Var}(X)=\frac{91}{6}-\left(\frac{7}{2}\right)^2=\frac{35}{12}$$

Remarks.

  1. In mechanics, the center of gravity of a system of particles is indeed the expected value of the position coordinates of particles. Also the moment of inertia of a body is the variance of the position coordinates of particles that constitute the body.
  2. $\mathrm{SD}(X)=\sqrt{\mathrm{Var}(X)}$ is called the standard deviation of $X$.

I will conclude this note with the following useful identities.

Theorem. Let $a$ and $b$ be constants. Then

  1. $E(aX+b)=aE(X)+b$. As a special case, if $b=0$, we obtain $E(aX)=aE(X)$.
  2. $\mathrm{Var}(aX+b)=a^2\mathrm{Var}(X)$.

Proof.

  1. \begin{align*}E(aX+b)&=\sum_iP_r\{X=i\}(ai+b)\\&=a\sum_iP_r\{X=i\}i+b\sum_iP_r\{X=i\}\\&=aE(X)+b\end{align*}
  2. For simplicity, let $\mu=E(X)$. Then by part 1 above, $E(aX+b)=a\mu+b$. Now, \begin{align*}\mathrm{Var}(aX+b)&=E[(aX+b-a\mu-b)^2]\\&=E[a^2(X-\mu)^2]\\&=a^2E[(X-\mu)^2]\\&=a^2\mathrm{Var}(X)\end{align*}
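
Here is a quick numerical check of both identities (an illustrative sketch; the choice $a=3$, $b=2$ and the fair-die distribution are just an example).

```python
# Illustrative check of E(aX + b) = aE(X) + b and Var(aX + b) = a^2 Var(X)
# using the roll of a fair die and a = 3, b = 2.
from fractions import Fraction

def E(pmf):
    return sum(p * i for i, p in pmf.items())

def Var(pmf):
    return sum(p * i ** 2 for i, p in pmf.items()) - E(pmf) ** 2

die = {i: Fraction(1, 6) for i in range(1, 7)}
a, b = 3, 2
shifted = {a * i + b: p for i, p in die.items()}
print(E(shifted), a * E(die) + b)          # 25/2 25/2
print(Var(shifted), a ** 2 * Var(die))     # 105/4 105/4
```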

