Introductory Probability: Random Variables and Expectation

Let us begin with an example.

Example. Consider an experiment of tossing 3 fair coins. Let $X$ denote the number of heads appearing. Then $X$ takes on one of the values 0, 1, 2, 3 with respective probabilities: \begin{align*}P_r\{X=0\}&=P_r\{(T,T,T)\}=\frac{1}{8}\\P_r\{X=1\}&=P_r\{(T,T,H),(T,H,T),(H,T,T)\}=\frac{3}{8}\\P_r\{X=2\}&=P_r\{(T,H,H),(H,T,H),(H,H,T)\}=\frac{3}{8}\\P_r\{X=3\}&=P_r\{(H,H,H)\}=\frac{1}{8}\end{align*} This $X$ is an example of what is called a random variable. Random variables are real-valued functions defined on the sample space $S$, i.e. $X: S\longrightarrow\mathbb{R}$. In this example, $X$ is the function $X: S\longrightarrow\{0,1,2,3\}$ with $S=\{(T,T,T),(T,T,H),(T,H,T),(H,T,T),(T,H,H),(H,T,H),(H,H,T),(H,H,H)\}$ defined by the number of heads appearing for each outcome. In probability, we are often more interested in the values of a random variable than in the individual outcomes of an experiment.
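
As a sanity check, here is a minimal Python sketch (the variable names are mine, purely illustrative) that enumerates the sample space $S$ and tallies the values of $X$:

```python
from itertools import product
from fractions import Fraction
from collections import Counter

outcomes = list(product("HT", repeat=3))          # the sample space S, 8 outcomes
counts = Counter(s.count("H") for s in outcomes)  # X(s) = number of heads in s

pmf = {i: Fraction(counts[i], len(outcomes)) for i in sorted(counts)}
print(pmf)  # X = 0, 1, 2, 3 with probabilities 1/8, 3/8, 3/8, 1/8
```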

Remark. In more traditional mathematical notation, $P_r\{X=i\}$ is denoted by $P_r\{X^{-1}(i)\}$, which means $$P_r\{X^{-1}(i)\}=P_r\{s\in S: X(s)=i\}$$ But in probability, $P_r\{X=i\}$ is the commonly used notation.

Here is another example.

Example. Three marbles are randomly selected from a jar containing 3 white, 3 red, and 5 black marbles. Suppose that we win \$1 for each white marble selected and lose \$1 for each red marble selected. Let $X$ denote the total winnings from the experiment. Then $X$ is a random variable taking on the values $0,\pm 1,\pm 2,\pm3$ with respective probabilities: \begin{align*}P_r\{X=0\}&=\frac{\begin{pmatrix}5\\3\end{pmatrix}+\begin{pmatrix}3\\1\end{pmatrix}\begin{pmatrix}3\\1\end{pmatrix}\begin{pmatrix}5\\1\end{pmatrix}}{\begin{pmatrix}11\\3\end{pmatrix}}=\frac{55}{165}\\P_r\{X=1\}=P_r\{X=-1\}&=\frac{\begin{pmatrix}3\\1\end{pmatrix}\begin{pmatrix}5\\2\end{pmatrix}+\begin{pmatrix}3\\2\end{pmatrix}\begin{pmatrix}3\\1\end{pmatrix}}{\begin{pmatrix}11\\3\end{pmatrix}}=\frac{39}{165}\\P_r\{X=2\}=P_r\{X=-2\}&=\frac{\begin{pmatrix}3\\2\end{pmatrix}\begin{pmatrix}5\\1\end{pmatrix}}{\begin{pmatrix}11\\3\end{pmatrix}}=\frac{15}{165}\\P_r\{X=3\}=P_r\{X=-3\}&=\frac{\begin{pmatrix}3\\3\end{pmatrix}}{\begin{pmatrix}11\\3\end{pmatrix}}=\frac{1}{165}\end{align*} The probability that we win money is $$\sum_{i=1}^3P_r\{X=i\}=\frac{55}{165}=\frac{1}{3}$$
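
Since the sample space is small, this table is also easy to confirm by brute force. The following sketch (the marble encoding is my own) enumerates all $\begin{pmatrix}11\\3\end{pmatrix}$ draws:

```python
from itertools import combinations
from fractions import Fraction
from collections import Counter

jar = ["W"] * 3 + ["R"] * 3 + ["B"] * 5                  # 3 white, 3 red, 5 black
wins = Counter(d.count("W") - d.count("R") for d in combinations(jar, 3))
total = sum(wins.values())                               # C(11,3) = 165

pmf = {x: Fraction(wins[x], total) for x in sorted(wins)}
print(pmf[0])                                  # 55/165 = 1/3
print(sum(p for x, p in pmf.items() if x > 0))  # Pr{we win money} = 1/3
```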

Definition. Let $\mathrm{PMF}_X(i):=P_r\{X=i\}$. $\mathrm{PMF}_X(i)$ is called the probability mass function for random variable $X$.

Definition. Let $\mathrm{CDF}_X(i):=P_r\{X\leq i\}$. $\mathrm{CDF}_X(i)$ is called the cumulative distribution function and it describes the probability that the value of a random variable is below a specified number.

$\mathrm{CDF}_X(i)$ can be expressed as $$\mathrm{CDF}_X(i)=\sum_{j\leq i}\mathrm{PMF}_X(j)$$ i.e. it accumulates the probability described by the probability mass function over all values up to $i$, hence the name cumulative distribution function.
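
For instance, the CDF of the 3-coin random variable above can be accumulated from its PMF like so (a small illustrative sketch):

```python
from fractions import Fraction
from itertools import accumulate

pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}
cdf = dict(zip(sorted(pmf), accumulate(pmf[i] for i in sorted(pmf))))
print(cdf)  # CDF values 1/8, 1/2, 7/8, 1 at i = 0, 1, 2, 3
```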

Example. How likely is it that it will take no more than 3 flips for a coin to land on heads?

Solution. Let $X$ denote the number of flips it takes for the coin to land on heads for the first time. Then the question is asking for $P_r\{X\leq 3\}$, i.e. $\mathrm{CDF}_X(3)$. Since $P_r\{X=j\}=\left(\frac{1}{2}\right)^j$ (the first $j-1$ flips land on tails and the $j$th flip lands on heads), \begin{align*}\mathrm{CDF}_X(3)&=\sum_{j=1}^3\mathrm{PMF}_X(j)\\&=\sum_{j=1}^3 P_r\{X=j\}\\&=\frac{1}{2}+\frac{1}{4}+\frac{1}{8}=\frac{7}{8}\end{align*}

Definition. The expected value of a random variable $X$, denoted by $E(X)$, is the weighted average of its possible values weighted according to their probabilities. More specifically, $$E(X)=\sum_i\mathrm{PMF}_X(i)\cdot i=\sum_i P_r\{X=i\}\cdot i$$ The expected value is also called the expectation or the mean.

Example. What is the expected value of a roll of a die?

Solution. The random variable $X$ takes on values 1, 2, 3, 4, 5, 6 and each of these values has equal probability $\frac{1}{6}$. Hence, the expected value of $X$ is $$E(X)=\sum_{i=1}^6\frac{1}{6}i=\frac{1}{6}\frac{6\cdot 7}{2}=3.5$$ Here I used the formula $$1+2+3+\cdots+n=\frac{n(n+1)}{2}$$ to calculate $1+2+3+4+5+6$.

Example. If a die is rolled three times, how many distinct values are expected to appear?

Solution. Let $X$ denote the number of distinct values. There are $6^3=216$ outcomes of this experiment. Calculating $P_r\{X=1\}$ and $P_r\{X=3\}$ is straightforward. For $X=1$, all three rolls must show the same value, and there are 6 such outcomes. For $X=3$, the three rolls must all differ, and since the rolls are ordered, there are $6\cdot 5\cdot 4=120$ such outcomes. \begin{align*}P_r\{X=1\}&=\frac{6}{6^3}=\frac{1}{36}\\P_r\{X=3\}&=\frac{6\cdot 5\cdot 4}{6^3}=\frac{5}{9}\end{align*} $P_r\{X=2\}$ can be found by $$P_r\{X=2\}=1-P_r\{X=1\}-P_r\{X=3\}=1-\frac{1}{36}-\frac{5}{9}=\frac{5}{12}$$ The expected value is then \begin{align*}E(X)&=P_r\{X=1\}\cdot 1+P_r\{X=2\}\cdot 2+P_r\{X=3\}\cdot 3\\&=\frac{1}{36}+\frac{5\cdot 2}{12}+\frac{5\cdot 3}{9}=\frac{91}{36}\approx 2.53\end{align*}
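
These numbers can be confirmed by exact enumeration of all $6^3$ outcomes; here is a short Python sketch (the names are mine):

```python
from itertools import product
from fractions import Fraction

rolls = list(product(range(1, 7), repeat=3))      # all 6^3 = 216 outcomes
pmf = {k: Fraction(sum(1 for r in rolls if len(set(r)) == k), len(rolls))
       for k in (1, 2, 3)}
print(pmf)                                        # 1/36, 5/12, 5/9
print(sum(k * p for k, p in pmf.items()))         # E(X) = 91/36, about 2.53
```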

Example. Let $X$ denote the number of heads appearing in a sequence of 10 flips of a coin. What is $E(X)$?

Solution. For any $i=0,1,\cdots,10$, there are $\begin{pmatrix}10\\i\end{pmatrix}$ sequences of 10 flips, each of which contains $i$ heads. Each such sequence happens with probability $\left(\frac{1}{2}\right)^{10}$. Hence, \begin{align*}E(X)&=\sum_{i=0}^{10}P_r\{X=i\}\cdot i\\&=\sum_{i=1}^{10}P_r\{X=i\}\cdot i\\&=\sum_{i=1}^{10}\begin{pmatrix}10\\i\end{pmatrix}\left(\frac{1}{2}\right)^{10}\cdot i\\&=\left(\frac{1}{2}\right)^{10}\cdot 10\cdot 2^9\\&=5\end{align*} For the second-to-last line, I used the identity \begin{equation}\label{eq:binom2}\sum_{k=1}^n\begin{pmatrix}n\\k\end{pmatrix}k=n2^{n-1}\end{equation} The equation \eqref{eq:binom2} can be seen easily: \begin{align*}\sum_{k=1}^n\begin{pmatrix}n\\k\end{pmatrix}k&=\sum_{k=1}^n\frac{n!}{(k-1)!(n-k)!}\\&=n\sum_{k=1}^n\frac{(n-1)!}{(k-1)!(n-1-(k-1))!}\\&=n\sum_{k=1}^n\begin{pmatrix}n-1\\k-1\end{pmatrix}\\&=n2^{n-1}\end{align*} The last equality follows from the binomial theorem: setting $x=y=1$ in $(x+y)^{n-1}=\sum_{j=0}^{n-1}\begin{pmatrix}n-1\\j\end{pmatrix}x^jy^{n-1-j}$ gives $\sum_{j=0}^{n-1}\begin{pmatrix}n-1\\j\end{pmatrix}=2^{n-1}$.
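
The identity \eqref{eq:binom2} is also easy to spot-check numerically, e.g. with Python's math.comb (available in Python 3.8+):

```python
from math import comb

for n in range(1, 11):
    assert sum(comb(n, k) * k for k in range(1, n + 1)) == n * 2 ** (n - 1)
print("identity verified for n = 1, ..., 10")
```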

While the expected value $E(X)$ provides useful information as the weighted average of the possible values of $X$, it does not provide information on the spread of these values. The variance provides such information.

Definition. If $X$ is a random variable with expected value $E(X)$, then the variance of $X$, denoted by $\mathrm{Var}(X)$, is defined by $$\mathrm{Var}(X)=E[(X-E(X))^2]$$

To simplify our calculation, let us denote $E(X)$ by $\mu$. Then \begin{align*}\mathrm{Var}(X)&=E[(X-\mu)^2]\\&=\sum_iP_r\{X=i\}(i-\mu)^2\end{align*} This last expression shows that $\mathrm{Var}(X)$ measures how far apart $X$ would be from its expected value on the average. Let us continue further from the last expression above. \begin{align*}\sum_iP_r\{X=i\}(i-\mu)^2&=\sum_iP_r\{X=i\}(i^2-2\mu i+\mu^2)\\&=\sum_iP_r\{X=i\}i^2-2\mu\sum_iP_r\{X=i\}i+\mu^2\sum_iP_r\{X=i\}\\&=E(X^2)-2\mu^2+\mu^2\\&=E(X^2)-\mu^2\\&=E(X^2)-[E(X)]^2\end{align*} So we have an alternative formula for the variance \begin{equation}\label{eq:variance}\mathrm{Var}(X)=E(X^2)-[E(X)]^2\end{equation}

Example. Calculate $\mathrm{Var}(X)$ if $X$ represents the outcome when a die is rolled.

Solution. First \begin{align*}E(X)&=1\cdot\frac{1}{6}+2\cdot\frac{1}{6}+3\cdot\frac{1}{6}+4\cdot\frac{1}{6}+5\cdot\frac{1}{6}+6\cdot\frac{1}{6}\\&=\frac{1}{6}\frac{6\cdot 7}{2}=\frac{7}{2}\end{align*} Next, \begin{align*}E(X^2)&=1^2\cdot\frac{1}{6}+2^2\cdot\frac{1}{6}+3^2\cdot\frac{1}{6}+4^2\cdot\frac{1}{6}+5^2\cdot\frac{1}{6}+6^2\cdot\frac{1}{6}\\&=\frac{1}{6}\frac{6\cdot 7\cdot 13}{6}=\frac{91}{6}\end{align*} The sum $1^2+2^2+\cdots+6^2$ in the second-to-last line is evaluated using the formula $$1^2+2^2+3^2+\cdots+n^2=\frac{n(n+1)(2n+1)}{6}$$ Therefore, $$\mathrm{Var}(X)=\frac{91}{6}-\left(\frac{7}{2}\right)^2=\frac{35}{12}$$
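
As a quick check, the following sketch computes the variance of a fair die both from the definition and from the alternative formula \eqref{eq:variance} (illustrative only):

```python
from fractions import Fraction

values = range(1, 7)
p = Fraction(1, 6)                                  # fair die
mu = sum(p * i for i in values)                     # E(X) = 7/2
var_def = sum(p * (i - mu) ** 2 for i in values)    # E[(X - mu)^2]
var_alt = sum(p * i * i for i in values) - mu ** 2  # E(X^2) - E(X)^2
print(mu, var_def, var_alt)                         # 7/2 35/12 35/12
```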

Remarks.

  1. In mechanics, the center of gravity of a system of particles is precisely the expected value of the position coordinates of the particles, weighted by their masses. Similarly, the moment of inertia of a body about an axis through its center of mass is analogous to the variance of the position coordinates of the particles that constitute the body.
  2. $\mathrm{SD}(X)=\sqrt{\mathrm{Var}(X)}$ is called the standard deviation of $X$.

I will conclude this note with the following useful identities.

Theorem. Let $a$ and $b$ be constants. Then

  1. $E(aX+b)=aE(X)+b$. As a special case, if $b=0$, we obtain $E(aX)=aE(X)$.
  2. $\mathrm{Var}(aX+b)=a^2\mathrm{Var}(X)$.

Proof.

  1. \begin{align*}E(aX+b)&=\sum_iP_r\{X=i\}(ai+b)\\&=a\sum_iP_r\{X=i\}i+b\sum_iP_r\{X=i\}\\&=aE(X)+b\end{align*}
  2. For simplicity, let $\mu=E(X)$. Then by part 1 above, $E(aX+b)=a\mu+b$. Now, \begin{align*}\mathrm{Var}(aX+b)&=E[(aX+b-a\mu-b)^2]\\&=E[a^2(X-\mu)^2]\\&=a^2E[(X-\mu)^2]\\&=a^2\mathrm{Var}(X)\end{align*}

References.

[1] Essential Discrete Mathematics for Computer Science, Harry Lewis and Rachel Zax, Princeton University Press, 2019

[2] A First Course in Probability, Sheldon Ross, 5th Edition, Prentice-Hall, 1998

Introductory Probability: Bayes’ Theorem

Let $S$ be a sample space and $E, F$ events. The event $E$ can be written as \begin{align*}E&=E\cap S\\&=E\cap(F\dot\cup F^c)\\&=(E\cap F)\dot\cup(E\cap F^c)\end{align*} By axiom 3 of finite probability, we have \begin{equation}\begin{aligned}P_r(E)&=P_r(E\cap F)+P_r(E\cap F^c)\\&=P_r(E|F)P_r(F)+P_r(E|F^c)P_r(F^c)\\&=P_r(E|F)P_r(F)+P_r(E|F^c)(1-P_r(F))\end{aligned}\label{eq:baye}\end{equation} This equation states that the probability of the event $E$ is a weighted average of the conditional probability of $E$ given that $F$ has happened and the conditional probability of $E$ given that $F$ has not occurred. The equation \eqref{eq:baye} is useful because it is often difficult to calculate the probability of the event $E$ directly, but knowing whether the other event $F$ has happened helps us determine the probability of $E$.

Example. An insurance company divides people into two categories: those who are accident prone and those who are not. Statistics show that an accident-prone person will have an accident at some time within a fixed 1-year period with probability 0.4. This probability decreases to 0.2 for a non-accident-prone person. If 30% of the population is accident prone, what is the probability that a new policyholder will have an accident within a year of purchasing a policy?

Solution. Let $E$ denote the event that the policyholder will have an accident within a year of purchase. Let $F$ denote the event that the policyholder is accident prone. Using the equation \eqref{eq:baye}, \begin{align*}P_r(E)&=P_r(E|F)P_r(F)+P_r(E|F^c)P_r(F^c)\\&=0.4\times 0.3+0.2\times 0.7\\&=0.26\end{align*}

Suppose that $P_r(E)$ and $P_r(F)$ are both nonzero. Then it follows from the conditional probabilities $$P_r(E|F)=\frac{P_r(E\cap F)}{P_r(F)},\ P_r(F|E)=\frac{P_r(F\cap E)}{P_r(E)}$$ that \begin{equation}\label{eq:baye2}P_r(E|F)=\frac{P_r(F|E)P_r(E)}{P_r(F)}\end{equation} The equation \eqref{eq:baye2} is usually called Bayes’ Theorem, named after the English statistician and philosopher Reverend Thomas Bayes (pronounced ‘beiz’). If we regard the event $E$ as a hypothesis and $F$ as evidence, the probabilities $P_r(E)$ and $P_r(E|F)$ can be interpreted, respectively, as the initial degree of belief in $E$ and the degree of belief in $E$ after having accounted for the evidence $F$. The factor $\frac{P_r(F|E)}{P_r(F)}$ can then be interpreted as the support $F$ provides for $E$.

Example. This is the second part of the previous example. Suppose that a new policyholder has an accident within a year of purchasing a policy. What is the probability that he or she is accident prone?

Solution. What the question is asking is $P_r(F|E)$. By Bayes’ theorem \eqref{eq:baye2}, \begin{align*}P_r(F|E)&=\frac{P_r(E|F)P_r(F)}{P_r(E)}\\&=\frac{0.4\times 0.3}{0.26}=\frac{6}{13}\end{align*} i.e. 6 out of 13 who have an accident within a year of purchasing a policy are accident-prone people.

Example. A lab blood test is 95% effective in detecting a certain disease when it is present. The test also yields a false positive result for 1% of the healthy people tested. If 0.5% of the population actually has the disease, what is the probability that a person has the disease given that the test result is positive?

Solution. Let $D$ be the event that the tested person has the disease and $E$ the event that the test result is positive. What is asked is to find $P_r(D|E)$. The available information is $P_r(E|D)=0.95$, $P_r(D)=0.005$, and $P_r(E|D^c)=0.01$. Using Bayes’ theorem \eqref{eq:baye2} along with \eqref{eq:baye}, \begin{align*}P_r(D|E)&=\frac{P_r(E|D)P_r(D)}{P_r(E|D)P_r(D)+P_r(E|D^c)P_r(D^c)}\\&=\frac{0.95\times 0.005}{0.95\times 0.005+0.01\times 0.995}\\&=\frac{95}{294}\approx 0.323\end{align*} i.e. only about 32% of those who test positive actually have the disease.
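
The same computation, packaged as a small Python helper (the function posterior and its parameter names are my own, not any standard library API):

```python
from fractions import Fraction

def posterior(p_e_given_d, p_d, p_e_given_not_d):
    """Pr(D|E) via Bayes' theorem, with total probability in the denominator."""
    num = p_e_given_d * p_d
    return num / (num + p_e_given_not_d * (1 - p_d))

print(posterior(Fraction(95, 100), Fraction(5, 1000), Fraction(1, 100)))
# 95/294, about 0.323
```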

Example. During a criminal investigation, the detective in charge is 60% convinced that a suspect is guilty. Now a new piece of evidence comes to light, showing that the criminal has a certain characteristic (such as left-handedness, baldness, or brown hair). Suppose that 20% of the population possesses this characteristic. If the suspect does turn out to have this characteristic, how certain is the detective now that the suspect is guilty of the crime?

Solution. Let $G$ be the event that the suspect is guilty and $C$ the event that he possesses the characteristic of the criminal. What is asked is to find $P_r(G|C)$. The available information is then $P_r(G)=0.6$, $P_r(C|G^c)=0.2$, and $P_r(C|G)=1$ (the real criminal does have the characteristic). Using Bayes’ theorem \eqref{eq:baye2} along with \eqref{eq:baye}, \begin{align*}P_r(G|C)&=\frac{P_r(C|G)P_r(G)}{P_r(C|G)P_r(G)+P_r(C|G^c)P_r(G^c)}\\&=\frac{1\times 0.6}{1\times 0.6+0.2\times 0.4}\\&\approx 0.882\end{align*}

References.

[1] Essential Discrete Mathematics for Computer Science, Harry Lewis and Rachel Zax, Princeton University Press, 2019

[2] A First Course in Probability, Sheldon Ross, 5th Edition, Prentice-Hall, 1998

Introductory Probability: Conditional Probability

A conditional probability is the probability that an event will occur on the condition that some other event occurs. For example, the probability of rolling two 1s with two dice is $\frac{1}{36}$. Rolls of two dice are independent events, but if we know that at least one of the dice came up with an odd number, then the likelihood of two 1s increases.

Definition. Let $A$ and $B$ be events. Suppose $P_r(B)>0$. Then the conditional probability of $A$ given $B$ is \begin{equation}\label{eq:condprob}P_r(A|B)=\frac{P_r(A\cap B)}{P_r(B)}\end{equation} If $P_r(B)=1$, then knowing that the event $B$ occurs gives no information on the event $A$: in this case $P_r(A\cap B)=P_r(A)$ and so $P_r(A|B)=P_r(A)$. If $P_r(B)<1$, then conditioning on $B$ rescales the probability of the part of $A$ that lies in $B$, namely $P_r(A\cap B)$, by the factor $\frac{1}{P_r(B)}$.

As a special case, if all outcomes are equally likely, then \begin{align*}P_r(A|B)&=\frac{P_r(A\cap B)}{P_r(B)}\\&=\frac{\frac{|A\cap B|}{|S|}}{\frac{|B|}{|S|}}\\&=\frac{|A\cap B|}{|B|}\end{align*}

Example. (Rolling two 1s with two dice) The sample space is $\{1,2,3,4,5,6\}\times\{1,2,3,4,5,6\}$ and $A=\{(1,1)\}$. Let $B$ be the event that one of the dice has come up with an odd number. Then $$B=\{1,3,5\}\times\{1,2,3,4,5,6\}\cup\{1,2,3,4,5,6\}\times\{1,3,5\}$$ $|B|=27$ and $A\cap B=A$, so we obtain $$P_r(A|B)=\frac{1}{27}>\frac{1}{36}$$

Example. If a die is rolled twice, what is the probability that the sum of rolls is at least 9? If the first roll is a 4, does the probability increase, decrease, or stay the same?

Solution. Let $A$ be the event that the rolls total at least 9. Then $$A=\{(3,6),(4,5),(4,6),(5,4),(5,5),(5,6),(6,3),(6,4),(6,5),(6,6)\}$$ and $P_r(A)=\frac{|A|}{|S|}=\frac{10}{36}=\frac{5}{18}$. Let $B$ be the event that the first roll is 4. Then $$B=\{(4,1),(4,2),(4,3),(4,4),(4,5),(4,6)\}$$ and $A\cap B=\{(4,5),(4,6)\}$. Thus, $$P_r(A|B)=\frac{|A\cap B|}{|B|}=\frac{2}{6}=\frac{1}{3}>\frac{5}{18}=P_r(A)$$
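
Since all outcomes are equally likely, $P_r(A|B)$ reduces to counting, which is easy to verify by enumeration (a sketch; the set names mirror the solution above):

```python
from itertools import product
from fractions import Fraction

S = set(product(range(1, 7), repeat=2))    # two rolls of a die
A = {s for s in S if sum(s) >= 9}          # total at least 9
B = {s for s in S if s[0] == 4}            # first roll is a 4
print(Fraction(len(A), len(S)))            # Pr(A) = 5/18
print(Fraction(len(A & B), len(B)))        # Pr(A|B) = 1/3
```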

Example. A coin is flipped twice. What is the conditional probability that both flips result in heads, given that the first flip does?

Solution. Let $A$ be the event that both flips land heads and $B$ the event that the first flip lands heads. Then $$A=\{(H,H)\},\ B=\{(H,H),(H,T)\}$$ and so $P_r(A|B)=\frac{1}{2}$.

Example. An urn contains 10 white, 5 yellow, and 10 black marbles. A marble is chosen at random from the urn and it is noted it is not one of the black marbles. What is the probability that it is yellow?

Solution. Let $A$ be the event that the marble selected is yellow and $B$ the event that the marble selected is not black. $A\cap B=A$ so $P_r(A\cap B)=\frac{5}{25}$. Since $P_r(B)=\frac{15}{25}$, \begin{align*}P_r(A|B)&=\frac{P_r(A\cap B)}{P_r(B)}\\&=\frac{5}{15}\\&=\frac{1}{3}\end{align*}

Remark. In the above example, since we know that the chosen marble is not black, we can reduce our sample space to include only the white and yellow marbles. Then the probability that a marble chosen at random is yellow is simply $P_r(A)=\frac{5}{15}=\frac{1}{3}$. When all outcomes are assumed to be equally likely, it is often easier to compute a conditional probability by reducing the sample space instead of directly applying the formula \eqref{eq:condprob}.

Remark. (Independent Events and Conditional Probability) Recall that two events $A$ and $B$ are independent if and only if $P_r(A\cap B)=P_r(A)P_r(B)$. One can easily show that two events $A$ and $B$ with nonzero probabilities are independent if and only if $$P_r(A|B)=P_r(A)$$

References.

[1] Essential Discrete Mathematics for Computer Science, Harry Lewis and Rachel Zax, Princeton University Press, 2019

[2] A First Course in Probability, Sheldon Ross, 5th Edition, Prentice-Hall, 1998

Introductory Probability: Outcomes, Events, Probability, Independence

Probability answers the following question: how frequently should we expect one outcome or another to occur if we repeat some procedure many times? First let’s introduce some terminology that is frequently used in studying probability:

  • Trial: a repeatable procedure such as tossing a coin or drawing a marble from a jar
  • Experiment: a sequence of individual trials
  • Sample space (of a trial): the set of all possible outcomes
  • Event: any set of outcomes; it is thus a subset of the sample space

Example. Rolling a die is a trial. Its sample space is the set $\{1,2,3,4,5,6\}$. The subset $\{2,4,6\}$ is an event.

A sample space in general can be an infinite set, but we consider only finite sample spaces here. Let $S$ be a sample space. A probability function $P_r: \wp(S)\longrightarrow\mathbb{R}$ assigns probabilities to events. Any probability function must satisfy the axioms of finite probability:

  1. $0\leq P_r(A)\leq 1$ for any event $A$.
  2. $P_r(\emptyset)=0$ and $P_r(S)=1$.
  3. For any events $A$ and $B$ with $A\cap B=\emptyset$, $$P_r(A\cup B)=P_r(A)+P_r(B)$$

Remark. Because of axiom 1, the codomain of a probability function $P_r$ can be restricted to the unit interval $[0,1]$, i.e. $P_r:\wp(S)\longrightarrow[0,1]$.

Theorem. The probability of an event $A$ not happening is $$P_r(A^c)=1-P_r(A)$$

Proof. It follows from $S=A\dot\cup A^c$ and axiom 3. Here, $\dot\cup$ means disjoint union.

Theorem. Let $A$ and $B$ be events such that $A\subseteq B$. Then $P_r(A)\leq P_r(B)$.

Proof. The set $B$ can be written as $B=A\dot\cup(B-A)$, thus by axiom 3 $$P_r(B)=P_r(A)+P_r(B-A)\geq P_r(A)$$

Axiom 3 extends to any finite number of mutually disjoint events:

Theorem. Let $A_1,A_2,\cdots,A_n$ be events such that $A_i\cap A_j=\emptyset$ for $i\ne j$. Then $$P_r\left(\bigcup_{i=1}^n A_i\right)=\sum_{i=1}^n P_r(A_i)$$

Let $A$ and $B$ be two events and assume that $A$ and $B$ are not necessarily disjoint. How do we calculate $P_r(A\cup B)$, i.e. the probability of the event $A$ or $B$ happening? The following theorem answers that.

Theorem. (Inclusion-Exclusion Principle) Let $A$ and $B$ be two events (not necessarily disjoint). Then $$P_r(A\cup B)=P_r(A)+P_r(B)-P_r(A\cap B)$$

Proof. \begin{align*}A&=(A-B)\dot\cup(A\cap B)\\B&=(B-A)\dot\cup(A\cap B)\end{align*} Hence by axiom 3, we have \begin{align*}P_r(A-B)&=P_r(A)-P_r(A\cap B)\\P_r(B-A)&=P_r(B)-P_r(A\cap B)\end{align*} Also $A-B$, $B-A$ and $A\cap B$ are mutually disjoint, so by the previous theorem we have \begin{align*}P_r(A\cup B)&=P_r(A-B)+P_r(B-A)+P_r(A\cap B)\\&=P_r(A)+P_r(B)-P_r(A\cap B)\end{align*}

Consider a trial with $n$ possible outcomes. Assume that the outcomes are equally likely. Examples of trials with equally likely outcomes include coin tossing, rolling a die, drawing marbles from a jar, etc. Let $S$ be the sample space of a trial with equally likely outcomes. Define a function $P_r: \wp(S)\longrightarrow[0,1]$ by $$P_r(E)=\frac{|E|}{|S|}$$ for event $E$. Then it is straightforward to show that $P_r$ satisfies axioms 1-3, i.e. it is a probability function.

Example. If a fair coin is flipped 4 times, what is the probability that exactly 2 of the flips land heads up?

Solution. $|S|=2^4=16$. Let $E$ be the set of outcomes that contain exactly 2 heads. Then $|E|=\begin{pmatrix}4\\2\end{pmatrix}=6$. Hence, $P_r(E)=\frac{6}{16}=\frac{3}{8}$.

Example. The Gambler’s Fallacy. Let $n$ be a large number. If a coin is flipped $n$ times and comes up heads the first $n-1$ times, what is the probability that the $n$th flip comes up heads?

Solution. Although your intuition might tell you that the coin is due to come up tails, the coin doesn’t remember the history of previous outcomes. The probability is still $\frac{1}{2}$.

When the knowledge of whether an event $A$ occurred gives no information about whether another event $B$ occurred, $A$ and $B$ are said to be independent. Mathematically, the events $A$ and $B$ are independent if and only if $$P_r(A\cap B)=P_r(A)P_r(B)$$

Example. Let us consider the $n=4$ case in the previous example. Let $A$ be the event that a coin flipped 4 times comes up heads the first three times. Let $B$ be the event that a coin flipped 4 times comes up heads on the last flip. We show that $A$ and $B$ are independent events. Writing the outcomes out, \begin{align*}A&=\{HHHH,HHHT\}\\B&=\{HHHH,HHTH,HTHH,HTTH,THHH,THTH,TTHH,TTTH\}\end{align*} Hence, \begin{align*}P_r(A\cap B)&=\frac{|A\cap B|}{|S|}=\frac{1}{16}\\P_r(A)P_r(B)&=\frac{|A|}{|S|}\frac{|B|}{|S|}=\frac{2}{16}\frac{8}{16}=\frac{1}{16}\end{align*} Since $P_r(A\cap B)=P_r(A)P_r(B)$, $A$ and $B$ are independent.
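
Here is a short enumeration that re-checks this independence computation (the string encoding of outcomes is mine):

```python
from itertools import product
from fractions import Fraction

S = ["".join(s) for s in product("HT", repeat=4)]   # 16 outcomes
A = {s for s in S if s[:3] == "HHH"}                # first three flips heads
B = {s for s in S if s[3] == "H"}                   # last flip heads

lhs = Fraction(len(A & B), len(S))
rhs = Fraction(len(A), len(S)) * Fraction(len(B), len(S))
print(lhs, rhs, lhs == rhs)                         # 1/16 1/16 True
```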

Example. Suppose three marbles are drawn from a jar containing six red marbles and four blue marbles. Let us consider the following events.

  • $A$: The three marbles are not all the same color.
  • $B$: The first marble drawn is red.

Are $A$ and $B$ independent?

Solution. The number of ways of drawing 3 marbles of the same color is $$\begin{pmatrix}6\\3\end{pmatrix}+\begin{pmatrix}4\\3\end{pmatrix}=24$$ So the number of ways of choosing 3 marbles that are not all the same color is $|A|=\begin{pmatrix}10\\3\end{pmatrix}-24=120-24=96$. Hence, $P_r(A)=\frac{96}{120}=\frac{4}{5}$. The probability that the first marble drawn is red is just $P_r(B)=\frac{6}{10}=\frac{3}{5}$. To calculate $P_r(A\cap B)$: since the first marble drawn is red, the remaining two marbles drawn must contain at least one blue marble. There are $\begin{pmatrix}5\\2\end{pmatrix}=10$ ways of getting 2 more red marbles, so there are $\begin{pmatrix}9\\2\end{pmatrix}-10=36-10=26$ ways of drawing at least one blue marble. Therefore, $P_r(A\cap B)=\frac{6}{10}\cdot\frac{26}{36}=\frac{13}{30}$. On the other hand, $P_r(A)P_r(B)=\frac{4}{5}\cdot\frac{3}{5}=\frac{12}{25}$. Since $P_r(A\cap B)\ne P_r(A)P_r(B)$, the events $A$ and $B$ are not independent.

References.

[1] Essential Discrete Mathematics for Computer Science, Harry Lewis and Rachel Zax, Princeton University Press, 2019

Introductory Combinatorics: Combinations 2

Let us begin with an example.

Example 1. A jar contains 11 marbles. In how many ways can they be placed in four boxes, with 3 marbles in the first box, 4 marbles in the second box, 2 marbles in the third box, and the 2 remaining marbles in the fourth box?

Solution. The total number of outcomes is $$\begin{pmatrix}11\\3\end{pmatrix}\cdot \begin{pmatrix}8\\4\end{pmatrix}\cdot \begin{pmatrix}4\\2\end{pmatrix}\cdot \begin{pmatrix}2\\2\end{pmatrix}=\frac{11!}{3!8!}\cdot\frac{8!}{4!4!}\cdot\frac{4!}{2!2!}\cdot\frac{2!}{2!0!}=\frac{11!}{3!4!2!2!}=69,300$$

This example can be generalized to the following question. A set of $n$ distinct items is to be divided into $r$ distinct groups of respective sizes $n_1, n_2, \cdots, n_r$ such that $\sum_{i=1}^r n_i=n$. How many different divisions are possible? There are $\begin{pmatrix}n\\n_1\end{pmatrix}$ possible choices for the first group, $\begin{pmatrix}n-n_1\\n_2\end{pmatrix}$ possible choices for the second group, $\begin{pmatrix}n-n_1-n_2\\n_3\end{pmatrix}$ possible choices for the third group, …, and $\begin{pmatrix}n-n_1-n_2-\cdots-n_{r-1}\\n_r\end{pmatrix}$ possible choices for the $r$-th group. Therefore the total number of outcomes is \begin{align*}\begin{pmatrix}n\\n_1\end{pmatrix}\begin{pmatrix}n-n_1\\n_2\end{pmatrix}\begin{pmatrix}n-n_1-n_2\\n_3\end{pmatrix}\cdots\begin{pmatrix}n-n_1-n_2-\cdots -n_{r-1}\\n_r\end{pmatrix}\\=\frac{n!}{(n-n_1)!n_1!}\frac{(n-n_1)!}{(n-n_1-n_2)!n_2!}\cdots\frac{(n-n_1-n_2-\cdots-n_{r-1})!}{0!n_r!}\\=\frac{n!}{n_1!n_2!\cdots n_r!}\end{align*}

The number of divisions of $n$ distinct items into $r$ distinct groups of respective sizes $n_1,n_2,\cdots,n_r$ is denoted by $\begin{pmatrix}n\\n_1,n_2,\cdots,n_r\end{pmatrix}$. That is, \begin{equation}\label{eq:multcoeff}\begin{pmatrix}n\\n_1,n_2,\cdots,n_r\end{pmatrix}=\frac{n!}{n_1!n_2!\cdots n_r!}\end{equation} where $n=n_1+n_2+\cdots+n_r$.
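
Computationally, \eqref{eq:multcoeff} is a one-liner. The helper below (the function multinomial is my own naming, not a standard API) reproduces Example 1:

```python
from math import factorial

def multinomial(*groups):
    """n! / (n_1! n_2! ... n_r!) where n = n_1 + ... + n_r."""
    result = factorial(sum(groups))
    for n_i in groups:
        result //= factorial(n_i)   # each intermediate division is exact
    return result

print(multinomial(3, 4, 2, 2))      # 69300, as in Example 1
```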

Example 2. A police department in a small city consists of 10 officers. If the department policy is to have 5 of the officers patrolling the streets, 2 of the officers working full time at the station, and 3 of the officers on reserve at the station, how many different divisions of the 10 officers into the 3 groups are possible?

Solution. $\frac{10!}{5!2!3!}=2,520$.

Example 3. Ten children are to be divided into an A team and a B team of 5 each. The A team will play in one league and the B team in another. How many different divisions are possible?

Solution. $\frac{10!}{5!5!}=252$.

Example 4. In order to play a game of basketball, 10 children at a playground divide themselves into two teams of 5 each. How many different divisions are possible?

Solution. Compare Example 4 with Example 3. In this example, the order of the two teams is irrelevant. Hence, the answer is $\frac{\frac{10!}{5!5!}}{2!}=126$.

Let us consider more about cases like Example 4. In Example 1, it was implicitly indicated that the four boxes are distinguishable, as they are specified as the first box, the second box, and so on and so forth. For example, if we assume that the marbles were labeled 1 through 11, the following two would be two different outcomes: \begin{align*}\{1,2,3\},\{4,5,6,7\},\{8,9\},\{10,11\}\\\{1,2,3\},\{4,5,6,7\},\{10,11\},\{8,9\}\end{align*} because the marbles placed in the third and the fourth boxes are swapped in the second row. But what if these boxes were indistinguishable? Then the two outcomes would’ve been regarded as the same.

Example. A jar contains 11 marbles. In how many ways can they be placed in 4 unmarked boxes, with 1 box containing 3 marbles, 1 box containing 4 marbles, and 2 boxes containing 2 marbles each?

Solution. We first choose the marbles for each box as if the boxes were ordered, just as in Example 1: $$\begin{pmatrix}11\\3\end{pmatrix}\cdot\begin{pmatrix}8\\4\end{pmatrix}\cdot\begin{pmatrix}4\\2\end{pmatrix}\cdot\begin{pmatrix}2\\2\end{pmatrix}=\frac{11!}{3!4!2!2!}$$ Now, any boxes that have the same number of marbles can be swapped with each other. So we have to divide the above total by $1!1!2!$ to get the final result: $$\frac{\frac{11!}{3!4!2!2!}}{1!1!2!}=34,650$$

The formula in \eqref{eq:multcoeff} is called the multinomial coefficient, as it is related to the Multinomial Theorem.

Theorem. \begin{equation}\label{eq:multinomial}(x_1+x_2+\cdots+x_r)^n=\sum_{\substack{(n_1,\cdots,n_r):\\ n_1+n_2+\cdots+n_r=n}}\begin{pmatrix}n\\n_1,n_2,\cdots,n_r\end{pmatrix}x_1^{n_1}x_2^{n_2}\cdots x_r^{n_r}\end{equation}

Example. Expand $(x_1+x_2+x_3)^2$.

Solution. There are 6 ways to write 2 as an ordered sum of three nonnegative integers: \begin{align*}2&=2+0+0=0+2+0=0+0+2\\&=1+1+0=1+0+1=0+1+1\end{align*} Thus, \begin{align*}(x_1+x_2+x_3)^2&=\begin{pmatrix}2\\2,0,0\end{pmatrix}x_1^2x_2^0x_3^0+\begin{pmatrix}2\\0,2,0\end{pmatrix}x_1^0x_2^2x_3^0+\begin{pmatrix}2\\0,0,2\end{pmatrix}x_1^0x_2^0x_3^2\\&+\begin{pmatrix}2\\1,1,0\end{pmatrix}x_1^1x_2^1x_3^0+\begin{pmatrix}2\\1,0,1\end{pmatrix}x_1^1x_2^0x_3^1+\begin{pmatrix}2\\0,1,1\end{pmatrix}x_1^0x_2^1x_3^1\\&=x_1^2+x_2^2+x_3^2+2x_1x_2+2x_1x_3+2x_2x_3\end{align*}

By the basic principle of counting, there are $r^n$ possible outcomes when $n$ distinguishable balls are to be distributed into $r$ distinguishable urns. What if the balls are indistinguishable? Let us consider a vector $(x_1,x_2,\cdots,x_r)$, where $x_i$ represents the number of balls that are distributed into the $i$-th urn. Then the problem boils down to counting the number of vectors $(x_1,x_2,\cdots,x_r)$ such that the $x_i$’s are nonnegative, integer-valued and $x_1+x_2+\cdots+x_r=n$. First let us consider the number of such vectors with the $x_i$’s being all positive integer-valued. Imagine that we line up $n$ balls and divide them into $r$ nonempty groups. For $n=8$ and $r=3$, the following is an example of such a division: ooo|ooo|oo. There are 7 spaces between the 8 balls, and each division corresponds to choosing 2 of those spaces, so the number of all possible such divisions into 3 nonempty groups is $\begin{pmatrix}7\\2\end{pmatrix}$. By a similar argument, we obtain:

Proposition 1. There are $\begin{pmatrix}n-1\\r-1\end{pmatrix}$ distinct vectors $(x_1,\cdots,x_r)$ such that the $x_i$’s are all positive, integer-valued and $x_1+\cdots+x_r=n$.

Now suppose that the $x_i$’s are nonnegative (positive or zero). Let $y_i=x_i+1$. Then $y_i>0$ and $y_1+y_2+\cdots+y_r=n+r$. It follows from the above proposition that:

Proposition 2. There are $\begin{pmatrix}n+r-1\\r-1\end{pmatrix}$ distinct vectors $(x_1,\cdots,x_r)$ such that the $x_i$’s are nonnegative, integer-valued and $x_1+\cdots+x_r=n$.
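
Proposition 2 is easy to verify by brute force for small $n$ and $r$; here is a sketch for $n=5$, $r=3$ (the variable names are mine):

```python
from itertools import product
from math import comb

n, r = 5, 3
brute = sum(1 for x in product(range(n + 1), repeat=r) if sum(x) == n)
print(brute, comb(n + r - 1, r - 1))   # both print 21
```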

Example. How many distinct solutions of $x_1+x_2=3$, where $x_1$ and $x_2$ are nonnegative and integer-valued, are possible?

Solution. $\begin{pmatrix}3+2-1\\2-1\end{pmatrix}=4$. In fact, they are $(0,3)$, $(1,2)$, $(2,1)$, $(3,0)$.

Example. An investor has \$20,000 to invest among 4 possible investments. Each investment must be in units of a thousand dollars. If the total \$20,000 is to be invested, how many different investment strategies are possible? What if not all the money needs to be invested?

Solution. Let $x_i$ denote the number of thousands of dollars invested in investment number $i$. Then $$x_1+x_2+x_3+x_4=20,\ x_i\geq 0$$ and there are $\begin{pmatrix}20+4-1\\4-1\end{pmatrix}=1,771$ possible investment strategies. If not all the money needs to be invested, let $x_5$ denote the amount kept in reserve. Now, a strategy is a vector $(x_1,x_2,x_3,x_4,x_5)$ such that the $x_i$’s are nonnegative, integer-valued and $x_1+x_2+x_3+x_4+x_5=20$. Hence, there are $\begin{pmatrix}20+5-1\\5-1\end{pmatrix}=10,626$ possible strategies.

Remark. Using proposition 2, we can also answer how many terms there are in the multinomial expansion of $(x_1+x_2+\cdots+x_r)^n$. In the multinomial expansion formula \eqref{eq:multinomial}, $n_1+n_2+\cdots+n_r=n$. Hence there are $\begin{pmatrix}n+r-1\\r-1\end{pmatrix}$ terms.

References (in no particular order).

[1] Applied Discrete Structures, Al Doerr and Ken Levasseur. This book is available for free under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 United States License.

[2] A First Course in Probability, Sheldon Ross, 5th Edition, Prentice-Hall, 1998

[3] Essential Discrete Mathematics for Computer Science, Harry Lewis and Rachel Zax, Princeton University Press, 2019