Theorem. (The Central Limit Theorem)
Let $X_1,X_2,\cdots$ be a sequence of independent and identically distributed random variables each having mean $\mu$ and variance $\sigma^2$. Then the distribution of
$$\frac{X_1+\cdots+X_n-n\mu}{\sigma\sqrt{n}}$$
tends to the standard normal as $n\to\infty$. That is, for $-\infty<a<\infty$,
$$P\left\{\frac{X_1+X_2+\cdots+X_n-n\mu}{\sigma\sqrt{n}}\leq a\right\}\to\frac{1}{\sqrt{2\pi}}\int_{-\infty}^a e^{-\frac{x^2}{2}}dx$$
as $n\to\infty$.
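As a quick numerical illustration of the statement above (not part of the theorem or its proof), the following Monte Carlo sketch assumes exponential summands with $\mu=\sigma=1$ and arbitrary sample sizes, and compares the empirical distribution of $\frac{X_1+\cdots+X_n-n\mu}{\sigma\sqrt{n}}$ with $\Phi$:

```python
import math
import random

random.seed(0)

# Assumed setup: X_i ~ exponential(1), so mu = 1 and sigma = 1.
n = 500        # summands per sample (arbitrary choice)
trials = 5000  # number of simulated values of Z_n

z_values = []
for _ in range(trials):
    s = sum(random.expovariate(1.0) for _ in range(n))
    z_values.append((s - n * 1.0) / (1.0 * math.sqrt(n)))

# Empirical CDF of Z_n at a = 0 and a = 1, versus Phi(0) and Phi(1).
p0 = sum(z <= 0 for z in z_values) / trials
p1 = sum(z <= 1 for z in z_values) / trials
phi0 = 0.5
phi1 = 0.5 * (1 + math.erf(1 / math.sqrt(2)))
print(p0, phi0)
print(p1, phi1)
```

Even though the exponential distribution is far from normal (it is skewed and supported on $[0,\infty)$), the empirical probabilities land close to the normal values.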
The central limit theorem is one of the most remarkable results in probability theory.
We state, without proof, the following lemma, which is crucial in proving the central limit theorem.
Lemma. Let $Z_1, Z_2,\cdots$ be a sequence of random variables having distribution functions $F_{Z_n}$ and moment generating functions $M_{Z_n}$, $n\geq 1$; and let $Z$ be a random variable having distribution function $F_Z$ and moment generating function $M_Z$. If $M_{Z_n}(t)\to M_Z(t)$ for all $t$, then $F_{Z_n}(t)\to F_Z(t)$ for all $t$ at which $F_Z(t)$ is continuous.
Example. If $Z$ is a unit normal random variable (i.e. $\mu=0$ and $\sigma^2=1$), then $M_Z(t)=e^{\frac{t^2}{2}}$. It follows from the lemma that if $M_{Z_n}(t)\to e^{\frac{t^2}{2}}$ as $n\to\infty$, then $F_{Z_n}(t)\to\Phi(t)=\frac{1}{\sqrt{2\pi}}\int_{-\infty}^t e^{\frac{-y^2}{2}}dy$.
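For completeness, the stated form of $M_Z(t)$ follows from completing the square in the exponent:
$$M_Z(t)=\frac{1}{\sqrt{2\pi}}\int_{-\infty}^\infty e^{tx}e^{-\frac{x^2}{2}}dx=e^{\frac{t^2}{2}}\cdot\frac{1}{\sqrt{2\pi}}\int_{-\infty}^\infty e^{-\frac{(x-t)^2}{2}}dx=e^{\frac{t^2}{2}},$$
since the last integral is the total mass of a normal density with mean $t$ and variance $1$, hence equals $1$.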
We now prove the central limit theorem.
Proof. Let us first assume that $\mu=0$ and $\sigma^2=1$. Also assume that the moment generating function $M(t)$ of the $X_i$ exists and is finite. (Since the random variables $X_1,X_2,\cdots$ are identically distributed, they all have the same moment generating function.) Let
$$Z_n=\frac{X_1+\cdots+X_n}{\sqrt{n}}=\sum_{i=1}^n\frac{X_i}{\sqrt{n}},\ n=1,2,\cdots$$
Then the moment generating function $M_{Z_n}(t)$ is
\begin{align*} M_{Z_n}(t)&=E\left[\exp\left\{\sum_{i=1}^n\frac{tX_i}{\sqrt{n}}\right\}\right]\\ &=E\left[\prod_{i=1}^n\exp\left\{\frac{tX_i}{\sqrt{n}}\right\}\right]\\ &=\prod_{i=1}^nE\left[\exp\left\{\frac{tX_i}{\sqrt{n}}\right\}\right]&&(\mbox{since the $X_i$’s are independent}) \end{align*}
Now, $E\left[\exp\left\{\frac{tX_i}{\sqrt{n}}\right\}\right]$ is the moment generating function of $\frac{X_i}{\sqrt{n}}$ evaluated at $t$; that is,
$$E\left[\exp\left\{\frac{tX_i}{\sqrt{n}}\right\}\right]=M\left(\frac{t}{\sqrt{n}}\right)$$
Thus,
$$M_{Z_n}(t)=\left[M\left(\frac{t}{\sqrt{n}}\right)\right]^n$$
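To see the formula $M_{Z_n}(t)=\left[M\left(\frac{t}{\sqrt{n}}\right)\right]^n$ converge in a concrete case (an illustration, not part of the proof), take $X_i=\pm 1$ with probability $\frac{1}{2}$ each, so that $\mu=0$, $\sigma^2=1$, and $M(t)=\cosh t$:

```python
import math

def M_Zn(t, n):
    """[M(t/sqrt(n))]^n for X_i = +/-1 summands, where M(t) = cosh(t)."""
    return math.cosh(t / math.sqrt(n)) ** n

t = 1.0
target = math.exp(t * t / 2)  # MGF of the standard normal at t
for n in (10, 100, 10000, 1000000):
    print(n, M_Zn(t, n), target)
```

The printed values approach $e^{1/2}\approx 1.6487$ as $n$ grows.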
Let $L(t)=\log M(t)$. Then
\begin{align*} L(0)&=0\\ L'(0)&=\mu=0\\ L^{\prime\prime}(0)&=\sigma^2=1 \end{align*}
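These values follow from $M(0)=1$, $M'(0)=E[X]=\mu$, and $M''(0)=E[X^2]$ by differentiating $L(t)=\log M(t)$:
\begin{align*} L(0)&=\log M(0)=\log 1=0\\ L'(0)&=\frac{M'(0)}{M(0)}=\mu\\ L''(0)&=\frac{M''(0)M(0)-[M'(0)]^2}{[M(0)]^2}=E[X^2]-\mu^2=\sigma^2 \end{align*}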
Using these values and L’Hôpital’s rule, one can show that $nL\left(\frac{t}{\sqrt{n}}\right)\to\frac{t^2}{2}$ as $n\to\infty$, or equivalently, $M_{Z_n}(t)\to e^{\frac{t^2}{2}}$ as $n\to\infty$.
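The limit can be verified explicitly by substituting $x=n^{-1/2}$ and applying L’Hôpital’s rule twice (each limit is of the form $\frac{0}{0}$ since $L(0)=L'(0)=0$):
\begin{align*} \lim_{n\to\infty}nL\left(\frac{t}{\sqrt{n}}\right)&=\lim_{x\to 0}\frac{L(tx)}{x^2}\\ &=\lim_{x\to 0}\frac{tL'(tx)}{2x}\\ &=\lim_{x\to 0}\frac{t^2L''(tx)}{2}=\frac{t^2}{2}, \end{align*}
using $L''(0)=1$ in the last step.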
Hence, by the lemma, the central limit theorem is proved when $\mu=0$ and $\sigma^2=1$.
For the general case, let
\begin{align*} Z_n&=\frac{X_1+\cdots+X_n-n\mu}{\sigma\sqrt{n}}\\ &=\frac{\sum_{i=1}^n\frac{X_i-\mu}{\sigma}}{\sqrt{n}},\ n=1,2,\cdots \end{align*}
Let $X_i^\ast=\frac{X_i-\mu}{\sigma}$. Then $Z_n=\frac{\sum_{i=1}^n X_i^\ast}{\sqrt{n}}$, and it is easily verified that $E[X_i^\ast]=0$ and $\mathrm{var}(X_i^\ast)=1$; for this reason, the $X_i^\ast$ are called standardized random variables. Since the $X_i^\ast$ are independent and identically distributed with mean $0$ and variance $1$, the case already proved applies to $Z_n$, and the proof is complete.
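The claim that $E[X_i^\ast]=0$ and $\mathrm{var}(X_i^\ast)=1$ is easy to confirm numerically as well; the sketch below (an assumed example using exponential samples, for which $\mu=\sigma=1$) standardizes a large sample and checks its moments:

```python
import random

random.seed(1)

# Assumed example distribution: exponential(1), so mu = 1 and sigma = 1.
mu, sigma = 1.0, 1.0
xs = [random.expovariate(1.0) for _ in range(100000)]

# Standardize: X* = (X - mu) / sigma.
xs_std = [(x - mu) / sigma for x in xs]

mean = sum(xs_std) / len(xs_std)
var = sum((x - mean) ** 2 for x in xs_std) / len(xs_std)
print(mean, var)  # sample mean near 0, sample variance near 1
```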