# Dominated Convergence Theorem

In measure theory, the dominated convergence theorem is a cornerstone of Lebesgue integration. It can be viewed as a culmination of the theory: a general statement about the interplay between limits and integrals.

## Statement of the Theorem

Consider the measure space $(X,{\mathcal {M}},\lambda )$ . Suppose $\{f_{n}\}$ is a sequence in $L^{1}(\lambda )$ such that

1. $f_{n}\to f$ a.e.,
2. there exists $g\in L^{1}(\lambda )$ such that $|f_{n}|\leq g$ a.e. for all $n\in \mathbb {N}$.

Then $f\in L^{1}(\lambda )$ and $\int f=\lim _{n\to \infty }\int f_{n}$.
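The conclusion can be illustrated numerically. Below is a minimal sketch (not part of the theorem) using $f_{n}(x)=x^{n}$ on $[0,1]$, which is dominated by the integrable function $g\equiv 1$ and converges to $0$ pointwise a.e.; the integrals $\int f_{n}=1/(n+1)$ converge to $0=\int f$.

```python
# Numerical illustration of the dominated convergence theorem (a sketch,
# not a proof): f_n(x) = x**n on [0, 1] is dominated by g(x) = 1, which is
# integrable on [0, 1], and f_n -> 0 pointwise a.e. (everywhere except x = 1).

def integral_fn(n, steps=100_000):
    """Midpoint-rule approximation of the integral of x**n over [0, 1]."""
    h = 1.0 / steps
    return sum(((i + 0.5) * h) ** n for i in range(steps)) * h

# The exact value is 1 / (n + 1), which tends to 0, the integral of the limit.
for n in (1, 10, 100, 1000):
    print(n, integral_fn(n))
```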

## Proof of the Theorem

The function $f$ is measurable in the sense that it is a.e. equal to a measurable function, since it is the limit of the $f_{n}$ except on a null set. Moreover $|f|\leq g$ a.e., so $f\in L^{1}(\lambda )$.

Now $g+f_{n}\geq 0$ a.e. and $g-f_{n}\geq 0$ a.e., so we may apply Fatou's lemma to obtain

$\int g+\int f=\int \lim _{n\to \infty }(g+f_{n})\leq \liminf _{n\to \infty }\int (g+f_{n})=\int g+\liminf _{n\to \infty }\int f_{n}$ ,

where the equalities follow from linearity of the integral and the inequality follows from Fatou's lemma. We similarly obtain

$\int g-\int f=\int \lim _{n\to \infty }(g-f_{n})\leq \liminf _{n\to \infty }\int (g-f_{n})=\int g-\limsup _{n\to \infty }\int f_{n}$ .

Since $\int g<+\infty$ , these imply

$\limsup _{n\to \infty }\int f_{n}\leq \int f\leq \liminf _{n\to \infty }\int f_{n}$, so the limit $\lim _{n\to \infty }\int f_{n}$ exists and equals $\int f$.

## Applications of the Theorem

1. Suppose we want to compute $\lim _{n\to \infty }\int _{[0,1]}{\frac {1+nx^{2}}{(1+x^{2})^{n}}}\ dx$. Denote the integrand by $f_{n}$. Since $(1+x^{2})^{n}\geq 1+nx^{2}$ by the binomial theorem, $|f_{n}|\leq 1$ for all $n\in \mathbb {N}$, and the constant function $1_{[0,1]}$ lies in $L^{1}(\lambda )$. The dominated convergence theorem therefore lets us move the limit inside the integral; since $f_{n}\to 0$ pointwise on $(0,1]$, the limit is $0$.
1. Conversely, the theorem shows there is no integrable dominating function for the sequence $f_{n}$ defined by $f_{n}(x)=n\,1_{[0,1/n]}(x)$: we have $f_{n}\to 0$ pointwise everywhere, yet $\lim _{n\to \infty }\int f_{n}=1\neq 0=\int \lim _{n\to \infty }f_{n}$.
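Both examples above can be sanity-checked numerically. The sketch below (our own illustration, using a simple midpoint rule) shows the integrals in the first example shrinking toward $0$, while in the second example every $\int f_{n}$ stays at $1$ even though the pointwise limit is $0$.

```python
def midpoint_integral(f, a, b, steps=100_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

# Example 1: the integrals shrink toward 0, the integral of the a.e. limit.
for n in (1, 10, 100):
    print("example 1, n =", n, "->",
          midpoint_integral(lambda x: (1 + n * x**2) / (1 + x**2)**n, 0.0, 1.0))

# Example 2: every f_n integrates to 1, but the pointwise limit integrates
# to 0 -- so no integrable dominating function can exist.
for n in (10, 1000):
    print("example 2, n =", n, "->",
          midpoint_integral(lambda x: n if x <= 1.0 / n else 0.0, 0.0, 1.0))
```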

## Another Application: Stirling's Formula

Stirling's formula states that

$n!\sim {\sqrt {2\pi n}}n^{n}e^{-n}$ as $n\rightarrow \infty$ . We offer a proof here which relies on the Dominated Convergence Theorem.

Proof: Repeated integration by parts yields the formula

$n!=\int _{0}^{\infty }t^{n}e^{-t}\ dt.$

We shall estimate this integral. Making the change of variables $t=n+s$ yields

$\int _{-n}^{\infty }(n+s)^{n}e^{-n-s}\ ds.$

Simplifying, this becomes

$n^{n}e^{-n}\int _{-n}^{\infty }\left(1+{\frac {s}{n}}\right)^{n}e^{-s}\ ds.$

Combining the integrand into a single exponential,

$n^{n}e^{-n}\int _{-n}^{\infty }\exp \left(n\log \left(1+{\frac {s}{n}}\right)-s\right)\ ds.$

We want to show that this integral is asymptotic to the Gaussian. To this end, make the scaling substitution $s={\sqrt {n}}x$ to obtain

${\sqrt {n}}\,n^{n}e^{-n}\int _{-{\sqrt {n}}}^{\infty }\exp \left(n\log \left(1+{\frac {x}{\sqrt {n}}}\right)-{\sqrt {n}}x\right)\ dx.$

Since the function $n\log \left(1+{\frac {x}{\sqrt {n}}}\right)$ equals zero and has derivative ${\sqrt {n}}$ at the origin, and has second derivative ${\frac {-1}{(1+x/{\sqrt {n}})^{2}}}$, applying the fundamental theorem of calculus twice (Taylor's theorem with integral remainder) yields
$n\log \left(1+{\frac {x}{\sqrt {n}}}\right)-{\sqrt {n}}x=-\int _{0}^{x}{\frac {x-y}{(1+y/{\sqrt {n}})^{2}}}\ dy.$

As a consequence we have the upper bounds

$n\log \left(1+{\frac {x}{\sqrt {n}}}\right)-{\sqrt {n}}x\leq -cx^{2}$

for some $c>0$ when $|x|\leq {\sqrt {n}}$, and

$n\log \left(1+{\frac {x}{\sqrt {n}}}\right)-{\sqrt {n}}x\leq -c|x|{\sqrt {n}}$

when $x>{\sqrt {n}}$. These bounds keep the integrand $\exp \left(n\log \left(1+{\frac {x}{\sqrt {n}}}\right)-{\sqrt {n}}x\right)$ below an $L^{1}$ function of $x$ independent of $n$ (for instance $e^{-cx^{2}}+e^{-c|x|}$). By the Dominated Convergence Theorem,
$\lim _{n\to \infty }\int _{-{\sqrt {n}}}^{\infty }\exp \left(n\log \left(1+{\frac {x}{\sqrt {n}}}\right)-{\sqrt {n}}x\right)\ dx=\int _{-\infty }^{\infty }\exp \left(-{\frac {x^{2}}{2}}\right)\ dx,$ where (after extending the integrand by $0$ for $x\leq -{\sqrt {n}}$, so that all the integrals are over $\mathbb {R}$) the pointwise convergence
$\exp \left(n\log \left(1+{\frac {x}{\sqrt {n}}}\right)-{\sqrt {n}}x\right)\rightarrow \exp \left(-{\frac {x^{2}}{2}}\right)$ holds for each fixed $x$ by expanding the Taylor series of the logarithm: $n\log \left(1+{\frac {x}{\sqrt {n}}}\right)={\sqrt {n}}x-{\frac {x^{2}}{2}}+O(x^{3}/{\sqrt {n}})$. The final integral is the classic Gaussian integral, which can be computed to equal ${\sqrt {2\pi }}$. Combined with the prefactor ${\sqrt {n}}\,n^{n}e^{-n}$, this proves Stirling's formula.
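As a quick numerical sanity check (our own illustration, not part of the proof), the ratio $n!/({\sqrt {2\pi n}}\,n^{n}e^{-n})$ can be computed for growing $n$; the helper `stirling_ratio` below works in logarithms to avoid overflow.

```python
import math

# Numerical check of Stirling's formula: the ratio
# n! / (sqrt(2*pi*n) * n**n * e**(-n)) should tend to 1 as n grows.
# math.lgamma(n + 1) == log(n!), which avoids overflow for large n.

def stirling_ratio(n):
    log_stirling = 0.5 * math.log(2 * math.pi * n) + n * math.log(n) - n
    return math.exp(math.lgamma(n + 1) - log_stirling)

for n in (1, 10, 100, 10_000):
    print(n, stirling_ratio(n))  # approaches 1 from above, roughly 1 + 1/(12n)
```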