# Dominated Convergence Theorem

In measure theory, the dominated convergence theorem is a cornerstone of Lebesgue integration. It can be viewed as a culmination of the theory's convergence results, giving a general sufficient condition under which a limit may be interchanged with an integral.

## Statement of Theorem

Consider the measure space ${\displaystyle (X,{\mathcal {M}},\lambda )}$. Suppose ${\displaystyle \{f_{n}\}}$ is a sequence in ${\displaystyle L^{1}(\lambda )}$ such that

1. ${\displaystyle f_{n}\to f}$ a.e.
2. there exists ${\displaystyle g\in L^{1}(\lambda )}$ such that ${\displaystyle |f_{n}|\leq g}$ a.e. for all ${\displaystyle n\in \mathbb {N} }$

Then ${\displaystyle f\in L^{1}(\lambda )}$ and ${\displaystyle \int f=\lim _{n\to \infty }\int f_{n}}$. [1]
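As a concrete illustration of the hypotheses (an added example, not part of the cited statement), take ${\displaystyle f_{n}(x)=x^{n}}$ on ${\displaystyle [0,1]}$ with dominating function ${\displaystyle g\equiv 1}$: then ${\displaystyle f_{n}\to 0}$ a.e. and the theorem predicts ${\displaystyle \int f_{n}\to 0}$. A quick numerical sketch:

```python
import numpy as np

# Added illustration: f_n(x) = x**n on [0, 1] is dominated by g(x) = 1,
# which is integrable on [0, 1], and f_n -> 0 a.e. (everywhere except x = 1),
# so the theorem gives \int f_n -> \int 0 = 0.
x = np.linspace(0.0, 1.0, 100_001)

def integral_fn(n):
    # Trapezoidal approximation of the integral of x**n over [0, 1];
    # the exact value is 1 / (n + 1), which tends to 0.
    y = x**n
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x) / 2.0))

for n in (1, 10, 100, 1000):
    print(n, integral_fn(n))
```

The printed values shrink toward ${\displaystyle 0}$, matching the exact integrals ${\displaystyle 1/(n+1)}$.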

## Proof of Theorem

Since ${\displaystyle f}$ is the limit of the measurable functions ${\displaystyle f_{n}}$ except on a null set, it is measurable in the sense that it is a.e. equal to a measurable function. Also ${\displaystyle |f|\leq g}$ a.e., so ${\displaystyle f\in L^{1}(\lambda )}$.

Now we have ${\displaystyle g+f_{n}\geq 0}$ a.e. and ${\displaystyle g-f_{n}\geq 0}$ a.e., so we may apply Fatou's lemma to each sequence to obtain

${\displaystyle \int g+\int f=\int \lim _{n\to \infty }(g+f_{n})\leq \liminf _{n\to \infty }\int (g+f_{n})=\int g+\liminf _{n\to \infty }\int f_{n}}$,

where the equalities follow from linearity of the integral and the inequality follows from Fatou's lemma. We similarly obtain

${\displaystyle \int g-\int f=\int \lim _{n\to \infty }(g-f_{n})\leq \liminf _{n\to \infty }\int (g-f_{n})=\int g-\limsup _{n\to \infty }\int f_{n}}$.

Since ${\displaystyle \int g<+\infty }$, we may cancel ${\displaystyle \int g}$ from both sides of these inequalities, which yields

${\displaystyle \limsup _{n\to \infty }\int f_{n}\leq \int f\leq \liminf _{n\to \infty }\int f_{n}}$,

so ${\displaystyle \lim _{n\to \infty }\int f_{n}}$ exists and equals ${\displaystyle \int f}$. [1] [2]

## Applications of Theorem

1. Suppose we want to compute ${\displaystyle \lim _{n\to \infty }\int _{[0,1]}{\frac {1+nx^{2}}{(1+x^{2})^{n}}}}$. [3] Denote the integrand by ${\displaystyle f_{n}}$ and observe that ${\displaystyle |f_{n}|\leq 1}$ for all ${\displaystyle n\in \mathbb {N} }$, since ${\displaystyle (1+x^{2})^{n}\geq 1+nx^{2}}$ by Bernoulli's inequality, and that $1_{[0, 1]} \in L^1(\lambda)$, where $1_{[0,1]}$ denotes the constant function $1$ on $[0, 1]$. The dominated convergence theorem therefore allows us to move the limit inside the integral and compute it as usual.
1. Using the theorem, we know there does not exist an integrable dominating function for the sequence ${\displaystyle f_{n}}$ defined by ${\displaystyle f_{n}(x)=n1_{[0,{\frac {1}{n}}]}(x)}$: we have ${\displaystyle f_{n}\to 0}$ a.e. (everywhere except ${\displaystyle x=0}$), yet ${\displaystyle \lim _{n\to \infty }\int f_{n}=1\neq 0=\int \lim _{n\to \infty }f_{n}}$. [4]
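Both examples can be illustrated numerically. The following sketch approximates the integrals with a trapezoidal rule on a uniform grid; the grid size and the values of $n$ are arbitrary choices, not from the original text:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 200_001)

def trapezoid(y, x):
    # Simple trapezoidal rule, standing in for the integral over [0, 1].
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x) / 2.0))

# Example 1: f_n(x) = (1 + n x^2) / (1 + x^2)^n is dominated by the constant 1
# on [0, 1] (Bernoulli: (1 + x^2)^n >= 1 + n x^2) and tends to 0 a.e.,
# so its integrals tend to 0.
vals1 = {n: trapezoid((1 + n * x**2) / (1 + x**2) ** n, x) for n in (1, 10, 100)}
print(vals1)

# Example 2: f_n = n * 1_{[0, 1/n]} integrates to 1 for every n even though
# f_n -> 0 a.e., so no integrable dominating function can exist.
vals2 = {n: trapezoid(np.where(x <= 1.0 / n, float(n), 0.0), x) for n in (10, 100, 1000)}
print(vals2)
```

In the first dictionary the values decrease toward $0$; in the second they stay near $1$ for every $n$, exhibiting the failure of limit–integral interchange.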

## Another Application: Stirling's Formula

Stirling's formula states that

${\displaystyle n!\sim {\sqrt {2\pi n}}n^{n}e^{-n}}$
as ${\displaystyle n\rightarrow \infty }$. We offer a proof here which relies on the Dominated Convergence Theorem.

Proof: Repeated integration by parts yields the formula

${\displaystyle n!=\int _{0}^{\infty }t^{n}e^{-t}\ dt}$
We shall estimate the integral above. Making the variable change ${\displaystyle t=n+s}$ yields
${\displaystyle \int _{-n}^{\infty }(n+s)^{n}e^{-n-s}\ ds}$
Simplifying, this becomes
${\displaystyle n^{n}e^{-n}\int _{-n}^{\infty }\left(1+{\frac {s}{n}}\right)^{n}e^{-s}\ ds}$
Combining the integrand into a single exponential,
${\displaystyle n^{n}e^{-n}\int _{-n}^{\infty }\exp \left(n\log \left(1+{\frac {s}{n}}\right)-s\right)\ ds}$
We want to show that this integral is asymptotic to the Gaussian. To this end, make the scaling substitution ${\displaystyle s={\sqrt {n}}x}$ to obtain
${\displaystyle {\sqrt {n}}n^{n}e^{-n}\int _{-{\sqrt {n}}}^{\infty }\exp \left(n\log \left(1+{\frac {x}{\sqrt {n}}}\right)-{\sqrt {n}}x\right)\ dx}$
Since the function ${\displaystyle n\log \left(1+{\frac {x}{\sqrt {n}}}\right)}$ vanishes at the origin, has derivative ${\displaystyle {\sqrt {n}}}$ there, and has second derivative ${\displaystyle {\frac {-1}{(1+x/{\sqrt {n}})^{2}}}}$, Taylor's theorem with integral remainder (obtained by applying the fundamental theorem of calculus twice) yields
${\displaystyle n\log \left(1+{\frac {x}{\sqrt {n}}}\right)-{\sqrt {n}}x=-\int _{0}^{x}{\frac {x-y}{(1+y/{\sqrt {n}})^{2}}}\ dy}$
As a consequence we have the upper bounds
${\displaystyle n\log \left(1+{\frac {x}{\sqrt {n}}}\right)-{\sqrt {n}}x\leq -cx^{2}}$
for some ${\displaystyle c>0}$ when ${\displaystyle -{\sqrt {n}}<x\leq {\sqrt {n}}}$ and
${\displaystyle n\log \left(1+{\frac {x}{\sqrt {n}}}\right)-{\sqrt {n}}x\leq -c|x|{\sqrt {n}}}$
when ${\displaystyle x>{\sqrt {n}}}$. Together these bounds dominate the integrand ${\displaystyle \exp \left(n\log \left(1+{\frac {x}{\sqrt {n}}}\right)-{\sqrt {n}}x\right)}$ (extended by ${\displaystyle 0}$ for ${\displaystyle x\leq -{\sqrt {n}}}$) by a fixed ${\displaystyle L^{1}}$ function, uniformly in ${\displaystyle n}$. By the Dominated Convergence Theorem,
${\displaystyle \lim _{n\to \infty }\int _{-{\sqrt {n}}}^{\infty }\exp \left(n\log \left(1+{\frac {x}{\sqrt {n}}}\right)-{\sqrt {n}}x\right)\ dx=\int _{-\infty }^{\infty }\exp \left(-{\frac {x^{2}}{2}}\right)\ dx}$
where the pointwise convergence
${\displaystyle \exp \left(n\log \left(1+{\frac {x}{\sqrt {n}}}\right)-{\sqrt {n}}x\right)\rightarrow \exp \left(-{\frac {x^{2}}{2}}\right)}$
holds for each fixed ${\displaystyle x}$ by expanding the Taylor series of the logarithm. The final integral is the classical Gaussian integral, which equals ${\displaystyle {\sqrt {2\pi }}}$. This proves Stirling's formula. See [5] for a more motivated account of this proof.
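As a quick numerical sanity check of the asymptotic (not part of the proof), the ratio ${\displaystyle n!/({\sqrt {2\pi n}}\,n^{n}e^{-n})}$ should approach ${\displaystyle 1}$; a sketch using Python's `math.lgamma`:

```python
import math

def stirling_ratio(n):
    # Ratio of n! to the Stirling approximation sqrt(2*pi*n) * n**n * exp(-n),
    # computed in log-space to avoid overflow for large n.
    log_fact = math.lgamma(n + 1)  # log(n!)
    log_approx = 0.5 * math.log(2 * math.pi * n) + n * math.log(n) - n
    return math.exp(log_fact - log_approx)

for n in (1, 10, 100, 1000):
    print(n, stirling_ratio(n))
```

The ratios decrease toward $1$ as $n$ grows, consistent with the asymptotic equivalence ${\displaystyle n!\sim {\sqrt {2\pi n}}\,n^{n}e^{-n}}$.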

## References

1. Folland, Gerald B. Real Analysis: Modern Techniques and Their Applications, second edition, §2.3.
2. Craig, Katy. MATH 201A Lecture 15. UC Santa Barbara, Fall 2020.
3. Folland, Gerald B. Real Analysis: Modern Techniques and Their Applications, second edition, §2.3.28.
4. Craig, Katy. MATH 201A Lecture 15. UC Santa Barbara, Fall 2020.
5. Tao, Terence. "254A, Notes 0a: Stirling's Formula." What's New, 2 January 2010.