Lecture 14. C) Convergence in Distribution
Convergence in Distribution
A sequence of random variables [math]X_{n}[/math] converges in distribution to a random variable [math]Y[/math] if, at all continuity points of [math]F_{Y}\left(y\right)[/math],
[math]\lim_{n\rightarrow\infty}\,F_{X_{n}}\left(y\right)=F_{Y}\left(y\right)[/math]
Convergence in distribution is very intuitive: as [math]n\rightarrow\infty[/math], the CDF of [math]X_{n}[/math] converges pointwise to that of [math]Y[/math] (at the continuity points of [math]F_{Y}[/math]).
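For a quick numerical illustration (a sketch, not part of the lecture), take [math]X_{n}=Y+\frac{1}{n}[/math] with [math]Y[/math] standard normal, so that [math]F_{X_{n}}\left(y\right)=\Phi\left(y-\frac{1}{n}\right)\rightarrow\Phi\left(y\right)[/math] at every point. The Python snippet below (assuming SciPy is available) tabulates this convergence at a single point:

```python
from scipy.stats import norm

# Sketch: X_n = Y + 1/n with Y ~ N(0, 1), so F_{X_n}(y) = Phi(y - 1/n).
# As n grows, F_{X_n}(y) approaches F_Y(y) = Phi(y) at every point y.
y = 0.5
for n in (1, 10, 100, 1000):
    print(f"n={n:5d}  F_Xn(y)={norm.cdf(y - 1.0/n):.6f}  F_Y(y)={norm.cdf(y):.6f}")
```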
Example
The reason we require the limit condition to hold only at the continuity points of [math]F_{Y}[/math] can be illustrated with an example (the full justification is a bit deeper, but we won't concern ourselves with that here).
Let [math]f_{X_{n}}\left(x\right)=\begin{cases} n, & 0\lt x\lt \frac{1}{n}\\ 0, & \text{otherwise} \end{cases}[/math]
Clearly, as [math]n\rightarrow\infty[/math], [math]X_{n}[/math] is approaching a point mass at zero, so we would like it to converge in distribution to [math]Y[/math], where [math]P\left(Y=0\right)=1[/math]. However, [math]F_{X_{n}}\left(0\right)[/math] is zero for all [math]n[/math], and so [math]\lim_{n\rightarrow\infty}F_{X_{n}}\left(0\right)\neq F_{Y}\left(0\right)=1[/math].
Because the limit condition is not required to hold at [math]y=0[/math], which is a discontinuity point of [math]F_{Y}[/math], convergence in distribution is still satisfied, i.e., [math]X_{n}\overset{d}{\rightarrow}Y[/math].
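A minimal numerical check of this example (a sketch, assuming [math]X_{n}[/math] is uniform on [math]\left(0,\frac{1}{n}\right)[/math], so [math]F_{X_{n}}\left(y\right)=\min\left(\max\left(ny,0\right),1\right)[/math]): at [math]y=0[/math] the CDFs stay at zero for every [math]n[/math], while at any fixed [math]y\gt 0[/math] they tend to [math]1=F_{Y}\left(y\right)[/math].

```python
def F_Xn(y, n):
    # CDF of X_n ~ Uniform(0, 1/n): 0 for y <= 0, n*y on (0, 1/n), 1 for y >= 1/n.
    return min(max(n * y, 0.0), 1.0)

for n in (1, 10, 100, 1000):
    # F_{X_n}(0) is 0 for every n (so it never reaches F_Y(0) = 1),
    # while F_{X_n}(y) -> 1 = F_Y(y) at any continuity point y > 0.
    print(f"n={n:5d}  F_Xn(0)={F_Xn(0.0, n):.2f}  F_Xn(0.01)={F_Xn(0.01, n):.2f}")
```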