Lecture 5. B) Chebychev's Inequality

From Significant Statistics

Chebychev's Inequality

This inequality establishes a bound on the probability that an r.v. deviates far from its mean (and, more generally, on tail probabilities of positive transformations of the r.v.). Suppose [math]X[/math] is an r.v. Then, for any [math]r\gt 0[/math] and any [math]g:\mathbb{R}\rightarrow\mathbb{R}_{++}[/math],

[math]P\left(g\left(X\right)\geq r\right)\leq\frac{E\left(g\left(X\right)\right)}{r}[/math].
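As a quick sanity check, the bound can be verified by simulation. The following sketch uses purely illustrative choices (X distributed as Exponential(1), [math]g\left(x\right)=x^{2}[/math], [math]r=4[/math]) and compares the empirical tail probability with the empirical bound:

```python
import random

random.seed(0)  # reproducibility

# Illustrative choices: X ~ Exponential(1), g(x) = x**2, r = 4.
g = lambda x: x * x
r = 4.0

n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]

# Empirical left-hand side: P(g(X) >= r)
lhs = sum(g(x) >= r for x in samples) / n

# Empirical right-hand side: E(g(X)) / r
rhs = sum(g(x) for x in samples) / (n * r)

# The bound P(g(X) >= r) <= E(g(X)) / r should hold (up to sampling noise)
assert lhs <= rhs
```

For Exponential(1), the exact values are [math]P\left(X^{2}\geq4\right)=e^{-2}\approx0.135[/math] and [math]E\left(X^{2}\right)/4=0.5[/math], so the bound holds with room to spare.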

Proof

The proof is relatively simple. Note that the first line below holds trivially when the indicator is zero (because [math]g\gt 0[/math]), and reduces to [math]r\leq g\left(x\right)[/math] when the indicator is one:

[math]\begin{aligned} & \forall x\in\mathbb{R},\,r\cdot1\left(g\left(x\right)\geq r\right)\leq g\left(x\right)\\ \Leftrightarrow & \forall x\in\mathbb{R},\,1\left(g\left(x\right)\geq r\right)\leq\frac{g\left(x\right)}{r}\\ \Rightarrow & \underset{=P\left(g\left(X\right)\geq r\right)}{\underbrace{E\left[1\left(g\left(X\right)\geq r\right)\right]}}\leq E\left(\frac{g\left(X\right)}{r}\right)=\frac{E\left(g\left(X\right)\right)}{r}\end{aligned}[/math]
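The pointwise inequality in the first line can be checked directly for a few values; the choices of [math]g[/math] and [math]r[/math] below are purely illustrative:

```python
# Pointwise check of the key step in the proof:
# r * 1(g(x) >= r) <= g(x), for strictly positive g.
# The choices of g and r are purely illustrative.
g = lambda x: abs(x) + 0.1   # a strictly positive function
r = 2.0

for x in [-5.0, -1.0, 0.0, 0.5, 2.0, 10.0]:
    indicator = 1.0 if g(x) >= r else 0.0
    lhs = r * indicator
    # If the indicator is 0, lhs = 0 <= g(x); if 1, then g(x) >= r = lhs.
    assert lhs <= g(x)
```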

Implications

From this result, we can derive some popular implications:

  • [math]P\left(\left|X\right|\geq r\right)\leq\frac{E\left(\left|X\right|\right)}{r},\,\forall r\gt 0[/math] - here, [math]g\left(x\right)=\left|x\right|[/math]. This is often referred to as Markov’s inequality.
  • [math]P\left(\left|X-\mu\right|\geq r\right)\leq\frac{E\left(\left|X-\mu\right|\right)}{r},\,\forall r\gt 0[/math] - here, [math]g\left(x\right)=\left|x-\mu\right|[/math], where [math]\mu=E\left(X\right)[/math].
  • From the previous inequality, we can obtain an expression that bounds the probability of deviations from the mean in terms of the variance:

[math]\begin{aligned} & P\left(\left|X-\mu\right|\geq\varepsilon\right)\\ = & P\left(\left(X-\mu\right)^{2}\geq\varepsilon^{2}\right)\leq\frac{E\left(\left(X-\mu\right)^{2}\right)}{\varepsilon^{2}}=\frac{Var\left(X\right)}{\varepsilon^{2}}\end{aligned}[/math] Here we applied the general inequality with [math]g\left(x\right)=\left(x-\mu\right)^{2}[/math] and [math]r=\varepsilon^{2}[/math]. This yields [math]P\left(\left|X-\mu\right|\geq\varepsilon\right)\leq\frac{Var\left(X\right)}{\varepsilon^{2}},\,\forall\varepsilon\gt 0[/math], the most common form of Chebychev's inequality.

We have thus established a bound on how much [math]X[/math] can vary around its mean, as a function of its variance.
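The variance form of the bound can also be checked by simulation. The sketch below uses a standard normal as an illustrative choice and confirms that the empirical tail probabilities sit below the bound [math]Var\left(X\right)/\varepsilon^{2}[/math]:

```python
import random

random.seed(1)  # reproducibility

# Illustrative choice: X ~ Normal(0, 1), so mu = 0 and Var(X) = 1.
mu, var = 0.0, 1.0
n = 200_000
samples = [random.gauss(mu, 1.0) for _ in range(n)]

for eps in (1.0, 2.0, 3.0):
    tail = sum(abs(x - mu) >= eps for x in samples) / n
    bound = var / eps ** 2
    # Chebychev: P(|X - mu| >= eps) <= Var(X) / eps**2
    assert tail <= bound
```

For the normal distribution the bound is quite loose: at [math]\varepsilon=2[/math] the true tail probability is about 0.046, against a Chebychev bound of 0.25. The strength of the inequality is that it requires no distributional assumptions beyond a finite variance.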