Chernoff's bound

James Zou · March 25, 2013

If you have some randomness in your life, chances are that you want to try Chernoff's bound. The most common way to understand randomness is a 2-step combo: find the average behavior, and show that reality is unlikely to deviate too much from that expectation (via Chernoff's bound or one of its cousins).

My favorite form of Chernoff's bound is: for $X_1, ..., X_n$ independent binary random variables, and $X=\sum_{i=1}^n X_i$, $\mu = E[X]$ and $\delta > 2e-1$, then

\[P(X > (1+\delta)\mu) < 2^{-\delta \mu}.\]

Note that the $X_i$'s are not necessarily identically distributed; they just have to be independent. In practice, we often care about significant deviations from the mean, so $\delta$ is typically larger than $2e-1$ and the bound applies.
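To get a feel for the bound, here is a minimal Python sketch (the probabilities `p`, the size `n`, and the choice `delta = 5` are all illustrative, not from the post). Since the $X_i$'s need not be identical, it computes the exact distribution of $X$ via the standard Poisson-binomial recursion and checks the tail against $2^{-\delta\mu}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Independent but NOT identically distributed binary variables:
# X_i = 1 with probability p[i], so mu = E[X] = sum(p).
n = 100
p = rng.uniform(0.01, 0.05, size=n)
mu = p.sum()
delta = 5.0  # needs delta > 2e - 1 ~= 4.44

# Exact distribution of X = sum(X_i) (a Poisson-binomial distribution),
# built up one variable at a time: after folding in X_i,
# dist[k] = P(X_1 + ... + X_i = k).
dist = np.zeros(n + 1)
dist[0] = 1.0
for pi in p:
    dist[1:] = dist[1:] * (1 - pi) + dist[:-1] * pi
    dist[0] *= 1 - pi

tail = dist[np.arange(n + 1) > (1 + delta) * mu].sum()
print(f"mu = {mu:.3f}")
print(f"exact P(X > (1+delta)mu) = {tail:.3e}")
print(f"bound 2^(-delta*mu)      = {2.0 ** (-delta * mu):.3e}")
```

The exact tail comes out far below $2^{-\delta\mu}$, which is typical: this simplified form trades tightness for convenience.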

In standard applications, the stochastic system has size $N$ and an event of interest, $X$, has expectation $\mu=O(\log N)$. The probability that any one event deviates significantly is then inverse polynomial in the size of the whole system:

\[P(X > (1+\delta)\mu) < \frac{1}{N^{\delta}}.\]
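To see where the polynomial comes from, take $\mu = \log_2 N$ for concreteness (any $\mu = O(\log N)$ only changes the constant in the exponent):

\[P(X > (1+\delta)\mu) < 2^{-\delta \mu} = 2^{-\delta \log_2 N} = N^{-\delta}.\]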

This is important since the total number of events is polynomial in $N$ for many settings. By a simple union bound, once $\delta$ exceeds the degree of that polynomial by one,

\[P(\mbox{any } X > (1+\delta)\mu) = O(\frac{1}{N}).\]

So the chance of a significant deviation in any event is small!
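For a concrete instance (the numbers here are chosen for illustration): if the events are indexed by pairs out of $N$ items, there are fewer than $N^2$ of them; taking $\delta = 5 > 2e-1$, the union bound gives

\[P(\mbox{any } X > 6\mu) < N^2 \cdot \frac{1}{N^{5}} = \frac{1}{N^{3}}.\]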
