Definition
Bounding probabilities means deriving upper or lower bounds on probabilities when the full distribution is unknown, using only coarse information such as the mean or variance.
Markov inequality
Let $X$ be a nonnegative random variable and let $a > 0$. Then
$$P(X \ge a) \le \frac{E[X]}{a}.$$
This is useful when you only know the expected value of a nonnegative random variable.
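As a quick numerical sanity check, the inequality can be tested by simulation. The sketch below uses an Exponential(1) random variable, which is nonnegative with $E[X] = 1$; the distribution, sample size, and threshold are illustrative choices, not part of these notes.

```python
import random

# Empirical check of Markov's inequality for a nonnegative random variable.
# The Exponential(1) distribution, sample size, and threshold a are
# illustrative choices, not part of the notes.
random.seed(0)
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]  # E[X] = 1

a = 3.0
empirical = sum(x >= a for x in samples) / n  # estimate of P(X >= a)
markov_bound = 1.0 / a                        # E[X] / a

print(f"P(X >= {a}) ~ {empirical:.4f} <= {markov_bound:.4f}")
```

The true tail here is $e^{-3} \approx 0.05$, so the bound $1/3$ holds but is loose; Markov only uses the mean.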
Proof
Since $X \ge a$ implies $\tfrac{X}{a} \ge 1$, and $\tfrac{X}{a} \ge 0$ always, we have
$$\mathbf{1}\{X \ge a\} \le \frac{X}{a}.$$
Taking expectations gives
$$P(X \ge a) = E\left[\mathbf{1}\{X \ge a\}\right] \le \frac{E[X]}{a}.$$
Chebyshev inequality
Let $X$ be a random variable with mean $\mu$ and finite variance $\sigma^2$. Then for every $a > 0$,
$$P(|X - \mu| \ge a) \le \frac{\sigma^2}{a^2}.$$
Proof (Derivation from Markov)
Apply Markov’s inequality to the nonnegative random variable $(X - \mu)^2$ with threshold $a^2$:
$$P(|X - \mu| \ge a) = P\left((X - \mu)^2 \ge a^2\right) \le \frac{E[(X - \mu)^2]}{a^2} = \frac{\sigma^2}{a^2}.$$
Example
If $X$ has mean $\mu$ and variance $\sigma^2$, then taking $a = k\sigma$ gives
$$P(|X - \mu| \ge k\sigma) \le \frac{1}{k^2}.$$
For example, with $k = 2$, $X$ lies within $2\sigma$ of its mean with probability at least $3/4$.
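A simulation sketch of the two-sided bound; the standard normal distribution and the threshold $a = 2\sigma$ are illustrative choices, not from these notes.

```python
import random

# Empirical check of Chebyshev's inequality.
# Normal(0, 1) and a = 2*sigma are illustrative choices, not from the notes.
random.seed(0)
n = 100_000
mu, sigma = 0.0, 1.0
samples = [random.gauss(mu, sigma) for _ in range(n)]

a = 2 * sigma
empirical = sum(abs(x - mu) >= a for x in samples) / n  # P(|X - mu| >= a)
chebyshev_bound = sigma**2 / a**2                       # = 1/4

print(f"P(|X - mu| >= {a}) ~ {empirical:.4f} <= {chebyshev_bound:.4f}")
```

For a normal distribution the true two-sided tail at $2\sigma$ is about $0.046$, far below the distribution-free bound $1/4$ — Chebyshev holds for every distribution with this mean and variance, so it cannot be tight for all of them.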
Chernoff bounds
Chernoff bounds give much sharper concentration results for sums of independent Bernoulli random variables (see Bernoulli distribution).
Let $X = \sum_{i=1}^{n} X_i$, where the $X_i \sim \mathrm{Bernoulli}(p_i)$ are independent, and let $\mu = E[X] = \sum_{i=1}^{n} p_i$. Then:
1. $P(X \ge (1+\delta)\mu) \le e^{-\mu\delta^2/3}$ for $0 < \delta \le 1$
2. $P(X \le (1-\delta)\mu) \le e^{-\mu\delta^2/2}$ for $0 < \delta < 1$
3. $P(X \ge (1+\delta)\mu) \le \left(\dfrac{e^{\delta}}{(1+\delta)^{1+\delta}}\right)^{\mu}$ for all $\delta > 0$
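For a concrete sense of sharpness, the upper-tail bounds can be compared with an exact binomial tail. The sketch below checks $P(X \ge (1+\delta)\mu) \le \left(e^{\delta}/(1+\delta)^{1+\delta}\right)^{\mu} \le e^{-\mu\delta^2/3}$ for $X \sim \mathrm{Bin}(n, p)$; the values $n = 100$, $p = 1/2$, $\delta = 0.4$ are illustrative choices.

```python
import math

# Exact Bin(n, p) upper tail vs. two Chernoff upper-tail bounds.
# n, p, delta are illustrative choices, not from the notes.
n, p, delta = 100, 0.5, 0.4
mu = n * p

# Exact P(X >= (1 + delta) * mu) for X ~ Bin(n, p).
k0 = math.ceil((1 + delta) * mu)
exact = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k0, n + 1))

simple_bound = math.exp(-mu * delta**2 / 3)                   # e^{-mu delta^2 / 3}
mgf_bound = (math.exp(delta) / (1 + delta)**(1 + delta))**mu  # MGF-form bound

print(f"exact: {exact:.3e}, MGF-form bound: {mgf_bound:.3e}, "
      f"simplified bound: {simple_bound:.3e}")
```

The MGF-form bound is tighter than the simplified exponential form, and both dominate the exact tail.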
Proof of 3
Apply Markov’s inequality to the nonnegative random variable $e^{tX}$ with $t > 0$. There is
$$P(X \ge (1+\delta)\mu) = P\left(e^{tX} \ge e^{t(1+\delta)\mu}\right) \le \frac{E[e^{tX}]}{e^{t(1+\delta)\mu}}.$$
The $X_i$ are independent $\Rightarrow$ the $e^{tX_i}$ are independent, so
$$E[e^{tX}] = \prod_{i=1}^{n} E[e^{tX_i}] = \prod_{i=1}^{n} \left(1 + p_i(e^t - 1)\right) \le \prod_{i=1}^{n} e^{p_i(e^t - 1)} = e^{\mu(e^t - 1)}$$
(using $1 + x \le e^x$). Choosing $t = \ln(1+\delta)$ (note that $t > 0$ since $\delta > 0$) gives
$$P(X \ge (1+\delta)\mu) \le \frac{e^{\mu\delta}}{(1+\delta)^{(1+\delta)\mu}} = \left(\frac{e^{\delta}}{(1+\delta)^{1+\delta}}\right)^{\mu}.$$
NOTE
To prove 1 and 2, apply Markov’s inequality to $e^{tX}$ and $e^{-tX}$, respectively, and optimize over $t > 0$ in the same way.
Example
Let $X = \sum_{i=1}^{n} X_i$ with the $X_i \sim \mathrm{Bernoulli}(1/2)$ independent, so $\mu = n/2$. Take $\delta = 1/2$. Then bound 1 gives
$$P\left(X \ge \tfrac{3n}{4}\right) \le e^{-\mu\delta^2/3} = e^{-n/24}.$$
So the probability of being at least $\delta\mu = n/4$ above the mean is already exponentially small in $n$.
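The example can be checked numerically for a concrete $n$; the sketch below evaluates the simplified upper-tail bound $P(X \ge (1+\delta)\mu) \le e^{-\mu\delta^2/3}$ against the exact binomial tail, with $n = 100$, $p = 1/2$, $\delta = 1/2$ as illustrative choices.

```python
import math

# Exact binomial tail vs. the simplified Chernoff bound e^{-mu*delta^2/3}.
# n, p, delta are illustrative choices, not fixed by the notes.
n, p, delta = 100, 0.5, 0.5
mu = n * p                               # 50
threshold = math.ceil((1 + delta) * mu)  # 75 = 3n/4

exact = sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
            for k in range(threshold, n + 1))
chernoff = math.exp(-mu * delta**2 / 3)  # e^{-n/24}

print(f"exact: {exact:.3e}, Chernoff bound: {chernoff:.3e}")
```

For comparison, Chebyshev’s inequality with $a = n/4$ and $\sigma^2 = n/4$ gives only $\sigma^2/a^2 = 4/n$ — polynomial decay, versus the exponential decay of the Chernoff bound.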