
Tuesday, April 22, 2014

Bernoulli Process and Coin Tosses

The Distribution

[Image: Jacob Bernoulli]
The Bernoulli distribution is an example of a discrete distribution. The paradigm is to observe the result X of a coin toss, where X is 1 if the result is heads and 0 otherwise. By definition, X then has a Bernoulli distribution. This distribution is a good model for any experiment where we want to know whether a particular event happened or not.

Going back to our coin, where X takes two possible values, X ∈ {0, 1}, the probabilities for each value are defined by:

P(X = 0) = 1 − p,  P(X = 1) = p

In the case of an unbiased coin, p = 0.5 and both possible events have the same chance of occurrence.
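To make this concrete, here is a minimal Python sketch that simulates such a coin toss; the function name bernoulli_toss and the default p = 0.5 are just choices for this example:

import random

def bernoulli_toss(p=0.5):
    # Return 1 (heads) with probability p, 0 (tails) otherwise
    return 1 if random.random() < p else 0

# Ten tosses of an unbiased coin
print([bernoulli_toss() for _ in range(10)])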

Moments of the Bernoulli Distribution

[Figure: Variance for the Bernoulli distribution as a function of p]
The first moment, or expected value, of the distribution is:

<X> = 0⋅(1 − p) + 1⋅p = p

The second moment, or expected value of the square of X, is:

<X²> = 0²⋅(1 − p) + 1²⋅p = p

Therefore, the variance of a variable distributed according to the Bernoulli distribution is:

σ² = <X²> − <X>² = p − p² = p⋅(1 − p)
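As a quick sanity check, both moments can be estimated from simulated tosses and compared against the formulas above; the sample size n = 100_000 and the value p = 0.3 below are arbitrary choices for this sketch:

import random

p = 0.3
n = 100_000
samples = [1 if random.random() < p else 0 for _ in range(n)]

mean = sum(samples) / n                          # estimate of <X>
var = sum((x - mean) ** 2 for x in samples) / n  # estimate of σ²

print(f"sample mean     = {mean:.4f}  (theory: {p})")
print(f"sample variance = {var:.4f}  (theory: {p * (1 - p):.4f})")

For large n, both estimates should land close to p and p⋅(1 − p), respectively.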

Considering the variance as a function of p:

f(p) = p⋅(1 − p)

It is easy to see that the value of p that maximizes the variance (p_max) satisfies:

f′(p_max) = 0 = −2⋅p_max + 1  ∴  p_max = 0.5

Here f′(p_max) stands for the derivative of f with respect to p, evaluated at p_max. Since f″(p) = −2 < 0, this critical point is indeed a maximum: the variance is largest for an unbiased coin.
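The same conclusion can be checked numerically by evaluating f(p) on a grid; the step size of 0.001 is an arbitrary choice for this sketch:

# Evaluate f(p) = p*(1 - p) on a grid and locate its maximum
ps = [i / 1000 for i in range(1001)]
values = [p * (1 - p) for p in ps]
p_max = ps[values.index(max(values))]
print(p_max, max(values))  # expected: 0.5 0.25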