The Distribution
[Image: Jacob Bernoulli]
Going back to our coin, where X takes one of two possible values, X ∈ {0, 1}, the probability of each value is defined by:
P(X = 0) = 1 − p, P(X = 1) = p
In the case of an unbiased coin, p = 0.5 and both possible events have the same chance of occurrence.
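As a quick numerical illustration, this PMF can be checked by simulation. Below is a minimal sketch using Python's standard random module; the sample size and seed are arbitrary choices, not part of the original text.

import random

def bernoulli_sample(p, n, seed=0):
    """Draw n Bernoulli(p) samples: 1 with probability p, 0 otherwise."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(n)]

p = 0.5  # unbiased coin, as in the text
samples = bernoulli_sample(p, 100_000)

# Empirical frequencies should approach P(X = 1) = p and P(X = 0) = 1 - p.
freq_one = sum(samples) / len(samples)
print(freq_one)       # ~0.5
print(1 - freq_one)   # ~0.5

With p = 0.5 both frequencies come out near 0.5, matching the unbiased case above.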
Moments of the Bernoulli Distribution
The first moment, or expected value of X, is:
<X> = 0⋅(1 − p) + 1⋅p = p
The second moment, or expected value of the square of X, is:
<X²> = 0²⋅(1 − p) + 1²⋅p = p
Therefore, the variance for a variable distributed according to the Bernoulli distribution is:
σ² = <X²> − <X>² = p − p² = p⋅(1 − p)
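These formulas are easy to sanity-check by simulation. The sketch below reuses the bernoulli_sample helper defined earlier; the biased value p = 0.3 is an assumed example, not from the text.

p = 0.3  # assumed example value
samples = bernoulli_sample(p, 100_000)
n = len(samples)

m1 = sum(samples) / n                   # sample estimate of <X>
m2 = sum(x ** 2 for x in samples) / n   # sample estimate of <X²>
var = m2 - m1 ** 2                      # variance via the moment formula

# Note that for 0/1 outcomes x² = x, which is exactly why <X²> = <X> = p.
print(m1, p)               # both ~0.3
print(var, p * (1 - p))    # both ~0.21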
Considering the variance as a function of p:
f(p) = p⋅(1 − p)
Setting the derivative to zero gives the value of p that makes the variance a maximum (p_max):
f'(p_max) = 1 − 2⋅p_max = 0 ∴ p_max = 0.5
where f'(p_max) denotes the derivative of f with respect to p evaluated at p_max. Since f''(p) = −2 < 0, this critical point is indeed a maximum.
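The same conclusion can be reached numerically by evaluating f on a grid, as in this small sketch (the grid resolution of 0.001 is an arbitrary choice):

f = lambda p: p * (1 - p)

# Scan p over [0, 1] in steps of 0.001 and pick the maximizer.
grid = [i / 1000 for i in range(1001)]
p_max = max(grid, key=f)
print(p_max, f(p_max))   # 0.5 0.25

The maximum variance, 0.25, occurs at p = 0.5: the fair coin is the most unpredictable Bernoulli variable.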