Intersection of Several Events
By now you have seen many examples of the following kind:
A deck of cards consists of 26 red cards and 26 black cards. Three cards are drawn at random without replacement. What is the chance that they are all black?
You will be quick to answer $$ \frac{26}{52} \cdot \frac{25}{51} \cdot \frac{24}{50} $$
and you will be right. But where does that multiplication of three factors come from? The multiplication rule says only that if $A$ and $B$ are two events, then
$$ P(AB) = P(A)P(B \mid A) $$

Can we just go ahead and extend it to three or more events, and if so, exactly how are we extending it?
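Before answering, it is worth confirming the three-factor answer numerically. Here is a minimal simulation sketch, assuming a standard deck represented as 26 black and 26 red labels; the names `deck` and `trials` are just illustrative choices.

```python
import random

# Exact value from the three-factor product
exact = (26/52) * (25/51) * (24/50)

# Monte Carlo check: deal 3 cards without replacement, many times
deck = ['black'] * 26 + ['red'] * 26
trials = 100_000
hits = sum(
    all(card == 'black' for card in random.sample(deck, 3))
    for _ in range(trials)
)

print(f"exact: {exact:.4f}, simulated: {hits / trials:.4f}")
```

The simulated proportion should land close to the exact value of about 0.1176, which is reassuring. But it doesn't explain why the product of three factors is the right answer.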
The third factor in the product above is the conditional chance that the third card is black given that the first two cards were black. This suggests an extension of the multiplication rule, used by us without discussion and without proof in many calculations so far:
$$ P(A_1A_2 \ldots A_n) = P(A_1)P(A_2 \mid A_1) P(A_3 \mid A_1A_2)P(A_4 \mid A_1A_2A_3) \cdots P(A_n \mid A_1A_2 \ldots A_{n-1}) $$

Let's prove this by induction. While it might seem a little pedantic, it's a good idea to develop skills that you can use to solidify the foundations of steps that are based on intuition.
We will start in the case $n=3$ to see what is going on. First notice that
$$ P(A_2A_3 \mid A_1) = \frac{P(A_1A_2A_3)}{P(A_1)} = \frac{P(A_1A_2)P(A_3 \mid A_1A_2)}{P(A_1)} = P(A_2 \mid A_1)P(A_3 \mid A_1A_2) $$

This is just like the ordinary multiplication rule for the intersection of the two events $A_2$ and $A_3$, except that all the probabilities are also conditional given $A_1$.
And now
\begin{align*} P(A_1A_2A_3) &= P(A_1)P(A_2A_3 \mid A_1) ~~~ \text{(multiplication rule applied to }A_1 \text{ and } A_2A_3 \text{)} \\ &= P(A_1)P(A_2 \mid A_1)P(A_3 \mid A_1A_2) ~~~ \text{(we just proved this)} \end{align*}

So our result is true for $n=3$.
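To see the $n=3$ identity concretely, here is a small exhaustive check on a hypothetical toy deck of 3 black and 3 red cards, with $A_i$ the event that the $i$th card drawn is black. Every ordered draw is equally likely, so probabilities are just proportions of outcomes.

```python
from itertools import permutations

# Toy deck: 3 black, 3 red; all ordered 3-card draws are equally likely
deck = ['B', 'B', 'B', 'R', 'R', 'R']
outcomes = list(permutations(deck, 3))   # 6*5*4 = 120 outcomes

def prob(event):
    """Proportion of outcomes on which the event occurs."""
    return sum(event(o) for o in outcomes) / len(outcomes)

A1     = lambda o: o[0] == 'B'
A1A2   = lambda o: o[0] == 'B' and o[1] == 'B'
A1A2A3 = lambda o: o[0] == 'B' and o[1] == 'B' and o[2] == 'B'

# Left side: P(A1 A2 A3).  Right side: P(A1) P(A2 | A1) P(A3 | A1 A2),
# with each conditional chance computed as a ratio of probabilities.
lhs = prob(A1A2A3)
rhs = prob(A1) * (prob(A1A2) / prob(A1)) * (prob(A1A2A3) / prob(A1A2))
print(lhs, rhs)   # both 0.05, which is (3/6)(2/5)(1/4)
```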
To prove it for all positive integers, first assume the induction hypothesis that the result is true for $n$:
$$ P(A_1A_2 \ldots A_n) = P(A_1)P(A_2 \mid A_1) P(A_3 \mid A_1A_2) \cdots P(A_n \mid A_1A_2 \ldots A_{n-1}) $$

And then show that it is true for $n+1$:
\begin{align*} & P(A_1A_2 \ldots A_nA_{n+1}) \\ &= P(A_1)P(A_2 \mid A_1) P(A_3 \mid A_1A_2) \cdots P(A_n \mid A_1A_2 \ldots A_{n-1}) P(A_{n+1} \mid A_1A_2 \ldots A_{n-1}A_n) \end{align*}

The induction relies on treating $A_1A_2 \ldots A_nA_{n+1}$ as the intersection of two events: $B_n = A_1A_2 \ldots A_n$ and $A_{n+1}$. Moves like this are used in many induction proofs of results about chances of intersections and unions.
Now
\begin{align*} P(A_1A_2 \ldots A_{n+1}) &= P(B_nA_{n+1}) \\ &= P(B_n)P(A_{n+1} \mid B_n) ~~~ \text{(multiplication rule)} \\ &= P(A_1A_2 \ldots A_n)P(A_{n+1} \mid A_1A_2 \ldots A_n) ~~~ \text{(definition of } B_n \text{)} \\ &= P(A_1)P(A_2 \mid A_1) P(A_3 \mid A_1A_2) \cdots P(A_n \mid A_1A_2 \ldots A_{n-1})P(A_{n+1} \mid A_1A_2 \ldots A_n) \end{align*}

by the induction hypothesis. Done!
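In code, the general rule turns a hard-looking intersection into a running product of conditional chances. The sketch below (a hypothetical helper, not from the text) computes the chance that the first $n$ cards dealt from a standard deck are all black.

```python
def p_all_black(n, black=26, total=52):
    """Chain rule: multiply P(A_k | A_1 ... A_{k-1}) for k = 1, ..., n."""
    p = 1.0
    for k in range(n):
        # Given the first k cards were black, black-k of total-k remain black
        p *= (black - k) / (total - k)
    return p

print(p_all_black(3))   # 0.1176..., matching (26/52)(25/51)(24/50)
```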
You shouldn't feel you have to have proved every single result that you use, especially when they arise naturally as properties of proportions. This section shows you that there are straightforward if laborious ways of establishing such results mathematically.
However, what is "natural" and what is correct are not always the same. Here is an example.
Pairwise and Mutual Independence
We have defined the independence of two events to mean that the chance of one doesn't change if you are told that the other has happened; $A$ and $B$ are independent means that
$$ P(B \mid A) = P(B) ~~~ \text{or equivalently,} ~~~ P(AB) = P(A)P(B) $$

Now suppose you have three events $A$, $B$, and $C$, and suppose each pair of them is independent by the definition above. That is, $A$ is independent of $B$, $A$ is independent of $C$, and $B$ is independent of $C$.
This is called pairwise independence, and you might be tempted to use it as a definition of independence of three events. But it doesn't quite work.
In a group of three people, let $B_{ij}$ be the event that Persons $i$ and $j$ have the same birthday, and let $B_{123}$ be the event that all three have the same birthday. Under the assumptions of randomness that we made for the classical Birthday Problem, we know that
$$ P(B_{12}) = \frac{1}{365} = P(B_{23}) = P(B_{13}) $$

and

$$ P(B_{12} B_{23}) = P(B_{123}) = \frac{1}{365} \cdot \frac{1}{365} = P(B_{12})P(B_{23}) $$
So $B_{12}$ and $B_{23}$ are independent. In the same way you can show that $B_{12}$ and $B_{13}$ are independent, as are $B_{23}$ and $B_{13}$. Thus the three events $B_{12}$, $B_{13}$, and $B_{23}$ are pairwise independent.
But $$ P(B_{13} \mid B_{12}B_{23}) = 1 \ne P(B_{13}) $$
Given that Persons 1 and 2 have the same birthday and that Persons 2 and 3 have the same birthday, there is no randomness left in whether Persons 1 and 3 have the same birthday – they just do. Information about the other two pairs affects the chance of $B_{13}$.
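You can verify both facts, the pairwise independence and the failure of the conditional chance to stay fixed, by exhaustive enumeration. Here is a sketch assuming a hypothetical 5-day "year" so that the whole outcome space is small enough to list; the arithmetic is identical with 365 days.

```python
from itertools import product

DAYS = 5   # hypothetical short year; use 365 for the real problem
outcomes = list(product(range(DAYS), repeat=3))   # (b1, b2, b3), equally likely

def prob(event, space):
    return sum(event(o) for o in space) / len(space)

B12 = lambda o: o[0] == o[1]
B23 = lambda o: o[1] == o[2]
B13 = lambda o: o[0] == o[2]

# Pairwise independence: P(B12 B23) = P(B12) P(B23) = (1/5)(1/5)
print(prob(B12, outcomes))                            # 0.2
print(prob(lambda o: B12(o) and B23(o), outcomes))    # 0.04

# But conditioning on B12 and B23 forces B13
given = [o for o in outcomes if B12(o) and B23(o)]
print(prob(B13, given))                               # 1.0, not 0.2
```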
This goes against what independence should mean, and shows that when we have more than two events we have to be careful about how we define independence.
Mutual Independence
Events $A_1, A_2, \ldots A_n$ are mutually independent (or independent for short) if given that any subset of the events has occurred, the conditional chances of all other subsets remain unchanged.
That's quite a mouthful. In practical terms it means that it doesn't matter which of the events you know have happened; chances involving the remaining events are unchanged.
In terms of random variables, $X_1, X_2, \ldots , X_n$ are independent if given the values of any subset, chances of events determined by the remaining variables are unchanged.
In practice, this just formalizes statements such as "results of different tosses of a coin are independent" or "draws made at random with replacement are independent".
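As a sanity check of mutual independence in the simplest setting, the sketch below enumerates the eight equally likely outcomes of three fair coin tosses and confirms that conditioning on the first two tosses doesn't change the chance of a head on the third.

```python
from itertools import product

outcomes = list(product('HT', repeat=3))   # 8 equally likely sequences

def prob(event, space):
    return sum(event(o) for o in space) / len(space)

third_is_H = lambda o: o[2] == 'H'
first_two_H = [o for o in outcomes if o[:2] == ('H', 'H')]

print(prob(third_is_H, outcomes))      # 0.5
print(prob(third_is_H, first_two_H))   # 0.5, unchanged by conditioning
```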
Try not to become inhibited by the formalism. Notice how the theory not only supports intuition but also develops it. You can expect your probabilistic intuition to be much sharper at the end of this course than it is now!