Intersection of Several Events


By now you have seen many examples of the following kind:

A deck of cards consists of 26 red cards and 26 black cards. Three cards are drawn at random without replacement. What is the chance that they are all black?

You will be quick to answer

$$\frac{26}{52} \cdot \frac{25}{51} \cdot \frac{24}{50}$$

and you will be right. But where does that multiplication of three factors come from? The multiplication rule says only that if $A$ and $B$ are two events, then

$$P(AB) = P(A)P(B \mid A)$$

Can we just go ahead and extend it to three or more events, and if so, exactly how are we extending it?

The third factor in the product above is the conditional chance that the third card is black given that the first two cards were black. This suggests an extension of the multiplication rule, which we have used without discussion and without proof in many calculations so far:

$$P(A_1A_2 \cdots A_n) = P(A_1)P(A_2 \mid A_1)P(A_3 \mid A_1A_2)P(A_4 \mid A_1A_2A_3) \cdots P(A_n \mid A_1A_2 \cdots A_{n-1})$$

Let's prove this by induction. While it might seem a little pedantic, it's a good idea to develop skills that you can use to solidify the foundations of steps that are based on intuition.

We will start with the case $n = 3$ to see what is going on. First notice that

$$P(A_2A_3 \mid A_1) = \frac{P(A_1A_2A_3)}{P(A_1)} = \frac{P(A_1A_2)P(A_3 \mid A_1A_2)}{P(A_1)} = P(A_2 \mid A_1)P(A_3 \mid A_1A_2)$$

This is just like the ordinary multiplication rule for the intersection of the two events $A_2$ and $A_3$, except that all the probabilities are also conditional given $A_1$.

And now

$$
\begin{align*}
P(A_1A_2A_3) &= P(A_1)P(A_2A_3 \mid A_1) ~~~~ \text{(multiplication rule applied to } A_1 \text{ and } A_2A_3\text{)} \\
&= P(A_1)P(A_2 \mid A_1)P(A_3 \mid A_1A_2) ~~~~ \text{(we just proved this)}
\end{align*}
$$

So our result is true for $n = 3$.
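The $n = 3$ result can also be checked numerically. Here is a quick Monte Carlo sketch of the three-card example from the start of the section; the deck encoding, trial count, and seed are arbitrary choices made for this illustration.

```python
import random

# Estimate the chance that three cards drawn without replacement are all
# black, and compare it to the three-factor product.
deck = ['B'] * 26 + ['R'] * 26
random.seed(8)  # fixed seed so the run is reproducible

trials = 100_000
hits = sum(random.sample(deck, 3) == ['B', 'B', 'B'] for _ in range(trials))

exact = (26 / 52) * (25 / 51) * (24 / 50)
print(hits / trials, exact)
```

The empirical proportion should land close to the exact value of about 0.1176.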

To prove it for all positive integers, first assume the induction hypothesis that the result is true for $n$:

$$P(A_1A_2 \cdots A_n) = P(A_1)P(A_2 \mid A_1)P(A_3 \mid A_1A_2) \cdots P(A_n \mid A_1A_2 \cdots A_{n-1})$$

And then show that it is true for $n+1$:

$$P(A_1A_2 \cdots A_nA_{n+1}) = P(A_1)P(A_2 \mid A_1)P(A_3 \mid A_1A_2) \cdots P(A_n \mid A_1A_2 \cdots A_{n-1})P(A_{n+1} \mid A_1A_2 \cdots A_n)$$

The induction relies on treating $A_1A_2 \cdots A_nA_{n+1}$ as the intersection of two events: $B_n = A_1A_2 \cdots A_n$ and $A_{n+1}$. Moves like this are used in many induction proofs of results about chances of intersections and unions.

Now

$$
\begin{align*}
P(A_1A_2 \cdots A_{n+1}) &= P(B_nA_{n+1}) \\
&= P(B_n)P(A_{n+1} \mid B_n) ~~~~ \text{(multiplication rule)} \\
&= P(A_1A_2 \cdots A_n)P(A_{n+1} \mid A_1A_2 \cdots A_n) ~~~~ \text{(definition of } B_n\text{)} \\
&= P(A_1)P(A_2 \mid A_1)P(A_3 \mid A_1A_2) \cdots P(A_n \mid A_1A_2 \cdots A_{n-1})P(A_{n+1} \mid A_1A_2 \cdots A_n)
\end{align*}
$$

by the induction hypothesis. Done!
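For draws without replacement, each conditional factor in the chain is just the proportion of black cards left in the deck, so the whole product can be computed exactly. The helper below is hypothetical, written for this sketch; it builds $P(A_1)P(A_2 \mid A_1) \cdots P(A_k \mid A_1 \cdots A_{k-1})$ using exact fractions.

```python
from fractions import Fraction

def chance_all_black(k, black=26, total=52):
    """P(first k cards drawn are all black), built as the chain
    P(A1) * P(A2 | A1) * ... * P(Ak | A1...A(k-1))."""
    p = Fraction(1)
    for i in range(k):
        # given i black cards already drawn, black - i of the
        # total - i remaining cards are black
        p *= Fraction(black - i, total - i)
    return p

print(chance_all_black(3))   # the three-card answer from the start of the section
```

For $k = 3$ this reduces to the product $\frac{26}{52} \cdot \frac{25}{51} \cdot \frac{24}{50}$.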

You shouldn't feel you have to prove every single result that you use, especially when they arise naturally as properties of proportions. This section shows you that there are straightforward if laborious ways of establishing such results mathematically.

However, what is "natural" and what is correct are not always the same. Here is an example.

Pairwise and Mutual Independence

We have defined the independence of two events to mean that the chance of one doesn't change if you are told that the other has happened: $A$ and $B$ are independent means that

$$P(B \mid A) = P(B) ~~~~ \text{or equivalently} ~~~~ P(AB) = P(A)P(B)$$

Now suppose you have three events $A$, $B$, and $C$, and suppose each pair of them is independent by the definition above. That is, $A$ is independent of $B$, $A$ is independent of $C$, and $B$ is independent of $C$.

This is called pairwise independence, and you might be tempted to use it as a definition of independence of three events. But it doesn't quite work.

In a group of three people, let $B_{ij}$ be the event that Persons $i$ and $j$ have the same birthday, and let $B_{123}$ be the event that all three have the same birthday. Under the assumptions of randomness that we made for the classical Birthday Problem, we know that

$$P(B_{12}) = \frac{1}{365} = P(B_{23}) = P(B_{13})$$

and

$$P(B_{12}B_{23}) = P(B_{123}) = \frac{1}{365} \cdot \frac{1}{365} = P(B_{12})P(B_{23})$$

So $B_{12}$ and $B_{23}$ are independent. In the same way you can show that $B_{12}$ and $B_{13}$ are independent, as are $B_{23}$ and $B_{13}$. Thus the three events $B_{12}$, $B_{13}$, and $B_{23}$ are pairwise independent.

But

$$P(B_{13} \mid B_{12}B_{23}) = 1 \ne P(B_{13})$$

Given that Persons 1 and 2 have the same birthday and that Persons 2 and 3 have the same birthday, there is no randomness left in whether Persons 1 and 3 have the same birthday: they just do. So information about the other two pairs affects the chance of $B_{13}$.
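You can confirm all of this by exact enumeration. The sketch below uses a toy "year" of $N = 5$ equally likely days, an assumption made only so that the full outcome space is small enough to list; the algebra is identical for $N = 365$.

```python
from fractions import Fraction
from itertools import product

N = 5
# all (b1, b2, b3) birthday triples, equally likely
outcomes = list(product(range(N), repeat=3))

def prob(event):
    """Exact probability of an event under the uniform distribution."""
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

B12 = lambda o: o[0] == o[1]   # Persons 1 and 2 share a birthday
B23 = lambda o: o[1] == o[2]
B13 = lambda o: o[0] == o[2]

# pairwise independence holds:
assert prob(lambda o: B12(o) and B23(o)) == prob(B12) * prob(B23)

# but conditioning on B12 and B23 forces B13:
cond = (prob(lambda o: B12(o) and B23(o) and B13(o))
        / prob(lambda o: B12(o) and B23(o)))
print(cond, prob(B13))
```

The conditional chance comes out to exactly 1, while the unconditional $P(B_{13})$ is $1/N$.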

This goes against what independence should mean, and shows that when we have more than two events we have to be careful about how we define independence.

Mutual Independence

Events $A_1, A_2, \ldots, A_n$ are mutually independent (or independent for short) if, given that any subset of the events has occurred, the conditional chances of all other subsets remain unchanged.

That's quite a mouthful. In practical terms it means that it doesn't matter which of the events you know have happened; chances involving the remaining events are unchanged.

In terms of random variables, $X_1, X_2, \ldots, X_n$ are independent if, given the values of any subset, chances of events determined by the remaining variables are unchanged.

In practice, this just formalizes statements such as "results of different tosses of a coin are independent" or "draws made at random with replacement are independent".
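The coin-tossing statement can be checked by the same enumeration technique used for the birthday example, again as a sketch. Here the chance of heads on the third toss really is unchanged by knowing the first two tosses, in contrast to $B_{13}$ above.

```python
from fractions import Fraction
from itertools import product

# all 2^3 equally likely sequences of three fair-coin tosses
outcomes = list(product('HT', repeat=3))

def prob(event):
    """Exact probability of an event under the uniform distribution."""
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

H3 = lambda o: o[2] == 'H'                       # third toss is heads
first_two_heads = lambda o: o[:2] == ('H', 'H')  # first two tosses are heads

# knowing the first two tosses leaves the chance of the third unchanged
cond = prob(lambda o: first_two_heads(o) and H3(o)) / prob(first_two_heads)
print(cond, prob(H3))
```

Both chances are exactly $1/2$, as independence requires.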

Try not to become inhibited by the formalism. Notice how the theory not only supports intuition but also develops it. You can expect your probabilistic intuition to be much sharper at the end of this course than it is now!

 
