Definition 4.1 (Independence)
Given a probability space $(\Omega, \mathcal{F}, \mathbb{P})$, two events $A, B \in \mathcal{F}$ are said to be independent if:
$$\mathbb{P}(A \cap B) = \mathbb{P}(A)\,\mathbb{P}(B).$$
A finite sequence of events, namely $A_1, \dots, A_n \in \mathcal{F}$, is said to be independent if for all $k \leq n$ and all indices $1 \leq i_1 < i_2 < \dots < i_k \leq n$ we have:
$$\mathbb{P}\big(A_{i_1} \cap A_{i_2} \cap \dots \cap A_{i_k}\big) = \prod_{j=1}^{k} \mathbb{P}(A_{i_j}).$$
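For a finite sample space with equally likely outcomes, this definition can be checked mechanically by comparing the probability of each sub-collection's intersection with the product of the individual probabilities. Below is a minimal Python sketch under that uniform assumption; the helper names `prob` and `are_independent` are illustrative and not part of the text.

```python
from itertools import combinations
from fractions import Fraction

def prob(event, omega):
    """P(E) under the uniform measure on the finite sample space omega."""
    return Fraction(len(event & omega), len(omega))

def are_independent(events, omega):
    """Check Definition 4.1: every sub-collection of the events must
    satisfy P(intersection) == product of the individual probabilities."""
    for k in range(2, len(events) + 1):
        for sub in combinations(events, k):
            joint = set(omega)
            product = Fraction(1)
            for e in sub:
                joint &= e
                product *= prob(e, omega)
            if prob(joint, omega) != product:
                return False
    return True

# Example usage: omega = {1, 2, 3, 4} with probability 1/4 each.
omega = set(range(1, 5))
A, B = {1, 2}, {1, 3}
print(are_independent([A, B], omega))   # True: P(A ∩ B) = 1/4 = P(A)·P(B)
```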
Consider the probability space $(\Omega, \mathcal{F}, \mathbb{P})$ describing the roll of two fair dice, where $\Omega = \{1, \dots, 6\}^2$ and for every $\omega \in \Omega$ we have a constant probability $\mathbb{P}(\{\omega\}) = \frac{1}{36}$. Consider the events $A = \{\text{the first die shows } 1, 2 \text{ or } 3\}$, $B = \{\text{the first die shows } 3, 4 \text{ or } 5\}$ and $C = \{\text{the sum of the two dice is } 9\}$: are these events independent? Note that the events have probabilities $\mathbb{P}(A) = \mathbb{P}(B) = \frac{1}{2}$, $\mathbb{P}(C) = \frac{4}{36} = \frac{1}{9}$. Consider all the events $A, B, C$: the intersection of those sets gives $A \cap B \cap C = \{(3, 6)\}$, which has probability $\frac{1}{36}$. Then we can compute the product of the probabilities of the single events:
$$\mathbb{P}(A)\,\mathbb{P}(B)\,\mathbb{P}(C) = \frac{1}{2} \cdot \frac{1}{2} \cdot \frac{1}{9} = \frac{1}{36} = \mathbb{P}(A \cap B \cap C).$$
Hence the events satisfy the product rule for the full intersection. Consider now only the events $A$ and $B$: the probability of the joint set, namely $A \cap B = \{\text{the first die shows } 3\}$, is $\frac{6}{36} = \frac{1}{6}$. However, the product of the probabilities of the single events gives a different result:
$$\mathbb{P}(A)\,\mathbb{P}(B) = \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4} \neq \frac{1}{6}.$$
Hence the events are not independent, since Definition 4.1 requires the product rule to hold for every sub-collection of the events.
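The two checks above can be reproduced by enumerating the finite sample space directly. The sketch below assumes the two-dice space and the events $A$, $B$, $C$ as described in the example; the variable names are illustrative.

```python
from fractions import Fraction

# Sample space of two fair dice; each outcome has probability 1/36.
omega = {(i, j) for i in range(1, 7) for j in range(1, 7)}

A = {(i, j) for (i, j) in omega if i in (1, 2, 3)}   # first die shows 1, 2 or 3
B = {(i, j) for (i, j) in omega if i in (3, 4, 5)}   # first die shows 3, 4 or 5
C = {(i, j) for (i, j) in omega if i + j == 9}       # sum of the two dice is 9

def prob(event):
    return Fraction(len(event), len(omega))

# Product rule holds for the full intersection ...
print(prob(A & B & C), prob(A) * prob(B) * prob(C))  # 1/36  1/36
# ... but fails for the pair (A, B), so the events are not independent.
print(prob(A & B), prob(A) * prob(B))                # 1/6   1/4
```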
Proposition 4.1 (Independence of complements)
If two events $A$ and $B$ are independent, then also $A^c$ and $B$, $A$ and $B^c$, and $A^c$ and $B^c$ are independent.
Proof. In order to prove that $A^c$ and $B$ are independent, let's write the event $B$ as a union of disjoint events (Equation 1.9):
$$B = (A \cap B) \cup (A^c \cap B).$$
Then, since $A$ and $B$ are assumed to be independent:
$$\mathbb{P}(B) = \mathbb{P}(A \cap B) + \mathbb{P}(A^c \cap B) = \mathbb{P}(A)\,\mathbb{P}(B) + \mathbb{P}(A^c \cap B).$$
Recovering $\mathbb{P}(A^c \cap B)$ one obtains:
$$\mathbb{P}(A^c \cap B) = \mathbb{P}(B) - \mathbb{P}(A)\,\mathbb{P}(B) = \big(1 - \mathbb{P}(A)\big)\,\mathbb{P}(B) = \mathbb{P}(A^c)\,\mathbb{P}(B).$$
The same follows for $A$ and $B^c$. Now let's consider the case of $A^c$ and $B^c$. Using the same trick done previously, $B^c = (A \cap B^c) \cup (A^c \cap B^c)$. Since we have already proven that $A$ and $B^c$ are independent, we can write:
$$\mathbb{P}(B^c) = \mathbb{P}(A \cap B^c) + \mathbb{P}(A^c \cap B^c) = \mathbb{P}(A)\,\mathbb{P}(B^c) + \mathbb{P}(A^c \cap B^c).$$
Recovering $\mathbb{P}(A^c \cap B^c)$ one obtains:
$$\mathbb{P}(A^c \cap B^c) = \mathbb{P}(B^c) - \mathbb{P}(A)\,\mathbb{P}(B^c) = \big(1 - \mathbb{P}(A)\big)\,\mathbb{P}(B^c) = \mathbb{P}(A^c)\,\mathbb{P}(B^c).$$
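As a quick numerical sanity check of the proposition (not a substitute for the proof), one can verify the three complement pairs on a small uniform space; the specific events used below are arbitrary illustrative choices.

```python
from fractions import Fraction

omega = set(range(1, 5))                     # uniform space, P({w}) = 1/4
A, B = {1, 2}, {1, 3}                        # independent: P(A ∩ B) = 1/4 = P(A)·P(B)
Ac, Bc = omega - A, omega - B                # complements of A and B

def prob(event):
    return Fraction(len(event), len(omega))

for X, Y in [(Ac, B), (A, Bc), (Ac, Bc)]:
    assert prob(X & Y) == prob(X) * prob(Y)  # each pair satisfies the product rule
print("All three complement pairs are independent.")
```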