4 Independence
Reference: Chapter 4 of Resnick (2005).
Definition 4.1 (\(\color{magenta}{\textbf{Independent events}}\))
Let’s consider a probability space \((\Omega, \mathcal{B}, \mathbb{P})\) and two events \(A, B \in \mathcal{B}\). The events \(A\) and \(B\) are said to be independent if \[
A \perp B \iff
\mathbb{P}(A {\color{blue}{\cap}} B) = \mathbb{P}(A) \mathbb{P}(B)
\text{.}
\tag{4.1}\] Similarly, a finite sequence of events \(A_1, A_2, \dots , A_n \in \mathcal{B}\) is said to be independent if, for all \(2 \le j \le n\) and \(1 \le k_1 < k_2 < \dots < k_j \le n\), the following factorization holds \[
A_{k_1} \perp A_{k_2} \perp \dots \perp A_{k_j} \iff
\mathbb{P}(A_{k_1} {\color{blue}{\cap}} A_{k_2} {\color{blue}{\cap}} \dots {\color{blue}{\cap}} A_{k_j}) = \prod_{i = 1}^{j} \mathbb{P}(A_{k_i})
\text{.}
\]
Example 4.1 Consider the probability space \((\Omega, \mathcal{P}(\Omega), \mathbb{P})\), where \(\Omega = \{1,2,3,4,5,6\}\) and each outcome \(\omega_i \in \Omega\) has constant probability \(\mathbb{P}(\omega_i) = \frac{1}{6} \; \; \forall i\). Consider the events \(A_1 = \{1,2,3,4\}\) and \(A_2 = A_3 = \{4,5,6\}\): are these events independent? Note that the events have probabilities \(\mathbb{P}(A_1) = \frac{2}{3}\) and \(\mathbb{P}(A_2) = \mathbb{P}(A_3) =\frac{1}{2}\).
Consider first all three events \(A_1, A_2, A_3\). Their intersection is \([A_1 {\color{blue}{\cap}} A_2 {\color{blue}{\cap}} A_3] = \{4\}\), which has probability \(\mathbb{P}(\{4\}) = \frac{1}{6}\). Comparing with the product of the probabilities of the single events, \[ \frac{1}{6}= \mathbb{P}([A_1 {\color{blue}{\cap}} A_2 {\color{blue}{\cap}} A_3]) = \mathbb{P}(A_1) \mathbb{P}(A_2) \mathbb{P}(A_3) = \frac{2}{3} \frac{1}{2} \frac{1}{2} = \frac{1}{6} \text{,} \] we see that the factorization holds for the triple.
Consider now only the pair \(A_2, A_3\). The probability of the joint set, namely \([ A_2 {\color{blue}{\cap}} A_3] = \{4, 5, 6\}\), is \(\mathbb{P}(\{4, 5, 6\}) = \frac{1}{2}\). However, the product of the probabilities of the single events gives a different result: \[ \frac{1}{2}= \mathbb{P}([A_2 {\color{blue}{\cap}} A_3]) \neq \mathbb{P}(A_2) \mathbb{P}(A_3) = \frac{1}{2} \frac{1}{2} = \frac{1}{4} \text{.} \] Thus, the events \(A_2\) and \(A_3\) are NOT independent, and therefore the family \(A_1, A_2, A_3\) is not independent either: Definition 4.1 requires the factorization to hold for every subset of indices, not only for all of them at once.
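The computations in Example 4.1 can be verified mechanically. The sketch below (names like `factorizes` are illustrative, not from the source) enumerates the fair-die sample space and checks the factorization for both the triple and the pair, using exact rational arithmetic:

```python
from fractions import Fraction

# Fair-die sample space: each of the 6 outcomes has probability 1/6,
# so P(E) = |E| / 6 for any event E contained in omega.
omega = {1, 2, 3, 4, 5, 6}

def prob(event):
    return Fraction(len(event), len(omega))

A1 = {1, 2, 3, 4}
A2 = {4, 5, 6}
A3 = {4, 5, 6}

def factorizes(events):
    """Check whether P(intersection of the A_k) equals the product of the P(A_k)."""
    inter = set(omega)
    product = Fraction(1)
    for A in events:
        inter &= A
        product *= prob(A)
    return prob(inter) == product

print(factorizes([A1, A2, A3]))  # True: the triple factorizes
print(factorizes([A2, A3]))      # False: the pair does not
```

A full independence check would run `factorizes` over every subset of the events of size at least two, exactly as Definition 4.1 prescribes.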
Proposition 4.1 (\(\color{magenta}{\textbf{Independence and complementation}}\))
If two events \(A\) and \(B\) are independent, then also \(A\) and \(B^{\mathsf{c}}\), \(B\) and \(A^{\mathsf{c}}\), \(A^{\mathsf{c}}\) and \(B^{\mathsf{c}}\) are independent.
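The first claim can be verified in one line using additivity of \(\mathbb{P}\); the remaining pairs follow analogously: \[
\mathbb{P}(A {\color{blue}{\cap}} B^{\mathsf{c}}) = \mathbb{P}(A) - \mathbb{P}(A {\color{blue}{\cap}} B) = \mathbb{P}(A) - \mathbb{P}(A)\mathbb{P}(B) = \mathbb{P}(A)\left(1 - \mathbb{P}(B)\right) = \mathbb{P}(A)\,\mathbb{P}(B^{\mathsf{c}})
\text{.}
\]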
Definition 4.2 (\(\color{magenta}{\textbf{Independent classes}}\))
Let’s consider a probability space \((\Omega, \mathcal{B}, \mathbb{P})\) and let \(\mathcal{C}_i \subseteq \mathcal{B}\), \(i = 1, \dots, n\), be classes of events. The classes \(\mathcal{C}_1, \dots, \mathcal{C}_n\) are said to be independent if, for any choice of \(A_1, \dots, A_n\) with \(A_i \in \mathcal{C}_i\) and \(i = 1, \dots, n\), the events \(A_1, \dots, A_n\) are independent according to Definition 4.1.
4.1 Independent random variables
Theorem 4.1 (\(\color{magenta}{\textbf{Basic Criterion}}\))
Let’s consider a probability space \((\Omega, \mathcal{B}, \mathbb{P})\) and let \(\mathcal{C}_i\) with \(i = 1, \dots, n\) be non-empty classes of events such that
- \(\mathcal{C}_i\) is a \(\pi\)-system (see here);
- \(\mathcal{C}_i\) with \(i = 1, \dots, n\) are independent according to Definition 4.2.
Then, the \(\sigma\)-fields \(\sigma(\mathcal{C}_1), \dots, \sigma(\mathcal{C}_n)\) are independent.
Definition 4.3 (\(\color{magenta}{\textbf{Independent random variables}}\))
Let’s consider a probability space \((\Omega, \mathcal{B}, \mathbb{P})\) and let \(\{X_t\}_{t \in \mathcal{T}}\) be a family of random variables. The family is said to be independent if the corresponding family of \(\sigma\)-fields \[
\{\sigma(X_t)\}_{t \in \mathcal{T}}
\] is independent according to Definition 4.2.
The result in Theorem 4.1 is very important for proving independence of two or more random variables. In fact, according to Definition 4.3, to prove independence we should verify that all the \(\sigma\)-fields generated by the \(X_t\) are independent. If one is able to prove that there exist independent classes of events \(\mathcal{C}_t\), each of which generates \(\sigma(X_t)\) and is a \(\pi\)-system, then one can immediately conclude that the random variables are independent.
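A standard instance of this strategy (sketched here; see Resnick (2005) for the full factorization criterion): for real random variables \(X_1, \dots, X_n\), the class \(\mathcal{C}_t = \{[X_t \le x] : x \in \mathbb{R}\}\) is a \(\pi\)-system, since \([X_t \le x] {\color{blue}{\cap}} [X_t \le y] = [X_t \le \min(x,y)]\), and it generates \(\sigma(X_t)\). Hence, by Theorem 4.1, the random variables are independent as soon as the joint distribution function factorizes: \[
\mathbb{P}(X_1 \le x_1, \dots, X_n \le x_n) = \prod_{i = 1}^{n} \mathbb{P}(X_i \le x_i) \quad \forall \, x_1, \dots, x_n \in \mathbb{R}
\text{.}
\]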
4.2 Independence theorems
Theorem 4.2 (\(\color{magenta}{\textbf{Borel-Cantelli Lemma}}\))
Let \(\{A_n\}\) be any sequence of events. If \[
\sum_{n} \mathbb{P}(A_n) < \infty
\text{,}
\] then \[
\mathbb{P}(A_n, \text{ i.o.}) = \mathbb{P}\left(\limsup_{n\to\infty} A_n\right) = 0
\text{.}
\]
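A minimal simulation sketch of the convergent case (the setup and parameters here are illustrative, not from the source): take independent events with \(\mathbb{P}(A_n) = 1/n^2\), whose probabilities sum to \(\pi^2/6 < \infty\). Borel-Cantelli then predicts that, almost surely, only finitely many \(A_n\) occur, and indeed a sample path shows just a handful of occurrences, concentrated at small \(n\):

```python
import random

random.seed(0)  # reproducible sample path

# Simulate one path of independent events A_n with P(A_n) = 1/n^2.
# The series sum of 1/n^2 converges, so by Borel-Cantelli only
# finitely many A_n should occur along the path.
N = 100_000
occurrences = [n for n in range(1, N + 1) if random.random() < 1.0 / n**2]
print(len(occurrences), occurrences[:5])
```

Note that \(A_1\) has probability \(1\), so index \(1\) always appears; the expected total number of occurrences is \(\sum_n 1/n^2 \approx 1.64\).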
Proposition 4.2 (\(\color{magenta}{\textbf{Borel-Zero-One Law}}\))
Let \(\{A_n\}\) be a sequence of independent events, then \[
\mathbb{P}(A_n, \text{ i.o.}) =
\begin{cases}
0 & \text{iff} \; \sum_{n} \mathbb{P}(A_n) < \infty \\
1 & \text{iff} \; \sum_{n} \mathbb{P}(A_n) = \infty
\end{cases}
\text{.}
\]
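The divergent case can be sketched by simulation as well (again with illustrative parameters): take independent events with \(\mathbb{P}(A_n) = 1/n\). The harmonic series diverges, so the Borel Zero-One Law gives \(\mathbb{P}(A_n, \text{ i.o.}) = 1\), and occurrences keep accumulating as the horizon \(N\) grows, roughly like the partial harmonic sum \(\log N\):

```python
import random

# Simulate independent events A_n with P(A_n) = 1/n over growing
# horizons N. Since the harmonic series diverges, occurrences
# keep accumulating: the count grows roughly like log N.
counts = []
for N in (10**3, 10**4, 10**5):
    random.seed(N)  # fresh, reproducible stream for each horizon
    count = sum(1 for n in range(1, N + 1) if random.random() < 1.0 / n)
    counts.append(count)
    print(N, count)
```

Contrast this with the convergent case above: there the occurrences stop, here they never do, matching the dichotomy of the zero-one law.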