The expectation represents the central value of a random variable and has a measure-theoretic counterpart as the Lebesgue-Stieltjes integral of $X$ with respect to a (probability) measure $P$. This kind of integration is defined in steps: first the integral of simple functions is defined, and then the definition is extended to more general random variables. In general, let's consider a probability space $(\Omega, \mathcal{F}, P)$ and a random variable $X : \Omega \to \mathbb{R}$ such that $E|X| < \infty$. Then, the expectation of $X$ is denoted as
$$E[X] = \int_\Omega X \, dP = \int_\Omega X(\omega) \, P(d\omega),$$
the Lebesgue-Stieltjes integral of $X$ with respect to the (probability) measure $P$.
Simple functions
In general, a random variable is simple if it has a finite range. Let's consider a probability space $(\Omega, \mathcal{F}, P)$ and consider an $\mathcal{F}$-measurable simple function $X$, i.e.
$$X(\omega) = \sum_{i=1}^{n} a_i \mathbf{1}_{A_i}(\omega),$$
where $a_i \in \mathbb{R}$ and $A_1, \dots, A_n \in \mathcal{F}$ are a disjoint partition of the sample space, i.e. $\bigcup_{i=1}^{n} A_i = \Omega$ and $A_i \cap A_j = \emptyset$ for $i \neq j$. Let's denote the set of all simple functions on $(\Omega, \mathcal{F})$ as $\mathcal{E}$. In this setting, $\mathcal{E}$ is a vector space closed under products, which implies that the following properties hold.
Constant: given a simple function $X \in \mathcal{E}$ and $\alpha \in \mathbb{R}$, then $\alpha X \in \mathcal{E}$. In fact,
$$\alpha X = \sum_{i=1}^{n} (\alpha a_i) \mathbf{1}_{A_i},$$
where $\alpha a_i \in \mathbb{R}$.
Linearity: given two simple functions $X = \sum_{i=1}^{n} a_i \mathbf{1}_{A_i}$ and $Y = \sum_{j=1}^{m} b_j \mathbf{1}_{B_j}$ in $\mathcal{E}$, then $X + Y \in \mathcal{E}$. In fact,
$$X + Y = \sum_{i=1}^{n} \sum_{j=1}^{m} (a_i + b_j) \mathbf{1}_{A_i \cap B_j},$$
where the sequence of sets $\{A_i \cap B_j\}_{i,j}$ forms a disjoint partition of $\Omega$.
Product: given two simple functions $X, Y \in \mathcal{E}$, then $XY \in \mathcal{E}$. In fact,
$$XY = \sum_{i=1}^{n} \sum_{j=1}^{m} a_i b_j \mathbf{1}_{A_i \cap B_j}.$$
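The closure properties above can be sketched in code. A minimal illustration, assuming a finite sample space with hypothetical outcomes `w1`–`w3` (none of these names or values come from the notes): a simple function is stored as a map from outcomes to values, so scaling, sums, and products act pointwise, which corresponds to the values $\alpha a_i$, $a_i + b_j$, and $a_i b_j$ on the refined partition $A_i \cap B_j$.

```python
# Sketch: simple functions on a finite sample space, stored as dicts
# mapping each outcome to the value of the function on that outcome.
def scale(alpha, X):
    """alpha * X is again simple: values alpha * a_i on the same partition."""
    return {w: alpha * v for w, v in X.items()}

def add(X, Y):
    """X + Y is simple: value a_i + b_j on each cell A_i ∩ B_j."""
    return {w: X[w] + Y[w] for w in X}

def mul(X, Y):
    """X * Y is simple: value a_i * b_j on each cell A_i ∩ B_j."""
    return {w: X[w] * Y[w] for w in X}

X = {"w1": 1.0, "w2": 1.0, "w3": 2.0}  # partition A1={w1,w2}, A2={w3}
Y = {"w1": 3.0, "w2": 4.0, "w3": 4.0}  # partition B1={w1}, B2={w2,w3}
print(add(X, Y))  # {'w1': 4.0, 'w2': 5.0, 'w3': 6.0}
print(mul(X, Y))  # {'w1': 3.0, 'w2': 4.0, 'w3': 8.0}
```

Both results have finite range, matching the claim that $\mathcal{E}$ is closed under these operations.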
Measurability
Simple functions are the building blocks in the definition of the expectation in terms of the Lebesgue-Stieltjes integral. In fact, a known result, the Measurability theorem, shows that any non-negative measurable function can be approximated by a non-decreasing sequence of simple functions.
Theorem 5.1 Suppose that $X(\omega) \geq 0$ for all $\omega \in \Omega$. Then, $X$ is $\mathcal{F}$-measurable if and only if there exist simple functions $X_n \in \mathcal{E}$ such that $0 \leq X_n \uparrow X$, i.e. $X_n(\omega)$ is non-decreasing in $n$ and $X_n(\omega) \to X(\omega)$ for every $\omega \in \Omega$.
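The approximating sequence in Theorem 5.1 can be made concrete with the standard dyadic construction $X_n = \min(\lfloor 2^n X \rfloor / 2^n, n)$, which has finite range (hence is simple) and increases to $X$. The construction below is the usual textbook one, shown for a single non-negative value rather than quoted from the notes:

```python
import math

def dyadic_approx(x, n):
    """n-th dyadic simple-function approximation of a non-negative value x:
    X_n = min(floor(2^n x) / 2^n, n). Finite range, and X_n increases to x."""
    return min(math.floor((2 ** n) * x) / (2 ** n), n)

x = math.pi  # an arbitrary non-negative value X(omega)
approx = [dyadic_approx(x, n) for n in range(1, 11)]
print(approx)  # non-decreasing sequence converging to pi from below
```

At level $n$ the function takes at most $n \cdot 2^n + 1$ distinct values, so each $X_n$ is indeed simple.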
Expectation of Simple Functions
The expectation of a simple function $X \in \mathcal{E}$ is defined as
$$E[X] = \sum_{i=1}^{n} a_i P(A_i),$$
where $X = \sum_{i=1}^{n} a_i \mathbf{1}_{A_i}$.
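A minimal numerical sketch of this definition (the values and probabilities below are illustrative, not from the notes): a simple function is given as the list of pairs $(a_i, P(A_i))$ over its disjoint partition, and the expectation is the weighted sum.

```python
# E[X] = sum_i a_i P(A_i) for a simple function given as
# (value, probability-of-cell) pairs over a disjoint partition of Omega.
def expectation(simple):
    """simple: list of (a_i, P(A_i)) pairs; cell probabilities sum to 1."""
    assert abs(sum(p for _, p in simple) - 1.0) < 1e-9
    return sum(a * p for a, p in simple)

# X takes value 1 on A1 (prob 0.2), 3 on A2 (prob 0.5), 5 on A3 (prob 0.3)
X = [(1.0, 0.2), (3.0, 0.5), (5.0, 0.3)]
print(expectation(X))  # = 1*0.2 + 3*0.5 + 5*0.3
```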
Properties
- Non-negativity: if $X \in \mathcal{E}$ and $X \geq 0$, then $E[X] \geq 0$.
Proof. By the definition of the expectation of simple functions, $E[X] = \sum_{i=1}^{n} a_i P(A_i) \geq 0$, since $X \geq 0$ implies $a_i \geq 0$ for every $i$, and $P(A_i) \geq 0$.
- Linearity: the expectation of simple functions is linear, i.e. for $X, Y \in \mathcal{E}$ and $\alpha, \beta \in \mathbb{R}$,
$$E[\alpha X + \beta Y] = \alpha E[X] + \beta E[Y].$$
Proof. Let's consider two simple functions, i.e. $X = \sum_{i=1}^{n} a_i \mathbf{1}_{A_i}$ and $Y = \sum_{j=1}^{m} b_j \mathbf{1}_{B_j}$, and let's fix $\alpha, \beta \in \mathbb{R}$. Then, by the second property of the vector space (Equation 5.3) it is possible to write
$$\alpha X + \beta Y = \sum_{i=1}^{n} \sum_{j=1}^{m} (\alpha a_i + \beta b_j) \mathbf{1}_{A_i \cap B_j}.$$
Then, taking the expectation on both sides,
$$E[\alpha X + \beta Y] = \sum_{i=1}^{n} \sum_{j=1}^{m} (\alpha a_i + \beta b_j) P(A_i \cap B_j).$$
Fixing $i$, the sequence $\{A_i \cap B_j\}_{j=1}^{m}$ is composed of disjoint events, since the $B_j$ are disjoint by definition. Hence, applying $\sigma$-additivity together with $\bigcup_{j=1}^{m} B_j = \Omega$ (and symmetrically for fixed $j$), it is possible to write
$$\sum_{j=1}^{m} P(A_i \cap B_j) = P(A_i), \qquad \sum_{i=1}^{n} P(A_i \cap B_j) = P(B_j).$$
Therefore, the expectation simplifies to
$$E[\alpha X + \beta Y] = \alpha \sum_{i=1}^{n} a_i P(A_i) + \beta \sum_{j=1}^{m} b_j P(B_j) = \alpha E[X] + \beta E[Y].$$
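The linearity property can be checked numerically on a small finite probability space (the outcomes, weights, and coefficients below are made up for illustration):

```python
# Numerical check of E[aX + bY] = a E[X] + b E[Y] for simple functions
# on a finite probability space.
P = {"w1": 0.1, "w2": 0.4, "w3": 0.5}  # probability of each outcome
X = {"w1": 1.0, "w2": 1.0, "w3": 2.0}  # simple: range {1, 2}
Y = {"w1": 3.0, "w2": 4.0, "w3": 4.0}  # simple: range {3, 4}

def E(Z):
    """Expectation of a simple function Z under the weights P."""
    return sum(Z[w] * P[w] for w in P)

alpha, beta = 2.0, -1.0
lhs = E({w: alpha * X[w] + beta * Y[w] for w in P})
rhs = alpha * E(X) + beta * E(Y)
print(lhs, rhs)  # equal up to floating-point error
```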
Review of inequalities
Modulus inequality
Definition 5.1 (Modulus inequality)
Let's consider a random variable $X \in L^1$, where $L^1$ stands for the set of integrable random variables, i.e.
$$L^1 = \{ X : E|X| < \infty \}.$$
Then, the modulus inequality states that
$$|E[X]| \leq E|X|.$$
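As a quick sanity check (a Monte Carlo sketch, with an arbitrary Gaussian sample; note the inequality holds exactly for the empirical distribution, since a sample average is itself an expectation):

```python
import random

random.seed(0)
# Check |E[X]| <= E|X| on a sample: |mean of xs| vs mean of |xs|.
xs = [random.gauss(-0.5, 2.0) for _ in range(100_000)]
mean = sum(xs) / len(xs)
mean_abs = sum(abs(x) for x in xs) / len(xs)
print(abs(mean), mean_abs)  # the first is never larger than the second
```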
Markov inequality
Definition 5.2 (Markov inequality)
Let's consider a random variable $X \in L^1$ and fix a $\lambda > 0$. Then, by the Markov inequality,
$$P(|X| \geq \lambda) \leq \frac{E|X|}{\lambda}.$$
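An empirical illustration (the exponential law is an arbitrary non-negative example, not prescribed by the notes). Applied to the empirical distribution of the sample, the bound holds exactly, not just approximately:

```python
import random

random.seed(1)
# Empirical check of Markov's inequality: P(X >= lam) <= E[X] / lam
# for a non-negative random variable.
xs = [random.expovariate(1.0) for _ in range(100_000)]  # X >= 0, E[X] = 1
mean = sum(xs) / len(xs)
results = []
for lam in (0.5, 1.0, 2.0, 4.0):
    freq = sum(x >= lam for x in xs) / len(xs)  # empirical P(X >= lam)
    results.append((freq, mean / lam))
    print(f"lam={lam}: {freq:.4f} <= {mean / lam:.4f}")
```

The bound is loose for small $\lambda$ (it can exceed 1) and tightens as $\lambda$ grows.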
Chebyshev inequality
Definition 5.3 (Chebyshev inequality)
Consider a random variable $X$ with first and second moments finite, i.e.
$$E|X| < \infty, \qquad E[X^2] < \infty.$$
Then, for any $\lambda > 0$, by the Chebyshev inequality,
$$P(|X - E[X]| \geq \lambda) \leq \frac{\mathrm{Var}(X)}{\lambda^2}.$$
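A companion empirical check (standard normal sample, chosen only for illustration); as with Markov, the bound holds exactly for the empirical distribution:

```python
import random

random.seed(2)
# Empirical check of Chebyshev: P(|X - E X| >= lam) <= Var(X) / lam^2.
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]
m = sum(xs) / len(xs)
var = sum((x - m) ** 2 for x in xs) / len(xs)
checks = []
for lam in (1.0, 2.0, 3.0):
    freq = sum(abs(x - m) >= lam for x in xs) / len(xs)
    checks.append((freq, var / lam ** 2))
    print(f"lam={lam}: {freq:.4f} <= {var / lam ** 2:.4f}")
```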
Hölder inequality
Definition 5.4 (Hölder inequality)
Let's consider two numbers $p > 1$ and $q > 1$ such that $\frac{1}{p} + \frac{1}{q} = 1$, and let's consider two random variables $X$ and $Y$ such that
$$E|X|^p < \infty, \qquad E|Y|^q < \infty.$$
Then,
$$E|XY| \leq \left( E|X|^p \right)^{1/p} \left( E|Y|^q \right)^{1/q}.$$
In terms of norms: $\|XY\|_1 \leq \|X\|_p \|Y\|_q$.
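A numerical sketch with conjugate exponents $p = 3$, $q = 3/2$ (so $1/p + 1/q = 1$) and arbitrary Gaussian samples; the inequality applied to the empirical measure holds exactly:

```python
import random

random.seed(3)
# Empirical check of Hölder: E|XY| <= (E|X|^p)^(1/p) (E|Y|^q)^(1/q).
p, q = 3.0, 1.5  # conjugate exponents: 1/3 + 2/3 = 1
n = 50_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ys = [random.gauss(0.0, 2.0) for _ in range(n)]
lhs = sum(abs(x * y) for x, y in zip(xs, ys)) / n
rhs = ((sum(abs(x) ** p for x in xs) / n) ** (1 / p)
       * (sum(abs(y) ** q for y in ys) / n) ** (1 / q))
print(lhs, rhs)  # lhs never exceeds rhs
```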
Schwarz inequality
Definition 5.5 (Schwarz inequality)
Consider two random variables $X, Y \in L^2$, i.e. with first and second moments finite, i.e.
$$E[X^2] < \infty, \qquad E[Y^2] < \infty.$$
Then,
$$E|XY| \leq \sqrt{E[X^2] \, E[Y^2]}.$$
In terms of norms: $\|XY\|_1 \leq \|X\|_2 \|Y\|_2$. Note that this is a special case of the Hölder inequality (Equation 5.6) with $p = q = 2$.
Minkowski inequality
Definition 5.6 (Minkowski inequality)
For $p \geq 1$, let's consider two random variables $X, Y \in L^p$. Then, $X + Y \in L^p$ and
$$\left( E|X + Y|^p \right)^{1/p} \leq \left( E|X|^p \right)^{1/p} + \left( E|Y|^p \right)^{1/p},$$
i.e. $\|X + Y\|_p \leq \|X\|_p + \|Y\|_p$.
Note that the triangular inequality is a special case of the Minkowski inequality with $p = 1$, i.e.
$$E|X + Y| \leq E|X| + E|Y|.$$
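A numerical sketch with $p = 3$ and two arbitrary independent samples (the distributions are illustrative); for the empirical measure the triangle inequality for the $p$-norm holds exactly:

```python
import random

random.seed(4)
# Empirical check of Minkowski: ||X + Y||_p <= ||X||_p + ||Y||_p, p >= 1.
p = 3.0
n = 50_000
xs = [random.uniform(-1.0, 1.0) for _ in range(n)]
ys = [random.gauss(0.0, 1.0) for _ in range(n)]

def p_norm(zs):
    """Empirical L^p norm: (E|Z|^p)^(1/p) under the sample measure."""
    return (sum(abs(z) ** p for z in zs) / n) ** (1 / p)

lhs = p_norm([x + y for x, y in zip(xs, ys)])
rhs = p_norm(xs) + p_norm(ys)
print(lhs, rhs)  # lhs never exceeds rhs
```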
Jensen inequality
Definition 5.7 (Jensen inequality)
Let's consider a convex function $\phi : \mathbb{R} \to \mathbb{R}$. Suppose that $E|X| < \infty$ and $E|\phi(X)| < \infty$. Then,
$$\phi(E[X]) \leq E[\phi(X)].$$
If $\phi$ is concave, the result reverses, i.e.
$$\phi(E[X]) \geq E[\phi(X)].$$
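A numerical sketch with the convex choice $\phi = \exp$ (an arbitrary example of a convex function); since convexity makes Jensen hold for any probability measure, it holds exactly for the empirical one:

```python
import math
import random

random.seed(5)
# Empirical check of Jensen: phi(E[X]) <= E[phi(X)] for convex phi = exp.
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]
m = sum(xs) / len(xs)
lhs = math.exp(m)                              # phi(E[X])
rhs = sum(math.exp(x) for x in xs) / len(xs)   # E[phi(X)]
print(lhs, rhs)  # lhs never exceeds rhs
# A concave phi, e.g. log on positive values, would reverse the inequality.
```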