Definition 7.1 (Characteristic function)
Consider an $n$-dimensional random vector $X$; then the characteristic function $\varphi_X \colon \mathbb{R}^n \to \mathbb{C}$ is defined as:

$$\varphi_X(t) = \mathbb{E}\left[e^{i t^\top X}\right], \quad t \in \mathbb{R}^n.$$
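As a minimal numerical sketch (assuming $X \sim \mathcal{N}(0, 1)$, whose characteristic function is known in closed form as $e^{-t^2/2}$; the sample size and grid of $t$ values are arbitrary choices), we can estimate $\mathbb{E}[e^{itX}]$ by Monte Carlo and compare it with the exact expression:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)      # samples of X ~ N(0, 1)

t = np.linspace(-3.0, 3.0, 7)
# Empirical characteristic function: average of e^{i t x_j} over the sample.
phi_hat = np.exp(1j * np.outer(t, x)).mean(axis=1)
phi_exact = np.exp(-t**2 / 2)         # known CF of the standard normal

print(np.max(np.abs(phi_hat - phi_exact)))   # small Monte Carlo error
```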
The characteristic function always exists when treated as a function of a real-valued argument, unlike the moment-generating function. The characteristic function uniquely determines the probability distribution of the corresponding random vector $X$. More precisely, saying that two random vectors have the same distribution is equivalent to saying that their characteristic functions are equal. It follows that we can always work with characteristic functions to prove that the distributions of two random vectors are equal, or that a distribution converges to another distribution. Formally, $X$ and $Y$ have the same distribution, i.e. $X \stackrel{d}{=} Y$, if and only if

$$\varphi_X(t) = \varphi_Y(t) \quad \forall t \in \mathbb{R}^n.$$
Here, we list some properties considering the scalar random variable case, i.e. $n = 1$, with $t \in \mathbb{R}$.
Independence: $X$ and $Y$ are independent iff the joint characteristic function factorizes:

$$\varphi_{X,Y}(s, t) = \mathbb{E}\left[e^{i(sX + tY)}\right] = \varphi_X(s)\,\varphi_Y(t) \quad \forall s, t \in \mathbb{R}.$$

In particular, if $X$ and $Y$ are independent, then $\varphi_{X+Y}(t) = \varphi_X(t)\,\varphi_Y(t)$.
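A minimal sketch of the factorization (assuming $X$ and $Y$ are independent uniforms on $(-1, 1)$ and fixing one pair $(s, t)$; both are arbitrary choices, and `ecf` is a helper of our own):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 200_000)
y = rng.uniform(-1, 1, 200_000)

def ecf(samples, t):
    """Empirical characteristic function at a single point t."""
    return np.exp(1j * t * samples).mean()

s, t = 0.8, 1.7
phi_joint = np.exp(1j * (s * x + t * y)).mean()   # empirical phi_{X,Y}(s, t)
print(phi_joint, ecf(x, s) * ecf(y, t))           # the two should agree
```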
Existence of the $k$-th moment: If the $k$-th moment of the random variable $X$ is finite, then the characteristic function is $k$-times differentiable with a $k$-th derivative that is continuous at $0$, i.e. $\varphi_X \in C^k$. Formally,

$$\mathbb{E}\left[|X|^k\right] < \infty \implies \varphi_X^{(k)}(0) = i^k\,\mathbb{E}\left[X^k\right].$$

Note that, if $k$ is even, the implication becomes an if and only if: $\varphi_X^{(k)}(0)$ exists finite if and only if $\mathbb{E}[|X|^k] < \infty$.
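A minimal symbolic sketch (assuming $X \sim \text{Exp}(1)$, whose characteristic function is $\varphi_X(t) = (1 - it)^{-1}$ and whose moments are $\mathbb{E}[X^k] = k!$): dividing the $k$-th derivative at zero by $i^k$ should recover $k!$:

```python
import sympy as sp

t = sp.symbols('t', real=True)
phi = 1 / (1 - sp.I * t)              # CF of the Exponential(1) distribution

for k in range(1, 5):
    # phi^{(k)}(0) = i^k E[X^k], hence E[X^k] = phi^{(k)}(0) / i^k.
    moment = sp.simplify(sp.diff(phi, t, k).subs(t, 0) / sp.I**k)
    print(k, moment)                   # prints k and k! = E[X^k]
```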
Inversion theorem: The characteristic function uniquely determines the probability distribution of the corresponding random variable $X$. If, in addition, $\varphi_X$ is integrable, i.e. $\int_{-\infty}^{+\infty} |\varphi_X(t)|\,dt < \infty$, then the density function is obtained as:

$$f_X(x) = \frac{1}{2\pi} \int_{-\infty}^{+\infty} e^{-itx}\,\varphi_X(t)\,dt.$$
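A minimal numerical sketch of the inversion formula (again assuming the standard normal, whose characteristic function $e^{-t^2/2}$ is integrable; `density_from_cf` is a name of our own): the recovered values should match the Gaussian density.

```python
import numpy as np
from scipy.integrate import quad

def density_from_cf(x, cf=lambda t: np.exp(-t**2 / 2)):
    # Only the real part contributes: imaginary parts cancel for a real density.
    integrand = lambda t: np.real(np.exp(-1j * t * x) * cf(t))
    val, _ = quad(integrand, -np.inf, np.inf)
    return val / (2 * np.pi)

for x in (0.0, 1.0, 2.0):
    exact = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # N(0, 1) density
    print(x, density_from_cf(x), exact)
```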
Convergence in distribution (Lévy's continuity theorem): $X_n \xrightarrow{d} X$ if and only if $\varphi_{X_n}(t) \to \varphi_X(t)$ for every $t \in \mathbb{R}$.
Scaling and centering: Given $Y = aX + b$ with $a, b \in \mathbb{R}$, the effect of scaling and centering on the characteristic function is such that:

$$\varphi_Y(t) = \varphi_{aX + b}(t) = e^{itb}\,\varphi_X(at).$$
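A minimal empirical sketch (assuming $X \sim \mathcal{N}(0, 1)$ and the arbitrary choices $a = 2$, $b = 3$, $t = 0.9$):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(200_000)
a, b, t = 2.0, 3.0, 0.9

lhs = np.exp(1j * t * (a * x + b)).mean()                   # phi_{aX+b}(t)
rhs = np.exp(1j * t * b) * np.exp(1j * (a * t) * x).mean()  # e^{itb} phi_X(at)
print(lhs, rhs)                                             # the two should agree
```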
Weak Law of Large Numbers: consider a sequence of IID random variables $X_1, X_2, \ldots$, such that the first derivative of the characteristic function exists at zero, namely $\varphi_X'(0)$, and the first moment of $X$ is finite, i.e. $\mathbb{E}[X] = \mu < \infty$; then the sample mean $\bar{X}_n = \frac{1}{n}\sum_{i=1}^n X_i$ converges in probability to a degenerate random variable:

$$\bar{X}_n \xrightarrow{P} \mu \quad \text{for some } \mu \in \mathbb{R}.$$
Proof. To prove the above statement, let's compute the characteristic function of $\bar{X}_n$:

$$\varphi_{\bar{X}_n}(t) = \mathbb{E}\left[e^{it\bar{X}_n}\right] = \mathbb{E}\left[e^{i\frac{t}{n}\sum_{j=1}^n X_j}\right] = \prod_{j=1}^n \varphi_{X_j}\!\left(\frac{t}{n}\right) = \left[\varphi_X\!\left(\frac{t}{n}\right)\right]^n,$$

where the product form uses independence and the last equality uses the identical distribution of the $X_j$. Let's now apply the Taylor series of a function around the point $0$ (Equation 30.1) to expand the function $\varphi_X$ around zero up to the first-order term (using $\varphi_X(0) = 1$ and $\varphi_X'(0) = i\mu$), i.e.

$$\varphi_X\!\left(\frac{t}{n}\right) = 1 + i\mu\frac{t}{n} + o\!\left(\frac{1}{n}\right).$$

The convergence going to the limit as $n \to \infty$ follows from the fact that, in general, if $c_n \to c$, then the following limit holds:

$$\lim_{n \to \infty} \left(1 + \frac{c_n}{n}\right)^n = e^{c}.$$

Therefore, since $\left[\varphi_X\!\left(\frac{t}{n}\right)\right]^n = \left(1 + \frac{i\mu t + o(1)}{n}\right)^n$ for some sequence $o(1) \to 0$, it follows that:

$$\varphi_{\bar{X}_n}(t) \xrightarrow{n \to \infty} e^{i\mu t} \quad \forall t \in \mathbb{R}.$$

Hence, since $e^{i\mu t}$ is the characteristic function of a degenerate random variable, namely a random variable that is equal to the constant $\mu$ almost surely, it is possible to conclude that the sample mean converges in distribution to the degenerate random variable $\mu$. Moreover, in this specific case in which the limit is a degenerate random variable, it can be proved that convergence in distribution implies also convergence in probability, something that in general is not true. ∎
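A minimal sketch of the limit in the proof (assuming $X \sim \text{Exp}(1)$, so that $\mu = 1$ and $\varphi_X(t) = (1 - it)^{-1}$): the characteristic function of the sample mean, $[\varphi_X(t/n)]^n$, approaches $e^{i\mu t}$ as $n$ grows.

```python
import numpy as np

mu = 1.0
phi = lambda t: 1 / (1 - 1j * t)       # CF of Exponential(1), so E[X] = 1
t = 2.0

for n in (10, 100, 10_000):
    print(n, phi(t / n) ** n)           # CF of the sample mean at t
print('limit', np.exp(1j * mu * t))     # CF of the degenerate variable mu
```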
Moment generating function
Definition 7.2 (Moment generating function)
Consider a one-dimensional random variable $X$; then the moment generating function is defined as:

$$M_X(t) = \mathbb{E}\left[e^{tX}\right], \quad t \in \mathbb{R}.$$
Proposition 7.1 (The moment generating function determines the distribution)
Consider a random variable $X$ such that its moment generating function exists and is finite around zero, i.e.

$$\exists\, \epsilon > 0 : M_X(t) < \infty \quad \forall t \in (-\epsilon, \epsilon).$$

Then this implies that the moments $\mathbb{E}[X^k]$ are finite for all $k \in \mathbb{N}$ and that the sequence of moments uniquely determines the distribution of $X$. According to this result, if we consider another random variable $Y$ such that $\mathbb{E}[X^k] = \mathbb{E}[Y^k]$ for all $k \in \mathbb{N}$, then the distribution of $X$ and $Y$ is the same, i.e. $X \stackrel{d}{=} Y$.
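A minimal symbolic sketch (assuming $X \sim \mathcal{N}(0, 1)$, whose moment generating function $M_X(t) = e^{t^2/2}$ is finite for every real $t$): successive derivatives of the MGF at zero give the raw moments $\mathbb{E}[X^k]$, here $0, 1, 0, 3, 0, 15$.

```python
import sympy as sp

t = sp.symbols('t', real=True)
M = sp.exp(t**2 / 2)                    # MGF of the standard normal

for k in range(1, 7):
    # M^{(k)}(0) = E[X^k]: for N(0, 1) this gives 0, 1, 0, 3, 0, 15.
    print(k, sp.diff(M, t, k).subs(t, 0))
```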