References: Gardini A. (2000), Chapter 4.
A general framework for linear restrictions
Let's consider a generic univariate linear model with $k$ regressors, namely $y = X\beta + \varepsilon$, and suppose that we are interested in testing whether the coefficient $\beta_j$ is statistically different from a certain known value $r$. In this case the null hypothesis, that is $H_0: \beta_j = r$, can be equivalently represented using a more flexible matrix notation, i.e. $R\beta = r$, where $R = (0, \dots, 0, 1, 0, \dots, 0)$ is a $1 \times k$ row vector whose only non-zero entry is a one in the $j$-th position. Hence, the linear restriction in matrix form reads explicitly as $$R\beta = \beta_j = r\,.$$
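As a sketch of this single-restriction setup (pure Python; the index $j$, the tested value, and the coefficient vector below are hypothetical, chosen only for illustration), the row vector $R$ simply picks out the $j$-th coefficient:

```python
# Hypothetical single restriction: test H0: beta_2 = 1.5 in a model with
# k = 4 coefficients. R is a row of zeros with a one in position j = 2.
k, j, r0 = 4, 2, 1.5
R = [1.0 if i == j - 1 else 0.0 for i in range(k)]   # (0, 1, 0, 0)

beta = [0.4, 1.5, -0.2, 2.0]                         # illustrative coefficients
R_beta = sum(Ri * bi for Ri, bi in zip(R, beta))     # R @ beta = beta_2
print(R_beta)  # 1.5, so this beta satisfies the restriction
```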
Multiple restrictions
Let's consider a linear model of the form $y = \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_k x_k + \varepsilon$ and suppose that the aim is to test $q$ null hypotheses at the same time. Stacking one restriction per row produces a $q \times k$ matrix $R$ and a $q \times 1$ vector $r$: the first row of $R$ encodes restriction (1), the second row encodes restriction (2), and so on. The joint null hypothesis then reads $$H_0: R\beta = r\,.$$
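Since the concrete pair of restrictions in the original example is not recoverable, here is a hypothetical pair on a model with $k = 3$ coefficients, namely (1) $\beta_1 = 0$ and (2) $\beta_2 = \beta_3$, showing how each restriction fills one row of $R$ (pure Python, illustrative helper name):

```python
# Hypothetical restrictions on a k = 3 model:
# (1) beta_1 = 0          -> row [1, 0, 0],  r_1 = 0
# (2) beta_2 - beta_3 = 0 -> row [0, 1, -1], r_2 = 0
R = [
    [1.0, 0.0, 0.0],    # restriction (1)
    [0.0, 1.0, -1.0],   # restriction (2)
]
r = [0.0, 0.0]

def satisfies(beta, R, r, tol=1e-9):
    """Check R @ beta = r row by row."""
    return all(abs(sum(Ri[j] * beta[j] for j in range(len(beta))) - ri) < tol
               for Ri, ri in zip(R, r))

print(satisfies([0.0, 2.0, 2.0], R, r))  # True: both restrictions hold
print(satisfies([1.0, 2.0, 2.0], R, r))  # False: beta_1 = 1 violates (1)
```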
Restricted least squares
Proposition 17.1 Let's consider a linear model under the OLS assumptions and a set of $q$ linear hypotheses on the parameters of the model taking the form $R\beta = r$. The optimization problem then becomes restricted to the space of parameters that satisfy the conditions. More precisely, such parameters lie in a subset of the parameter space where the linear constraint holds true, formally defined as $$S_R = \left\{\beta \in \mathbb{R}^k : R\beta = r\right\}\,.$$ Hence, the optimization problem in Equation 15.2 is restricted to only the parameters that satisfy the constraint. Formally, the RLS estimator is the solution of the following minimization problem, i.e. $$\hat{\beta}_{RLS} = \arg\min_{\beta \in S_R} S(\beta)\,, \tag{17.1}$$ where $S(\beta) = (y - X\beta)'(y - X\beta)$ reads as in the OLS case (Equation 15.1). Notably, the analytic solution for $\hat{\beta}_{RLS}$ reads $$\hat{\beta}_{RLS} = \hat{\beta}_{OLS} + (X'X)^{-1}R'\left[R(X'X)^{-1}R'\right]^{-1}\left(r - R\hat{\beta}_{OLS}\right)\,. \tag{17.2}$$
Proof. In order to solve the minimization problem in Equation 17.1, let's construct the Lagrangian $\mathcal{L}(\beta, \lambda)$, i.e. $$\mathcal{L}(\beta, \lambda) = S(\beta) + 2\lambda'(R\beta - r)\,,$$ where $\lambda$ is the $q \times 1$ vector of the Lagrange multipliers. Minimizing $\mathcal{L}$ is equivalent to finding the value of $\beta$ that minimizes $S(\beta)$ under the constraint $R\beta = r$. In fact, it is possible to prove that the minimum is found by solving the first-order conditions $$\frac{\partial \mathcal{L}}{\partial \beta} = 0\,, \qquad \frac{\partial \mathcal{L}}{\partial \lambda} = 0\,. \tag{17.3}$$ In the case of the RLS estimate the Lagrangian reads $$\mathcal{L}(\beta, \lambda) = (y - X\beta)'(y - X\beta) + 2\lambda'(R\beta - r)\,.$$ Then, from Equation 17.3 one obtains the following system of equations, i.e. $$\begin{cases} -2X'y + 2X'X\beta + 2R'\lambda = 0 & \text{(A)} \\ R\beta - r = 0 & \text{(B)} \end{cases}$$ Let's make $\beta$ explicit from (A), i.e. $$\beta = (X'X)^{-1}X'y - (X'X)^{-1}R'\lambda = \hat{\beta}_{OLS} - (X'X)^{-1}R'\lambda\,. \tag{17.4}$$ Let's now substitute Equation 17.4 into (B), i.e. $$R\hat{\beta}_{OLS} - R(X'X)^{-1}R'\lambda - r = 0\,.$$ Hence, it is possible to make the Lagrange multipliers explicit as $$\lambda = \left[R(X'X)^{-1}R'\right]^{-1}\left(R\hat{\beta}_{OLS} - r\right)\,. \tag{17.5}$$ Finally, substituting Equation 17.5 into Equation 17.4 gives the optimal solution, i.e. $$\hat{\beta}_{RLS} = \hat{\beta}_{OLS} - (X'X)^{-1}R'\left[R(X'X)^{-1}R'\right]^{-1}\left(R\hat{\beta}_{OLS} - r\right)\,.$$ Note that if the constraints hold true at the OLS estimate, $R\hat{\beta}_{OLS} = r$ and therefore $\lambda = 0$. Hence the RLS and OLS parameters are the same, i.e. $\hat{\beta}_{RLS} = \hat{\beta}_{OLS}$.
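The closed-form solution above can be checked numerically. Below is a minimal pure-Python sketch (data, helper names and the restriction $\beta_1 + \beta_2 = 1$ are all illustrative, not from the text) with $k = 2$ regressors and $q = 1$ restriction, so every matrix operation reduces to $2 \times 2$ algebra:

```python
# Numerical check of the RLS formula with k = 2, q = 1 (illustrative data).
# Model: y = b1*x1 + b2*x2 + e; restriction b1 + b2 = 1, i.e. R = [1, 1], r = 1.

def inv2(m):
    """Inverse of a 2x2 matrix [[a, b], [c, d]]."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

x1 = [1.0, 2.0, 3.0, 4.0, 5.0]
x2 = [2.0, 1.0, 4.0, 3.0, 5.0]
y  = [1.7, 1.4, 3.6, 3.3, 5.2]   # roughly 0.3*x1 + 0.7*x2 plus noise

XtX_inv = inv2([[dot(x1, x1), dot(x1, x2)],
                [dot(x2, x1), dot(x2, x2)]])
Xty = [dot(x1, y), dot(x2, y)]
beta_ols = [dot(XtX_inv[0], Xty), dot(XtX_inv[1], Xty)]

R, r = [1.0, 1.0], 1.0
A = [dot(XtX_inv[0], R), dot(XtX_inv[1], R)]   # (X'X)^{-1} R'
s = dot(R, A)                                  # R (X'X)^{-1} R' (scalar, q = 1)
gap = r - dot(R, beta_ols)                     # r - R beta_OLS
beta_rls = [beta_ols[i] + A[i] * gap / s for i in range(2)]

def ssr(b):
    """Residual sum of squares S(b)."""
    return sum((yi - b[0] * a - b[1] * c) ** 2 for yi, a, c in zip(y, x1, x2))

print("OLS:", beta_ols)                # unrestricted estimate
print("RLS:", beta_rls)                # works out to [0.35, 0.65] here
print("R @ beta_rls:", dot(R, beta_rls))  # exactly r = 1
```

As a sanity check, the RLS solution satisfies the constraint exactly, has a larger residual sum of squares than the unrestricted OLS solution, and beats any other constrained candidate such as $(0.2, 0.8)$.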
Properties RLS
- The RLS estimator is unbiased if and only if the restriction imposed by $H_0$ is true in the population. In fact, its expected value is computed as $$\mathbb{E}\left[\hat{\beta}_{RLS}\right] = \beta + (X'X)^{-1}R'\left[R(X'X)^{-1}R'\right]^{-1}(r - R\beta)\,,$$ and the estimator is unbiased if and only if the second component is zero, i.e. if $R\beta = r$ holds true.
Proof. Let's apply the expected value to Equation 17.2, remembering that $X$, $R$ and $r$ are non-stochastic and that $\hat{\beta}_{OLS}$ is unbiased (Equation 15.8). Developing the computations gives: $$\mathbb{E}\left[\hat{\beta}_{RLS}\right] = \mathbb{E}\left[\hat{\beta}_{OLS}\right] + (X'X)^{-1}R'\left[R(X'X)^{-1}R'\right]^{-1}\left(r - R\,\mathbb{E}\left[\hat{\beta}_{OLS}\right]\right) = \beta + (X'X)^{-1}R'\left[R(X'X)^{-1}R'\right]^{-1}(r - R\beta)\,.$$ Hence $\hat{\beta}_{RLS}$ is unbiased if and only if the restriction $R\beta = r$ holds in the population.
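The unbiasedness condition can also be illustrated with a small Monte Carlo sketch (pure Python; the data, noise level, and helper names are all illustrative, not from the text): when the restriction holds in the population, the simulated mean of $\hat{\beta}_{RLS}$ sits close to the true $\beta$; when it does not, a systematic bias appears, and the estimates are pulled toward the constraint set.

```python
# Monte Carlo illustration of E[beta_RLS]: unbiased iff R beta = r in population.
import random

random.seed(0)
x1 = [1.0, 2.0, 3.0, 4.0, 5.0]
x2 = [2.0, 1.0, 4.0, 3.0, 5.0]

def rls(y, R, r):
    """RLS estimate for the fixed 2-regressor design above, one restriction."""
    def dot(u, v): return sum(a * b for a, b in zip(u, v))
    XtX = [[dot(x1, x1), dot(x1, x2)], [dot(x2, x1), dot(x2, x2)]]
    det = XtX[0][0] * XtX[1][1] - XtX[0][1] * XtX[1][0]
    inv = [[XtX[1][1] / det, -XtX[0][1] / det],
           [-XtX[1][0] / det, XtX[0][0] / det]]
    Xty = [dot(x1, y), dot(x2, y)]
    b = [dot(inv[0], Xty), dot(inv[1], Xty)]          # OLS
    A = [dot(inv[0], R), dot(inv[1], R)]              # (X'X)^{-1} R'
    gap = r - dot(R, b)
    s = dot(R, A)
    return [b[i] + A[i] * gap / s for i in range(2)]  # RLS

def mc_mean(beta_true, R, r, reps=2000):
    """Average RLS estimate over simulated samples with Gaussian errors."""
    sums = [0.0, 0.0]
    for _ in range(reps):
        y = [beta_true[0] * a + beta_true[1] * c + random.gauss(0.0, 0.5)
             for a, c in zip(x1, x2)]
        b = rls(y, R, r)
        sums[0] += b[0]
        sums[1] += b[1]
    return [s / reps for s in sums]

# Restriction b1 + b2 = 1: true for beta = (0.3, 0.7), false for (0.5, 1.0).
m_true  = mc_mean([0.3, 0.7], [1.0, 1.0], 1.0)   # close to (0.3, 0.7)
m_false = mc_mean([0.5, 1.0], [1.0, 1.0], 1.0)   # biased toward (0.25, 0.75)
print("restriction true: ", m_true)
print("restriction false:", m_false)
```

Note that in the second case the simulated means still sum to one: the RLS estimate satisfies the constraint in every sample, which is exactly why it cannot be centered on a $\beta$ that violates it.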
A test for linear restrictions
Under the assumption of normality of the error terms, it is possible to derive a statistic to test the significance of the linear restrictions imposed by $H_0$. Let's test the validity of the null hypothesis against the alternative hypothesis, i.e. $$H_0: R\beta = r \qquad \text{against} \qquad H_1: R\beta \neq r\,.$$ Under normality, the OLS estimates are multivariate normal; thus, applying the scaling property, one obtains that the distribution of $R\hat{\beta}_{OLS} - r$ under $H_0$ is normal, i.e. $$R\hat{\beta}_{OLS} - r \sim N\!\left(0,\; \sigma^2 R(X'X)^{-1}R'\right)\,.$$ Applying the relation between the distribution of the quadratic form of a multivariate normal and the $\chi^2$ distribution, from property 3 in Section 35.1.2 one obtains the statistic: $$W = \left(R\hat{\beta}_{OLS} - r\right)'\left[\sigma^2 R(X'X)^{-1}R'\right]^{-1}\left(R\hat{\beta}_{OLS} - r\right)\,. \tag{17.8}$$ Under $H_0$, $W$ is distributed as a $\chi^2_q$, where $q$ is the number of linear restrictions, i.e. $$W \sim \chi^2_q\,.$$
As a general decision rule, $H_0$ is rejected if the statistic in Equation 17.8 is greater than the quantile with confidence level $1 - \alpha$ of a $\chi^2_q$ random variable. Such a critical value, denoted by $\chi^2_{q,\,1-\alpha}$, represents the value for which the probability that a $\chi^2_q$ exceeds it is exactly $\alpha$, i.e. $$\mathbb{P}\left(\chi^2_q > \chi^2_{q,\,1-\alpha}\right) = \alpha\,.$$ In this case the probability of a type I error, i.e. rejecting $H_0$ when $H_0$ is true, is exactly $\alpha$.
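A minimal numerical sketch of this decision rule follows (pure Python; the data, the restriction $\beta_1 = \beta_2$, and the assumption of a known $\sigma^2$ are all hypothetical choices of this illustration). With $q = 1$ restriction at level $\alpha = 0.05$, the critical value is the $0.95$ quantile of a $\chi^2_1$, approximately $3.8415$:

```python
# Chi-square test for one linear restriction (q = 1), sigma^2 assumed known.
# H0: b1 - b2 = 0, i.e. R = [1, -1], r = 0.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def inv2(m):
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

x1 = [1.0, 2.0, 3.0, 4.0, 5.0]
x2 = [2.0, 1.0, 4.0, 3.0, 5.0]
y  = [2.9, 3.1, 6.8, 7.2, 9.9]   # roughly x1 + x2, so H0 is plausible
sigma2 = 0.04                    # error variance, assumed known here

XtX_inv = inv2([[dot(x1, x1), dot(x1, x2)],
                [dot(x2, x1), dot(x2, x2)]])
Xty = [dot(x1, y), dot(x2, y)]
b = [dot(XtX_inv[0], Xty), dot(XtX_inv[1], Xty)]   # OLS estimate

R, r = [1.0, -1.0], 0.0
RVR = dot(R, [dot(XtX_inv[0], R), dot(XtX_inv[1], R)])  # R (X'X)^{-1} R'
W = (dot(R, b) - r) ** 2 / (sigma2 * RVR)               # Equation 17.8, q = 1

chi2_crit = 3.8415   # 0.95 quantile of a chi-square with 1 degree of freedom
reject = W > chi2_crit
print("W =", W, "| reject H0:", reject)  # W = 2.25, H0 not rejected
```

With these numbers $W = 2.25 < 3.8415$, so the restriction is not rejected at the 5% level; in practice $\sigma^2$ is unknown and must be replaced by an estimate, which leads to the F-test variant.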
Instead, by applying property 4 in Section 35.1.2, under $H_1$ the statistic is distributed as a non-central $\chi^2_q$, i.e. $$W \sim \chi^2_q(\delta)\,,$$ where the non-centrality parameter is computed as: $$\delta = (R\beta - r)'\left[\sigma^2 R(X'X)^{-1}R'\right]^{-1}(R\beta - r)\,.$$
Gardini A., Costa M., Cavaliere G. 2000. Econometria, Volume Primo. FrancoAngeli.