
- •#3. Role of Econometrics.
- •#4. Main Application of Econometrics.
- •#5. Cross Section Data.
- •#8. Panel or Longitudinal Data.
- •#12. Simple Regression Model (SRM).
- •#13. Simple Linear Model (SLM).
- •#18. What are RSS, ESS, TSS?
- •#25. The Random Components of the Regression Coefficients.
- •#26. The Gauss – Markov Theorem.
- •#27. Unbiasedness of the Regression Coefficients.
- •#28. Precision of the Regression Coefficients.
- •#29. Testing Hypotheses Relating to the Regression Coefficients.
- •It has to be estimated by the standard error of b2, given by
- •#30. Confidence Intervals.
- •#35. What Can You Do about Heteroscedasticity?
- •#38. What Can You Do about Autocorrelation?
- •#45. The APT model.
#25. The Random Components of the Regression Coefficients.
A least squares regression coefficient is a special form of random variable whose properties depend on those of the disturbance term in the equation. Suppose that Y depends on X according to the relationship Yi = β1 + β2Xi + ui, and that we are fitting the regression equation Ŷi = b1 + b2Xi given a sample of n observations.
We shall also continue to assume that X is a nonstochastic exogenous variable.
Its value in each observation may be considered to be predetermined by factors unconnected with the present relationship. Note that Yi has two components.
It has a nonrandom component (ß1+ß2Xi) which owes nothing to the laws of chance (ß1 and ß2 may be unknown but they are fixed constants), and it has the random component Ui.
We can calculate b2 according to the usual formula

b2 = Cov(X, Y) / Var(X)

b2 also has a random component: Cov(X, Y) depends on the values of Y, and the values of Y depend on the values of u. If the values of the disturbance term had been different in the n observations, we would have obtained different values of Y, hence different values of Cov(X, Y), and hence different values of b2.
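As a minimal numerical sketch (the data, coefficients, and seed below are illustrative assumptions, not from the text), the formula b2 = Cov(X, Y)/Var(X) can be checked against the slope returned by a standard least-squares fit:

```python
# Illustrative sketch: b2 = Cov(X, Y) / Var(X) on synthetic data,
# cross-checked against numpy's least-squares fit. All values are assumed.
import numpy as np

rng = np.random.default_rng(0)
n = 50
X = np.linspace(1, 10, n)            # nonstochastic explanatory variable
u = rng.normal(0, 1, n)              # disturbance term
Y = 2.0 + 0.5 * X + u                # true beta1 = 2.0, beta2 = 0.5

b2 = np.cov(X, Y, ddof=1)[0, 1] / np.var(X, ddof=1)
b1 = Y.mean() - b2 * X.mean()

# polyfit returns [slope, intercept]; both should match to rounding error
slope, intercept = np.polyfit(X, Y, 1)
print(b2, slope)
```

The two estimates agree to floating-point precision, since `np.polyfit` with degree 1 minimizes the same residual sum of squares.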
We can in theory decompose b2 into its nonrandom and random components:

Cov(X, Y) = Cov(X, [β1 + β2X + u]) = Cov(X, β1) + Cov(X, β2X) + Cov(X, u)

Since β1 and β2 are constants, by the covariance rules Cov(X, β1) must be equal to 0, and Cov(X, β2X) is equal to β2Cov(X, X). Cov(X, X) is the same as Var(X). Hence we can write

Cov(X, Y) = β2Var(X) + Cov(X, u)

and so

b2 = Cov(X, Y) / Var(X) = β2 + Cov(X, u) / Var(X)
Thus we have shown that the regression coefficient b2 obtained from any sample consists of a fixed component equal to the true value, β2, and a random component dependent on Cov(X, u), which is responsible for its variations around this central tendency. One may similarly show that b1 has a fixed component equal to the true value β1 plus a random component that depends on the random factor u.
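The decomposition above can be verified sample by sample: in every replication the estimate b2 differs from the true β2 by exactly Cov(X, u)/Var(X). The simulation below is an illustrative sketch with assumed parameter values, not part of the original text.

```python
# Illustrative sketch: in every sample, b2 = beta2 + Cov(X, u) / Var(X)
# holds exactly (up to floating point), by linearity of the sample covariance.
import numpy as np

rng = np.random.default_rng(1)
beta1, beta2, n = 2.0, 0.5, 30
X = np.linspace(1, 10, n)            # fixed (nonstochastic) across replications

for _ in range(1000):
    u = rng.normal(0, 1, n)
    Y = beta1 + beta2 * X + u
    b2 = np.cov(X, Y, ddof=1)[0, 1] / np.var(X, ddof=1)
    random_part = np.cov(X, u, ddof=1)[0, 1] / np.var(X, ddof=1)
    assert abs(b2 - (beta2 + random_part)) < 1e-10
```

The identity is exact in each sample because the sample covariance, like the population covariance, is linear in its second argument; only the random part Cov(X, u)/Var(X) changes from sample to sample.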
#26. The Gauss – Markov Theorem.
We shall continue to work with the simple regression model, where Y depends on X according to the relationship Yi = β1 + β2Xi + ui, and we are fitting the regression equation Ŷi = b1 + b2Xi given a sample of n observations.
The properties of the regression coefficient depend critically on the properties of the disturbance term.
Indeed, four conditions, known as the Gauss–Markov conditions, have to be satisfied if ordinary least squares regression analysis is to give the best possible results.
Condition 1: E(ui) = 0 for all observations. The first condition is that the expected value of the disturbance term in any observation should be 0. Sometimes it will be positive, sometimes negative, but it should not have a systematic tendency in either direction.
Condition 2: Var(ui) = σu² for all i. The second condition is that the population variance of the disturbance term should be constant for all observations. Sometimes the disturbance term will be greater, sometimes smaller, but there should not be any a priori reason for it to be more erratic in some observations than in others. The constant is usually denoted σu². Since E(ui) is 0, the population variance of ui is equal to E(ui²), so the condition can also be written E(ui²) = σu² for all i. σu², of course, is unknown. One of the tasks of regression analysis is to estimate the standard deviation of the disturbance term.
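One standard estimator of that standard deviation is the standard error of the regression, s = √(RSS / (n − 2)), where RSS is the residual sum of squares and the divisor n − 2 reflects the two estimated coefficients. The sketch below uses assumed values (σu = 1.5, made-up coefficients) purely for illustration.

```python
# Illustrative sketch: estimating sigma_u by the standard error of the
# regression, s = sqrt(RSS / (n - 2)). All parameter values are assumed.
import numpy as np

rng = np.random.default_rng(2)
n, sigma_u = 200, 1.5
X = np.linspace(0, 20, n)
Y = 1.0 + 0.3 * X + rng.normal(0, sigma_u, n)

b2 = np.cov(X, Y, ddof=1)[0, 1] / np.var(X, ddof=1)
b1 = Y.mean() - b2 * X.mean()
residuals = Y - (b1 + b2 * X)
s = np.sqrt((residuals ** 2).sum() / (n - 2))   # estimate of sigma_u
print(s)   # should be in the vicinity of 1.5
```

With n = 200 observations the estimate typically lands close to the true value; in small samples it is noticeably noisier.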
Condition 3: ui distributed independently of uj (j ≠ i). This condition states that there should be no systematic association between the values of the disturbance term in any two observations. The condition implies that the population covariance between ui and uj is 0, because

Cov(ui, uj) = E[(ui − μu)(uj − μu)] = E(ui uj) = E(ui)E(uj) = 0

using μu = 0 and the independence of ui and uj.
Condition 4: u distributed independently of the explanatory variables. The population covariance between the explanatory variable and the disturbance term is 0:

Cov(Xi, ui) = E[(Xi − μX)(ui − μu)] = (Xi − μX)E(ui) = 0

since E(ui) is 0 and the term involving X is nonstochastic.
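Conditions 3 and 4 can be illustrated empirically: with independently drawn disturbances and a fixed X, the sample covariances Cov(ui, uj) and Cov(X, u) hover near zero across replications. The setup below (sample size, number of replications, seed) is an assumed illustration, not from the text.

```python
# Illustrative sketch: with i.i.d. disturbances and fixed X, both
# Cov(u_i, u_j) and Cov(X, u) are close to zero, as Conditions 3 and 4 require.
import numpy as np

rng = np.random.default_rng(3)
n, reps = 25, 5000
X = np.linspace(1, 5, n)
U = rng.normal(0, 1, (reps, n))      # each row is one sample of disturbances

# Condition 4: sample covariance between X and u, averaged over replications
cov_Xu = np.mean([np.cov(X, u, ddof=1)[0, 1] for u in U])

# Condition 3: covariance between u_1 and u_2 across replications
cov_u1u2 = np.cov(U[:, 0], U[:, 1], ddof=1)[0, 1]

print(cov_Xu, cov_u1u2)              # both near zero
```

Neither quantity is exactly zero in a finite simulation, but both shrink toward zero as the number of replications grows, consistent with the population covariances being zero.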