The Classical Linear Regression Model (CLRM) correct answers Assumptions:
The model is linear in the parameters
The explanatory variables X are uncorrelated with the error term
The previous assumption implies E(u|X) = 0
The model is correctly specified (i.e., there is no specification bias); that is, all relevant
variables are included
There is no correlation between any two error terms, i.e. cov(ui, uj) = 0 for i ≠ j
u ~ N(0, σ²).
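To make these assumptions concrete, here is a minimal NumPy sketch (the parameter values, error standard deviation, and sample size are made up) of a data-generating process that satisfies them:

import numpy as np

rng = np.random.default_rng(0)
n = 100
b1_true, b2_true, sigma = 2.0, 0.5, 1.0   # hypothetical parameter values

X = rng.uniform(0, 10, n)                 # regressor, generated independently of the error
u = rng.normal(0, sigma, n)               # u ~ N(0, sigma^2), drawn i.i.d., so E(u|X) = 0 and cov(u_i, u_j) = 0
Y = b1_true + b2_true * X + u             # the model is linear in the parameters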
Homoskedasticity correct answers refers to the assumption that the dependent variable
exhibits a constant variance across the range of values of the independent variable
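A minimal NumPy sketch (made-up numbers) contrasting the assumption with its violation: the first error series has the same variance at every value of X, while the hypothetical second series has a spread that grows with X.

import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, 200)
u_homo = rng.normal(0, 1.0, X.size)   # homoskedastic: var(u | X) = 1 for every X
u_hetero = rng.normal(0, 0.3 * X)     # heteroskedastic: the spread of u grows with X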
Gauss-Markov Theorem correct answers Given the assumptions of the CLRM, the OLS
estimators have minimum variance in the class of linear unbiased estimators. That is, they are
BLUE (best linear unbiased estimators).
Properties of the OLS estimators correct answers
b1 and b2 are linear estimators; that is, they are linear functions of the random variable Y.
They are unbiased, thus E(b1) = B1 and E(b2) = B2.
The estimator of the error variance, σ², is also unbiased.
b1 and b2 are efficient estimators; that is, the variance of each estimator is less than the variance
of any other linear unbiased estimator.
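A minimal Monte Carlo sketch (NumPy, made-up parameters) illustrating unbiasedness: across many simulated samples, the average of the OLS slope b2 is close to the true slope of 0.5 used to generate the data.

import numpy as np

rng = np.random.default_rng(1)
n, reps = 100, 5000
X = rng.uniform(0, 10, n)
x_dev = X - X.mean()

slopes = []
for _ in range(reps):
    Y = 2.0 + 0.5 * X + rng.normal(0, 1.0, n)                    # true intercept 2.0, true slope 0.5
    slopes.append((x_dev @ (Y - Y.mean())) / (x_dev @ x_dev))    # b2 is a linear function of Y

print(np.mean(slopes))    # approximately 0.5, consistent with E(b2) = B2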
t test correct answers Degrees of freedom: the test statistic follows a t distribution with n - 2
degrees of freedom.
The level of significance α.
One-tailed or two-tailed test?
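A minimal NumPy sketch (simulated data, made-up parameters) of the t statistic for testing H0: B2 = 0; under the CLRM assumptions it is compared with a critical value from the t distribution with n - 2 degrees of freedom.

import numpy as np

rng = np.random.default_rng(2)
n = 30
X = rng.uniform(0, 10, n)
Y = 2.0 + 0.5 * X + rng.normal(0, 1.0, n)

x_dev = X - X.mean()
b2 = (x_dev @ (Y - Y.mean())) / (x_dev @ x_dev)   # OLS slope
b1 = Y.mean() - b2 * X.mean()                     # OLS intercept
resid = Y - (b1 + b2 * X)
s2 = (resid @ resid) / (n - 2)                    # unbiased estimator of the error variance
se_b2 = np.sqrt(s2 / (x_dev @ x_dev))             # standard error of b2
t_stat = b2 / se_b2                               # compare with the t critical value at n - 2 df
print(t_stat)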
r2 correct answers measures the proportion of the total variation in Y explained by the regression
model
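In symbols, with TSS the total, ESS the explained, and RSS the residual sum of squares:

r^2 = \frac{\mathrm{ESS}}{\mathrm{TSS}} = 1 - \frac{\mathrm{RSS}}{\mathrm{TSS}} = 1 - \frac{\sum_i (Y_i - \hat{Y}_i)^2}{\sum_i (Y_i - \bar{Y})^2}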
The Method of Least Squares correct answers estimates the regression parameters in such a way
that the sum of the squared differences between the actual Y values (i.e., the values of the
dependent variable) and the estimated Y values is as small as possible.
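In the two-variable case Y_i = B1 + B2 X_i + u_i, the minimization problem and its closed-form solution are:

\min_{b_1, b_2} \sum_i (Y_i - b_1 - b_2 X_i)^2
\quad\Longrightarrow\quad
b_2 = \frac{\sum_i (X_i - \bar{X})(Y_i - \bar{Y})}{\sum_i (X_i - \bar{X})^2},
\qquad
b_1 = \bar{Y} - b_2 \bar{X}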
OLS Estimator correct answers The estimators of the regression parameters obtained by the
method of least squares.
The variance of an estimator correct answers An estimator being a random variable, its variance,
like the variance of any random variable, measures the spread of the estimated values around the
mean value of the estimator.
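In the two-variable CLRM, for example, the variances of the OLS estimators take the familiar forms (σ² is the error variance):

\operatorname{var}(b_2) = \frac{\sigma^2}{\sum_i (X_i - \bar{X})^2},
\qquad
\operatorname{var}(b_1) = \sigma^2 \left[ \frac{1}{n} + \frac{\bar{X}^2}{\sum_i (X_i - \bar{X})^2} \right]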