Econometrics, Chapter 2 Summary: The Simple Regression Model
SUMMARY
We have introduced the simple linear regression model in this chapter, and we have covered
its basic properties. Given a random sample, the method of ordinary least squares
is used to estimate the slope and intercept parameters in the population model. We have
demonstrated the algebra of the OLS regression line, including how to compute fitted
values and residuals and how to obtain predicted changes in the dependent variable
for a given change in the independent variable. In Section 2.4, we discussed two issues
of practical importance: (1) the behavior of the OLS estimates when we change the
units of measurement of the dependent variable or the independent variable; (2) the use
of the natural log to allow for constant elasticity and constant semi-elasticity models.
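The OLS mechanics summarized above can be sketched in a few lines of Python. The data and the helper name ols_simple are hypothetical, chosen only to illustrate the textbook formulas (slope = sample covariance of x and y divided by sample variation in x; intercept from the first-order conditions):

```python
# Minimal sketch of OLS for the simple regression y = b0 + b1*x + u.
# Data are made up for illustration; ols_simple is a hypothetical helper.

def ols_simple(x, y):
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    # Slope: sum of cross-deviations over total sample variation in x
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b1 = sxy / sxx
    # Intercept: forces the regression line through (xbar, ybar)
    b0 = ybar - b1 * xbar
    return b0, b1

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
b0, b1 = ols_simple(x, y)
fitted = [b0 + b1 * xi for xi in x]          # fitted values on the OLS line
residuals = [yi - fi for yi, fi in zip(y, fitted)]
# By the first-order conditions, the residuals sum to (numerically) zero.
```

For the constant elasticity and semi-elasticity models mentioned above, the same routine applies after replacing y (and, for constant elasticity, x) with its natural log before fitting.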
In Section 2.5, we showed that, under the four Assumptions SLR.1 through SLR.4,
the OLS estimators are unbiased. The key assumption is that the error term u has zero
mean given any value of the independent variable x. Unfortunately, there are reasons to
think this is false in many social science applications of simple regression, where the
omitted factors in u are often correlated with x. When we add the assumption that the
variance of the error given x is constant, we get simple formulas for the sampling variances
of the OLS estimators. As we saw, the variance of the slope estimator β̂₁ increases
as the error variance increases, and it decreases when there is more sample variation in
the independent variable. We also derived an unbiased estimator for σ² = Var(u).
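Under homoskedasticity, the quantities just described can be computed directly. A small sketch, using the same made-up data as above: the unbiased estimator of the error variance divides the sum of squared residuals by the n − 2 degrees of freedom, and the estimated sampling variance of the slope divides that by the total sample variation in x:

```python
# Sketch of the homoskedastic variance formulas from Section 2.5.
# Hypothetical data; sigma2_hat = SSR/(n - 2) is the unbiased estimator of Var(u).

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)
xbar = sum(x) / n
ybar = sum(y) / n
sst_x = sum((xi - xbar) ** 2 for xi in x)      # total sample variation in x
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sst_x
b0 = ybar - b1 * xbar
ssr = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))  # sum of squared residuals
sigma2_hat = ssr / (n - 2)        # unbiased estimator of sigma^2 = Var(u)
var_b1_hat = sigma2_hat / sst_x   # estimated sampling variance of the slope
se_b1 = var_b1_hat ** 0.5         # standard error of beta-hat-1
```

Note how the formula reflects both statements in the text: var_b1_hat grows with the estimated error variance and shrinks as sst_x, the sample variation in x, grows.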
In Section 2.6, we briefly discussed regression through the origin, where the slope
estimator is obtained under the assumption that the intercept is zero. Sometimes this is
useful, but it appears infrequently in applied work.
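With the intercept forced to zero, minimizing the sum of squared residuals yields a simpler slope formula: raw (not demeaned) cross-products of x and y over raw squares of x. A sketch with hypothetical data:

```python
# Regression through the origin (Section 2.6): slope with the intercept set to zero.
# Data are made up for illustration.

def ols_through_origin(x, y):
    # Minimizing sum((y - b1*x)^2) over b1 gives b1 = sum(x*y) / sum(x^2)
    return sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

b1_origin = ols_through_origin([1.0, 2.0, 3.0], [2.0, 4.1, 5.9])
```

Unlike ordinary OLS, this line need not pass through the point of sample means, which is one reason it is rarely used unless theory dictates a zero intercept.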
Much work is left to be done. For example, we still do not know how to test
hypotheses about the population parameters, β₀ and β₁. Thus, although we know that
OLS is unbiased for the population parameters under Assumptions SLR.1 through
SLR.4, we have no way of drawing inference about the population. Other topics, such
as the efficiency of OLS relative to other possible procedures, have also been omitted.
The issues of confidence intervals, hypothesis testing, and efficiency are central to
multiple regression analysis as well. Since the way we construct confidence intervals
and test statistics is very similar for multiple regression—and because simple regression
is a special case of multiple regression—our time is better spent moving on to multiple
regression, which is much more widely applicable than simple regression. Our
purpose in Chapter 2 was to get you thinking about the issues that arise in econometric
analysis in a fairly simple setting.
List of key terms
Coefficient of Determination
Constant Elasticity Model
Control Variable
Covariate
Degrees of Freedom
Dependent Variable
Elasticity
Error Term (Disturbance)
Error Variance
Explained Sum of Squares (SSE)
Explained Variable
Explanatory Variable
First Order Conditions
Fitted Value
Heteroskedasticity
Homoskedasticity
Independent Variable
Intercept Parameter
Ordinary Least Squares (OLS)
OLS Regression Line
Population Regression Function (PRF)
Predicted Variable
Predictor Variable
Regressand
Regression Through the Origin
Regressor
Residual
Residual Sum of Squares (SSR)
Response Variable
R-squared
Sample Regression Function (SRF)
Semi-elasticity
Simple Linear Regression Model
Slope Parameter
Standard Error of β̂₁
Standard Error of the Regression (SER)
Sum of Squared Residuals
Total Sum of Squares (SST)
Zero Conditional Mean Assumption