Much of applied econometric analysis is interested in "explaining y in terms of x" and confronts three issues: 1) Since there is never an exact relationship between y and x, how do we account for the "other unobserved" variables? 2) What is the functional relationship between y and x? 3) How do we invoke a ceteris paribus relationship, or a causal effect, between y and x?

The simple linear regression model is:

y = β0 + β1x + u

- "u" is the stochastic error or disturbance term and represents all unobserved factors other than x.
- If all other factors (u) are held fixed, so that the change in u is zero, we can observe the functional relationship between y and x.
- If we take the expected value of the model (using E(u) = 0), we can see that x has a linear effect on the expected value of y.

We will only get reliable estimates of β0 and β1 if we make restricting assumptions on u. As long as β0 is included in the model, nothing is lost by assuming that the expected value of u in the population is zero: E(u) = 0.

ZCM
Our crucial assumption concerns the conditional distribution of u given any value of x: the average value of u does not depend on the value of x.

E(u|x) = E(u) = 0

This is the zero conditional mean assumption (ZCM).
- The average value of the unobserved factors is the same across the population.
- An important implication of ZCM is that u and x are uncorrelated.

OLS
Ordinary Least Squares (OLS) is a method for estimating the unknown parameters in a linear regression model. The estimates for β0 and β1 are found by minimizing the sum of squared residuals, that is, the distance between the observations in the sample and the predicted responses.
- Fitted values and estimates are denoted by a HAT.
- The values predicted for y when x = xi (where xi is observation i) are called the fitted values, ŷi.
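Minimizing the sum of squared residuals yields the familiar closed-form estimates: the slope β̂1 is the sample covariance of x and y divided by the sample variance of x, and β̂0 = ȳ − β̂1·x̄. A minimal sketch in Python; the simulated data and parameter values (intercept 2, slope 3) are illustrative, not from the notes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative simulated data: y = 2 + 3x + u, with u mean-zero (ZCM holds)
n = 500
x = rng.normal(size=n)
u = rng.normal(size=n)          # unobserved factors
y = 2.0 + 3.0 * x + u

# OLS slope and intercept from the first-order conditions of
# minimizing the sum of squared residuals
beta1_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
beta0_hat = y.mean() - beta1_hat * x.mean()

# Fitted values and residuals
y_hat = beta0_hat + beta1_hat * x
u_hat = y - y_hat

# Two OLS properties: residuals sum to (numerically) zero,
# and residuals are uncorrelated with the regressor
print(abs(u_hat.sum()) < 1e-8)
print(abs((x * u_hat).sum()) < 1e-8)
```

With a reasonable sample size the estimates land close to the true values used to generate the data, which is what the ZCM assumption is buying us.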

- There is a fitted value for every observation in the sample.
- The residual for each observation is given by the difference between the actual value, yi, and its fitted value, ŷi.
- The OLS regression line is also called the sample regression function (SRF).

Properties of OLS
• Residuals
- If ûi (the residual associated with observation i) is positive, the line underpredicts yi.
- If ûi is negative, the line overpredicts yi.
- The sum (and sample average) of the OLS residuals is zero.
- The covariance between the regressors and the OLS residuals is zero.
• Variation
Total sum of squares (SST) = total sample variation in the yi
Explained sum of squares (SSE) = total sample variation in the ŷi
Residual sum of squares (SSR) = sample variation in the ûi
The total variation in y can thus be expressed as: SST = SSE + SSR

Goodness-of-Fit (R2)
The R-squared is the ratio of explained variation to total variation. Its interpretation is the fraction of sample variation in y that is explained by x.

R2 = SSE / SST

- The R-squared of the regression is sometimes referred to as the coefficient of determination.
- It is the percentage of sample variation in y that is explained by x.

Units of Measurement & Functional Form
Sometimes linear relationships between the dependent and independent variables are not appropriate for all economic applications. The different types of functional forms are:
- LEVEL-LEVEL: y and x. A one-unit increase in x increases y by β1.
- LEVEL-LOG: y and log(x).
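The variance decomposition SST = SSE + SSR and the resulting R-squared can be checked numerically. A short sketch, again on simulated data (the data-generating values are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative data and a closed-form OLS fit
n = 200
x = rng.uniform(0, 10, size=n)
y = 1.0 + 0.5 * x + rng.normal(size=n)

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x
u_hat = y - y_hat

SST = np.sum((y - y.mean()) ** 2)      # total sample variation in y
SSE = np.sum((y_hat - y.mean()) ** 2)  # explained sample variation
SSR = np.sum(u_hat ** 2)               # residual sample variation

# The decomposition holds up to floating-point error
assert abs(SST - (SSE + SSR)) < 1e-6

# R-squared: fraction of sample variation in y explained by x
r_squared = SSE / SST
print(r_squared)
```

Because SST = SSE + SSR, the same number can also be computed as 1 − SSR/SST, and it always lies between 0 and 1 for a regression that includes an intercept.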