Independent variables are characteristics that can be measured directly (for example, the area of a house). These variables are also called predictor variables (used to predict the dependent variable) or explanatory variables (used to explain the behavior of the dependent variable).
A dependent variable is a characteristic whose value depends on the values of the independent variables.
Y = B0 + B1*X1 + E
where Y is the dependent variable, B0 is the constant term (intercept), B1 is the slope coefficient, X1 is the independent variable, and E is the error (residual) term.
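The intercept B0 and slope B1 are estimated from data by least squares. A minimal sketch, using hypothetical (X, Y) data and NumPy's `polyfit` (the data values here are assumptions for illustration, not from the example below):

```python
import numpy as np

# Hypothetical data: five observations of X and Y.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Least-squares fit of degree 1: returns [slope, intercept],
# i.e. estimates of B1 and B0.
b1, b0 = np.polyfit(x, y, 1)

# Residuals E = Y - (B0 + B1*X1): the part the line does not explain.
residuals = y - (b0 + b1 * x)
```

The least-squares residuals always sum to (approximately) zero when an intercept is included.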
Here is an example of Simple Linear Regression.
Y=1636.415+1.487X
The slope of 1.487 means that for each increase of one unit in X, the mean of Y is predicted to increase by an estimated 1.487 units.
The equation estimates that for each increase of 1 square foot in the size of the store, the expected annual sales are predicted to increase by $1487.
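Plugging a store size into the fitted equation gives the predicted annual sales. A quick sketch (the store size of 2,000 square feet is an assumed example value):

```python
# Fitted coefficients from the example: Y = 1636.415 + 1.487 X
b0, b1 = 1636.415, 1.487

x = 2000                # hypothetical store size in square feet
y_hat = b0 + b1 * x     # predicted annual sales (in the same units as Y)
print(y_hat)            # 4610.415
```

For a 2,000-square-foot store, the predicted annual sales are 1636.415 + 1.487 * 2000 = 4610.415.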
Calculations - Example 2. Here we are given one independent variable (X) and one dependent variable (Y).
But how much of the variance in Y (and thus in SST) can be explained by changes in the values of X (SSR), and how much is just due to random error (SSE)?
SST (Total Sample Variability, Total Sum of Squares) = SSR (Explained Variability, Regression Sum of Squares) + SSE (Unexplained Variability, Error Sum of Squares)
Coefficient of Determination (r2) = SSR/SST= 183.333/187.333 = .9786
0 ≤ r2 ≤ 1; the closer r2 is to 1, the better the fit
Correlation Coefficient
r = (sign of b1) * sqrt(r2) = +0.9892
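The decomposition SST = SSR + SSE and the statistics r2 and r can be verified numerically. A sketch using hypothetical data (the numbers below are assumptions and do not reproduce the SSR = 183.333, SST = 187.333 example):

```python
import numpy as np

# Hypothetical data; fit the least-squares line first.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 2.9, 3.1, 4.8, 5.0])

b1, b0 = np.polyfit(x, y, 1)
y_hat = b0 + b1 * x

sst = np.sum((y - y.mean()) ** 2)      # total sum of squares
ssr = np.sum((y_hat - y.mean()) ** 2)  # regression (explained) sum of squares
sse = np.sum((y - y_hat) ** 2)         # error (unexplained) sum of squares

r2 = ssr / sst                         # coefficient of determination
r = np.sign(b1) * np.sqrt(r2)          # correlation coefficient, with slope's sign
```

With an intercept in the model, sst equals ssr + sse up to floating-point rounding, and r2 always falls between 0 and 1.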