Benefits from uncorrelated explanatory variables

Uncorrelated explanatory variables are called orthogonal. We mentioned earlier that orthogonal explanatory variables result in more accurate estimates of the two slope parameters. With orthogonal explanatory variables, the least squares estimate of each slope has the smallest possible standard error.
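This gain in precision can be checked numerically. The sketch below (a minimal illustration using numpy with made-up data; the helper name `slope_variance_factor` is not from this text) compares the variance factor for the slope of X when Z is correlated with X against the factor when Z has been made orthogonal to X.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)

def slope_variance_factor(x, z):
    # With design matrix D = [1, x, z], the variance of the least squares
    # slope for x is sigma^2 times the (x, x) diagonal entry of (D'D)^-1.
    D = np.column_stack([np.ones_like(x), x, z])
    return np.linalg.inv(D.T @ D)[1, 1]

# A version of Z that is strongly correlated with X ...
z_corr = 0.9 * x + np.sqrt(1 - 0.9**2) * rng.normal(size=n)
# ... and a version made exactly orthogonal to X (and the intercept)
# by taking residuals from a simple regression of Z on X
z_orth = z_corr - np.polyval(np.polyfit(x, z_corr, 1), x)

# The variance factor is strictly larger when X and Z are correlated
print(slope_variance_factor(x, z_corr) > slope_variance_factor(x, z_orth))  # True
```

The orthogonal case gives the smaller variance because the `(1 - r²)` term that inflates the slope variance disappears when the sample correlation `r` is zero.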

A further benefit from orthogonal explanatory variables is that the least squares slopes are easier to interpret: the slope for each variable is the same whether or not the other variable is included in the model, so each slope can be interpreted without reference to the other variable.
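The sketch below (using numpy with simulated data; the coefficient values are arbitrary) shows that when X and Z are orthogonal, the least squares slope for X is identical in the full model and in the model with X alone.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 60
x = rng.normal(size=n)
z = rng.normal(size=n)
# Force z to be exactly orthogonal to x (and the intercept)
# by replacing it with residuals from a regression of z on x
z = z - np.polyval(np.polyfit(x, z, 1), x)
y = 3 + 2 * x - z + rng.normal(scale=0.5, size=n)

# Full model: intercept, x and z
b_full = np.linalg.lstsq(np.column_stack([np.ones(n), x, z]), y, rcond=None)[0]
# Model with only x
b_x = np.linalg.lstsq(np.column_stack([np.ones(n), x]), y, rcond=None)[0]

print(np.isclose(b_full[1], b_x[1]))  # True: the two slopes for x agree
```

The agreement is exact (up to rounding) because the normal equations decouple: when the column for Z is orthogonal to the columns for the intercept and X, dropping Z does not change the other coefficients.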

Finally, the two sequential anova tables contain the same sums of squares when X and Z are orthogonal.

With orthogonal explanatory variables, therefore, a single anova table is sufficient and we do not need to worry about the order in which the explanatory variables are added.

Note that we can use the two F ratios in this table to test the significance of the two explanatory variables in either order. (If the explanatory variables had been correlated, the upper F ratio should only be tested after concluding that the other variable could be omitted from the model.)
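The equality of the two sequential anova tables can be verified directly. The sketch below (numpy only, with simulated data; `rss` is a hypothetical helper) computes the sequential sums of squares for both orders of adding X and Z when the two variables are orthogonal.

```python
import numpy as np

def rss(design, y):
    # Residual sum of squares from a least squares fit to the given design matrix
    beta = np.linalg.lstsq(design, y, rcond=None)[0]
    return np.sum((y - design @ beta) ** 2)

rng = np.random.default_rng(2)
n = 40
x = rng.normal(size=n)
z = rng.normal(size=n)
z = z - np.polyval(np.polyfit(x, z, 1), x)   # make x and z exactly orthogonal
y = 1 + 0.8 * x + 1.5 * z + rng.normal(size=n)

ones = np.ones(n)
rss0 = rss(ones[:, None], y)                        # intercept only
rss_x = rss(np.column_stack([ones, x]), y)          # intercept + x
rss_z = rss(np.column_stack([ones, z]), y)          # intercept + z
rss_xz = rss(np.column_stack([ones, x, z]), y)      # full model

# Order 1: add X first, then Z.  Order 2: add Z first, then X.
ssx_first, ssz_after = rss0 - rss_x, rss_x - rss_xz
ssz_first, ssx_after = rss0 - rss_z, rss_z - rss_xz

# Each variable's sum of squares is the same whichever order it is added in
print(np.isclose(ssx_first, ssx_after), np.isclose(ssz_first, ssz_after))
```

With correlated explanatory variables the two orders would give different sums of squares, which is why two sequential tables are needed in general.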

Illustrations

The diagram below shows simulated data from a regression model. The slider changes the values of the explanatory variables (X and Z) to adjust their correlation.

Observe initially that the slope coefficient for X differs between the least squares fit of the full model and that of the model with only X. Use the slider to make X and Z orthogonal and observe that the two slope coefficients for X are now equal.


The next simulation is similar but shows the two sequential sums of squares tables corresponding to the two orders of adding X and Z.

Observe initially that the sums of squares in the two tables are different. Use the slider to make X and Z orthogonal and observe that the sums of squares in the tables are now equal.
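The effect of moving the slider can be mimicked in code. The sketch below (numpy with made-up data; the mixing parameter `r` stands in for the slider) shows that the discrepancy between the two slope estimates for X shrinks to zero as the correlation between X and Z is reduced to zero.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100
x = rng.normal(size=n)
e = rng.normal(size=n)
e = e - np.polyval(np.polyfit(x, e, 1), x)   # component exactly orthogonal to x

def slope_on_x(columns, y):
    # Least squares fit; return the coefficient on x (the second column)
    return np.linalg.lstsq(np.column_stack(columns), y, rcond=None)[0][1]

ones = np.ones(n)
diffs = []
for r in (0.9, 0.5, 0.0):                    # three "slider" positions
    z = r * x + np.sqrt(1 - r**2) * e        # sample correlation with x roughly r
    y = 2 + x + z + rng.normal(size=n)
    b_full = slope_on_x([ones, x, z], y)     # slope of X in the full model
    b_only_x = slope_on_x([ones, x], y)      # slope of X with X alone
    diffs.append(abs(b_full - b_only_x))

print([round(d, 3) for d in diffs])          # last entry is zero: X and Z orthogonal
```

At `r = 0.0` the constructed Z is exactly orthogonal to X, so the two slope estimates coincide, just as in the simulations above.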