Orthogonality occurs when two things can vary independently, when they are uncorrelated, or when they are perpendicular.

When performing statistical analysis, independent variables that affect a particular dependent variable are said to be orthogonal if they are uncorrelated, since the covariance forms an inner product. In this case, the same results are obtained for the effect of any of the independent variables upon the dependent variable, regardless of whether one models the variables' effects individually with simple regression or simultaneously with multiple regression. If correlation is present, the factors are not orthogonal and the two methods give different results. This usage arises from the fact that, once centered (by subtracting the expected value, i.e. the mean), uncorrelated variables are orthogonal in the geometric sense discussed above, both as observed data (i.e. vectors) and as random variables (i.e. density functions). In particular, the ordinary least squares estimator can be derived directly from the orthogonality condition between the regressors and the model residuals.
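The claim above can be checked numerically. The following sketch (variable names and the data-generating process are our own illustration) constructs two centered, exactly uncorrelated regressors, and verifies both that simple and multiple regression yield the same coefficients and that the least-squares residuals are orthogonal to the regressors:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

# Two regressors: center both, then orthogonalize x2 against x1
# (one Gram-Schmidt step) so their sample covariance is exactly zero.
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x1 -= x1.mean()
x2 -= x2.mean()
x2 -= (x1 @ x2) / (x1 @ x1) * x1

# Illustrative dependent variable with noise (coefficients 2 and -3
# are arbitrary choices for this example).
y = 2.0 * x1 - 3.0 * x2 + rng.normal(size=n)

# Simple regressions: slope = cov(x, y) / var(x) = (x @ y) / (x @ x),
# since each x is centered.
b1_simple = (x1 @ y) / (x1 @ x1)
b2_simple = (x2 @ y) / (x2 @ x2)

# Multiple regression on both regressors at once.
X = np.column_stack([x1, x2])
b_multi, *_ = np.linalg.lstsq(X, y, rcond=None)

# Because the regressors are orthogonal, the two methods agree.
assert np.allclose([b1_simple, b2_simple], b_multi)

# Orthogonality condition behind OLS: residuals are orthogonal
# to the regressors, X'e = 0.
resid = y - X @ b_multi
assert np.allclose(X.T @ resid, 0.0)
```

If the Gram-Schmidt step is removed so that `x1` and `x2` are correlated, the first assertion fails: the simple-regression slopes then absorb part of each other's effect, which is exactly the difference between the two methods described above.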

Uses of Factor Analysis