Sunday, 4 September 2011

Key Elements and Applications of Factor Analysis

Orthogonality means that two things can vary independently: they are uncorrelated or, in the geometric sense, perpendicular.

When performing statistical analysis, independent variables that affect a particular dependent variable are said to be orthogonal if they are uncorrelated, since the covariance forms an inner product. In this case the same results are obtained for the effect of each independent variable on the dependent variable, regardless of whether one models the variables' effects individually with simple regression or simultaneously with multiple regression. If correlation is present, the factors are not orthogonal and the two methods give different results. This usage arises from the fact that centered variables (those with their expected value, the mean, subtracted) that are uncorrelated are orthogonal in the geometric sense, both as observed data (i.e. vectors) and as random variables (i.e. density functions). In particular, the Ordinary Least Squares estimator can be derived directly from the orthogonality condition between the regressors (the independent variables) and the model residuals.
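The two claims above can be checked numerically. The sketch below (a minimal illustration with synthetic data; the variable names and the slopes 2 and 3 are invented for the example) constructs two centered, exactly orthogonal regressors, then shows that simple and multiple regression yield identical coefficients, and that the OLS residuals satisfy the orthogonality condition with the regressors.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Build two centered regressors, then make x2 exactly orthogonal to x1.
x1 = rng.standard_normal(n)
x1 -= x1.mean()
x2 = rng.standard_normal(n)
x2 -= x2.mean()
x2 -= (x1 @ x2) / (x1 @ x1) * x1   # Gram-Schmidt step

# Response with known slopes (2 and 3) plus noise, also centered.
y = 2.0 * x1 + 3.0 * x2 + 0.1 * rng.standard_normal(n)
y -= y.mean()

# Simple regressions, one variable at a time: slope = (x'y) / (x'x).
b1_simple = (x1 @ y) / (x1 @ x1)
b2_simple = (x2 @ y) / (x2 @ x2)

# Multiple regression on both regressors simultaneously.
X = np.column_stack([x1, x2])
b_multi, *_ = np.linalg.lstsq(X, y, rcond=None)

# With orthogonal regressors, the two approaches agree.
print(np.allclose([b1_simple, b2_simple], b_multi))   # True

# OLS orthogonality condition: residuals are orthogonal to the regressors.
resid = y - X @ b_multi
print(np.allclose(X.T @ resid, 0.0, atol=1e-8))       # True
```

If x1 and x2 were correlated instead, the simple-regression slopes would absorb part of each other's effect and would no longer match the multiple-regression coefficients.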


Uses of Factor Analysis

Analyzing intelligence

Cornell University researcher Richard Darlington reports that psychologist Charles Spearman invented factor analysis to show how all our mental abilities (imagination, creativity, logic, memory, mathematical ability and so on) were controlled by one core variable representing our essential intelligence. Science discredited Spearman's particular theory, though not his statistical approach, but other researchers have continued efforts to reduce our mental operations to a few core factors.


In 1986, Darlington says, psychologist Amy Rubenstein surveyed junior high-school students about their interest in things such as trying new food or figuring out how machines work. Using factor analysis, Rubenstein found seven core factors that she said can be used to gauge a person's level of curiosity: interest in natural science, interest in art, interest in new experiences, enthusiasm for reading, love of learning, enjoyment of problem solving, and low interest in money.


Duke University chemists Charles E. Reese and C. H. Lochmüller say factor analysis is valuable for figuring out the controlling variables in different chemical tests, such as what makes different quantities of different solutions absorb different amounts of ultraviolet light. Another example would be scientists trying to find the factors that affect how different compounds in different solutions react in chromatographic experiments.


An article in the Southern Journal of Economics reports that factor analysis has been used to study modernization in India and price analysis in business, and to review market studies of the Midwestern dairy industry. Other researchers have used factor analysis to study rising interest in tech stocks in Japan and student attitudes toward economics education.

Political Science

R.J. Rummel of the University of Hawaii says that factor analysis is helpful to analyze the many variables in international conflict and peacemaking. For example, factor analysis can help make sense of the interaction between variables, such as various nations' defense budgets, their relationship with the United States, their international trade and the stability of the governments involved.

Apart from the above-mentioned uses, factor analysis is also widely applied in agriculture, in defence, and in understanding the impact of variables such as interest rates on the economy.


Miscellaneous Other Issues and Statistics:

Factor Scores: We can estimate the actual values of individual cases (observations) on the factors. These factor scores are particularly useful when one wants to perform further analyses involving the factors identified in the factor analysis.

Reproduced and Residual Correlations: An additional check on whether the number of factors extracted is appropriate is to compute the correlation matrix that would result if those were indeed the only factors. That matrix is called the reproduced correlation matrix. To see how this matrix deviates from the observed correlation matrix, compute the difference between the two; that matrix is called the matrix of residual correlations. The residual matrix may point to "misfits," that is, to particular correlation coefficients that cannot be reproduced appropriately by the current number of factors.
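Concretely, with loadings L for k extracted factors, the reproduced correlation matrix is L L′ and the residual matrix is R − L L′. The sketch below is a minimal illustration on synthetic two-factor data, using the principal-component extraction method; the data-generating setup is invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000

# Synthetic data: two latent factors, three observed variables each.
f1, f2 = rng.standard_normal((2, n))
X = np.column_stack(
    [f1 + 0.5 * rng.standard_normal(n) for _ in range(3)]
    + [f2 + 0.5 * rng.standard_normal(n) for _ in range(3)]
)
R = np.corrcoef(X, rowvar=False)

# Extract k = 2 factors by the principal-component method.
eigvals, eigvecs = np.linalg.eigh(R)            # ascending order
L = eigvecs[:, -2:] * np.sqrt(eigvals[-2:])

# Reproduced correlations implied by the two factors, and the residuals.
reproduced = L @ L.T
residual = R - reproduced

# Off-diagonal residuals are small when two factors suffice;
# the diagonal residuals are the variables' uniquenesses.
off_diag = residual[~np.eye(6, dtype=bool)]
print(np.abs(off_diag).max())   # small when the two-factor model fits
```

Large entries in the residual matrix would flag pairs of variables whose correlation the retained factors fail to account for, suggesting that more factors (or a different model) may be needed.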


Author: Aditya Mandloi (13122)

Finance_Group 5
