
The second type of indeterminacy index reports the minimum correlation between alternate estimates of the factor scores, $\rho^{*} = 2\rho^{2} - 1$, where $\rho$ is the multiple correlation discussed above. The minimum correlation measure ranges from -1 to 1. High positive values are desirable, since they indicate that differing sets of factor scores will yield similar results.
Grice (2001) suggests that values of $\rho$ that do not appreciably exceed 0.707 are problematic, since values below this threshold imply that we may generate two sets of factor scores that are orthogonal or negatively correlated with one another (Green, 1976).
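To make the arithmetic concrete, the following is a minimal sketch in Python (not EViews code) that evaluates the minimum correlation index for a few hypothetical values of the multiple correlation $\rho$ and flags those falling short of Grice's 0.707 guideline.

```python
import numpy as np

# Hypothetical multiple correlations between each factor and its estimated scores.
rho = np.array([0.95, 0.84, 0.72])

# Minimum correlation between alternate sets of factor scores: 2*rho^2 - 1.
min_corr = 2 * rho**2 - 1

# Grice (2001): rho should appreciably exceed 0.707, i.e. min_corr should be well above 0.
for r, mc in zip(rho, min_corr):
    status = "ok" if r > 0.707 else "problematic"
    print(f"rho = {r:.3f} -> minimum correlation = {mc:.3f} ({status})")
```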
Validity, Univocality, Correlational Accuracy
Following Gorsuch (1983), we may define $R_{ff}$ as the population factor correlation matrix, $R_{ss}$ as the factor score correlation matrix, and $R_{fs}$ as the correlation matrix of the known factors with the score estimates. In general, we would like these matrices to be similar.
The diagonal elements of $R_{fs}$ are termed validity coefficients. These coefficients range from -1 to 1, with high positive values being desired. Differences between the validities and the multiple correlations are evidence that the computed factor scores have determinacies lower than those computed using the $\rho$-values. Gorsuch (1983) recommends obtaining validity values of at least 0.80, and notes that values larger than 0.90 may be necessary if we wish to use the score estimates as substitutes for the factors.
The off-diagonal elements of $R_{fs}$ allow us to measure univocality, or the degree to which the estimated factor scores have correlations with those of other factors. Off-diagonal values of $R_{fs}$ that differ from those in $R_{ff}$ are evidence of univocality bias.
Lastly, we would like the estimated factor scores to match the correlations among the factors themselves. We may assess the correlational accuracy of the score estimates by comparing the values of $R_{ss}$ with the values of $R_{ff}$.
From our earlier discussion, we know that the population correlation $R_{ff} = \hat{W}'\Sigma\hat{W}$. $R_{ss}$ may be obtained from moments of the estimated scores. Computation of $R_{fs}$ is more complicated, but follows the steps outlined in Gorsuch (1983).
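To make these comparisons concrete, the following is a minimal sketch in Python (not EViews code) built on hypothetical matrices: an assumed factor correlation matrix $R_{ff}$, an assumed correlation matrix $R_{fs}$ of the factors with the score estimates, and a score correlation matrix $R_{ss}$ obtained from the moments of simulated score estimates. It simply applies the validity, univocality, and correlational accuracy checks described above; computing $R_{fs}$ itself requires the additional steps in Gorsuch (1983) and is not shown.

```python
import numpy as np

# Hypothetical inputs for illustration only.
R_ff = np.array([[1.00, 0.30],     # assumed population factor correlation matrix
                 [0.30, 1.00]])
R_fs = np.array([[0.92, 0.35],     # assumed correlations of the factors (rows)
                 [0.28, 0.88]])    # with the score estimates (columns)

# Factor score correlation matrix R_ss from the moments of hypothetical estimated scores.
rng = np.random.default_rng(0)
scores = rng.multivariate_normal([0.0, 0.0], R_ff, size=500)
R_ss = np.corrcoef(scores, rowvar=False)

# Validity: diagonal of R_fs, checked against Gorsuch's 0.80 guideline.
validity = np.diag(R_fs)
print("validity coefficients:", validity, "all >= 0.80?", bool(np.all(validity >= 0.80)))

# Univocality: off-diagonal elements of R_fs compared with those of R_ff.
off_diagonal = ~np.eye(2, dtype=bool)
print("univocality differences (R_fs - R_ff):", (R_fs - R_ff)[off_diagonal])

# Correlational accuracy: how closely the score correlations R_ss match R_ff.
print("max |R_ss - R_ff|:", np.max(np.abs(R_ss - R_ff)))
```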
References
Akaike, H. (1987). “Factor Analysis and AIC,” Psychometrika, 52(3), 317–332.
Anderson, T. W. and H. Rubin (1956). “Statistical Inference in Factor Analysis,” in J. Neyman (ed.), Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, Volume V, 111–150. Berkeley and Los Angeles: University of California Press.
Bernaards, C. A. and R. I. Jennrich (2005). “Gradient Projection Algorithms and Software for Arbitrary Rotation Criteria in Factor Analysis,” Educational and Psychological Measurement, 65(5), 676–696.
Browne, M. W. (2001). “An Overview of Analytic Rotation in Exploratory Factor Analysis,” Multivariate Behavioral Research, 36(1), 111–150.
Browne, M. W. and R. Cudeck (1993). “Alternative ways of Assessing Model Fit,” in K. A. Bollen and J. S. Long (eds.), Testing Structural Equation Models, Newbury Park, CA: Sage.
Cudeck, R. and M. W. Browne (1983). “Cross-validation of Covariance Structures,” Multivariate Behavioral Research, 18, 147–167.
Dziuban, C. D. and E. C. Shirkey (1974). “When is a Correlation Matrix Appropriate for Factor Analysis,” Psychological Bulletin, 81(6), 358–361.
Fabrigar, L. R., D. T. Wegener, R. C. MacCallum, and E. J. Strahan (1999). “Evaluating the Use of Exploratory Factor Analysis in Psychological Research,” Psychological Methods, 4(3), 272–299.
Glorfeld, L. W. (1995). “An Improvement on Horn’s Parallel Analysis Methodology for Selecting the Correct Number of Factors to Retain,” Educational and Psychological Measurement, 55(3), 377–393.
Gorsuch, R. L. (1983). Factor Analysis, Hillsdale, New Jersey: Lawrence Erlbaum Associates, Inc.
Green, B. F., Jr. (1969). “Best Linear Composites with a Specified Structure,” Psychometrika, 34(3), 301–318.
Green, B. F., Jr. (1976). “On the Factor Score Controversy,” Psychometrika, 41(2), 263–266.
Grice, J. W. (2001). “Computing and Evaluating Factor Scores,” Psychological Methods, 6(4), 430–450.
Harman, H. H. (1976). Modern Factor Analysis, Third Edition Revised, Chicago: University of Chicago Press.
Harris, C. W. and H. F. Kaiser (1964). “Oblique Factor Analytic Solutions by Orthogonal Transformations,” Psychometrika, 29(4), 347–362.
Hendrickson, A. and P. White (1964). “Promax: A Quick Method for Rotation to Oblique Simple Structure,” The British Journal of Statistical Psychology, 17(1), 65–70.
Horn, J. L. (1965). “A Rationale and Test for the Number of Factors in Factor Analysis,” Psychometrika, 30(2), 179–185.
Hu, L.-T. and P. M. Bentler (1995). “Evaluating Model Fit,” in R. H. Hoyle (Ed.), Structural Equation Modeling: Concepts, Issues, and Applications, Thousand Oaks, CA: Sage.
Hu, L.-T. and P. M. Bentler (1999). “Cut-off Criteria for Fit Indexes in Covariance Structure Analysis: Conventional Criteria Versus New Alternatives,” Structural Equation Modeling, 6(1), 1–55.
Humphreys, L. G. and D. R. Ilgen (1969). “Note on a Criterion for the Number of Common Factors,” Educational and Psychological Measurement, 29, 571–578.
Humphreys, L. G. and R. G. Montanelli, Jr. (1975). “An Investigation of the Parallel Analysis Criterion for Determining the Number of Common Factors,” Multivariate Behavioral Research, 10, 193–206.
Ihara, M. and Y. Kano (1995). “A New Estimator of the Uniqueness in Factor Analysis,” Psychometrika, 51(4), 563–566.
Jackson, D. A. (1993). “Stopping Rules in Principal Components Analysis: A Comparison of Heuristical and Statistical Approaches,” Ecology, 74(8), 2204–2214.
Jennrich, R. I. (2001). “A Simple General Procedure for Orthogonal Rotation,” Psychometrika, 66(2), 289–306.
Jennrich, R. I. (2002). “A Simple General Method for Oblique Rotation,” Psychometrika, 67(1), 7–20.
Johnson, R. A., and D. W. Wichern (1992). Applied Multivariate Statistical Analysis, Third Edition, Upper Saddle River, New Jersey: Prentice-Hall, Inc.
Jöreskog, K. G. (1977). “Factor Analysis by Least-Squares and Maximum Likelihood Methods,” in K. Enslein, A. Ralston, and H. S. Wilf (eds.), Statistical Methods for Digital Computers, New York: John Wiley & Sons, Inc.
Kaiser, H. F. (1970). “A Second Generation Little Jiffy,” Psychometrika, 35(4), 401–415.
Kaiser, H. F. and J. Rice (1974). “Little Jiffy, Mark IV,” Educational and Psychological Measurement, 34, 111–117.
Kano, Y. (1990). “Noniterative Estimation and the Choice of the Number of Factors in Exploratory Factor Analysis,” Psychometrika, 55(2), 277–291.
Marsh, H. W., J. R. Balla and R. P. McDonald (1988). “Goodness of Fit Indexes in Confirmatory Factor Analysis: The Effect of Sample Size,” Psychological Bulletin, 103(3), 391–410.
McDonald, R. P. (1981). “Constrained Least Squares Estimators of Oblique Common Factors,” Psychometrika, 46(2), 277–291.
McDonald, R. P. and H. W. Marsh (1990). “Choosing a Multivariate Model: Noncentrality and Goodness of Fit,” Psychological Bulletin, 107(2), 247–255.
Preacher, K. J. and R. C. MacCallum (2003). “Repairing Tom Swift's Electric Factor Analysis Machine,” Understanding Statistics, 2(1), 13–32.
Ten Berge, J. M. F., W. P. Krijnen, T. Wansbeek, and A. Shapiro (1999). “Some New Results on Correlation Preserving Factor Scores Prediction Methods,” Linear Algebra and Its Applications, 289, 311–318.
Tucker, L. R. and R. C. MacCallum (1997). Exploratory Factor Analysis, Unpublished manuscript.
Velicer, W. F. (1976). “Determining the Number of Components from the Matrix of Partial Correlations,” Psychometrika, 41(3), 321–327.
Zoski, K. W. and S. Jurs (1996). “An Objective Counterpart to the Visual Scree Test for Factor Analysis: The Standard Error Scree,” Educational and Psychological Measurement, 56(3), 443–451.
Zwick, W. R. and W. F. Velicer (1986). “Factors Influencing Five Rules for Determining the Number of Components to Retain,” Psychological Bulletin, 99(3), 432–442.