Original Article

Structural Validity of the OSA Figures Scale for the Online Self-Assessment of Fluid Reasoning

Published Online: https://doi.org/10.1027/1015-5759/a000345

Abstract. This investigation provides evidence for the structural validity of scores from the Online Self-Assessment (OSA) Figures scale, which was constructed to assess figural reasoning as part of an online self-assessment battery. Because the appropriateness of confirmatory factor analysis (CFA) for data obtained with dichotomous items has been questioned, two suitable methods involving different link transformations were considered: (1) CFA of the congeneric measurement model with tetrachoric correlations as input and (2) CFA of the weighted congeneric model with probability-based covariances as input. Both models tested a unidimensional structure for the scale. Structural validity was supported by acceptable model-data fit indices and by the convergence of parameter estimates across the two analysis methods. Furthermore, the OSA Figures scale showed an acceptable degree of homogeneity according to McDonald’s Omega and a substantial correlation with course scores.
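
To make the abstract's terminology concrete, the following sketch illustrates the two kinds of input matrices it names, a probability-based covariance matrix and a tetrachoric correlation matrix for dichotomous items, together with McDonald's Omega for a congeneric one-factor model. This is a minimal illustration under simple assumptions, not the study's analysis code (the reported analyses relied on LISREL 8.80); the function names and the simulated data below are hypothetical.

```python
# Minimal, illustrative sketch only: input matrices for CFA of binary items
# plus McDonald's Omega for a congeneric one-factor model.
import numpy as np
from scipy.stats import norm, multivariate_normal
from scipy.optimize import brentq


def probability_based_covariance(X):
    """Covariance of the binary items themselves (ML version, divide by N):
    cov(i, j) = P(i=1, j=1) - P(i=1) * P(j=1)."""
    p = X.mean(axis=0)
    joint = (X.T @ X) / X.shape[0]      # pairwise P(i=1, j=1)
    return joint - np.outer(p, p)


def tetrachoric_correlation(x, y):
    """Tetrachoric correlation of two 0/1 vectors, assuming an underlying
    bivariate normal: solve P(Z1 > tau_x, Z2 > tau_y; rho) = P(x=1, y=1)
    for rho. Assumes a non-degenerate 2 x 2 table (no empty cells)."""
    p_x, p_y = x.mean(), y.mean()
    p_xy = np.mean((x == 1) & (y == 1))
    tau_x, tau_y = norm.ppf(1 - p_x), norm.ppf(1 - p_y)

    def upper_tail(rho):
        # P(Z1 > tau_x, Z2 > tau_y) = F(-tau_x, -tau_y) by symmetry
        return multivariate_normal(mean=[0.0, 0.0],
                                   cov=[[1.0, rho], [rho, 1.0]]).cdf([-tau_x, -tau_y])

    return brentq(lambda r: upper_tail(r) - p_xy, -0.999, 0.999)


def mcdonalds_omega(loadings, residual_variances):
    """Omega for a congeneric one-factor model:
    (sum lambda)^2 / ((sum lambda)^2 + sum theta)."""
    s = np.sum(loadings)
    return s ** 2 / (s ** 2 + np.sum(residual_variances))


# Simulated example: 12 binary items driven by one latent factor (values arbitrary).
rng = np.random.default_rng(1)
eta = rng.normal(size=800)                                  # latent reasoning factor
X = (0.7 * eta[:, None] + rng.normal(size=(800, 12)) > 0).astype(int)

pb_cov = probability_based_covariance(X)    # input for the weighted congeneric CFA
n_items = X.shape[1]
tet = np.eye(n_items)                       # input for the tetrachoric-correlation CFA
for i in range(n_items):
    for j in range(i + 1, n_items):
        tet[i, j] = tet[j, i] = tetrachoric_correlation(X[:, i], X[:, j])

# After fitting the one-factor model, Omega follows from standardized loadings lam:
# omega = mcdonalds_omega(lam, 1.0 - lam ** 2)
```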

References

  • Arthur, W. Jr, Glaze, R. M., Villado, A. J. & Taylor, J. E. (2010). The magnitude and extent of cheating and response distortion effects on unproctored Internet-based tests of cognitive ability and personality. International Journal of Selection and Assessment, 18, 1–16. doi: 10.1111/j.1468-2389.2010.00476.x

  • Carpenter, P. A., Just, M. A. & Shell, P. (1990). What one intelligence test measures: A theoretical account of the processing in the Raven Progressive Matrices Test. Psychological Review, 97, 404–431. doi: 10.1037/0033-295X.97.3.404

  • Carroll, J. B. (2005). The three-stratum theory of cognitive abilities. In D. P. Flanagan & P. L. Harrison (Eds.), Contemporary intellectual assessment: Theories, tests, and issues (2nd ed., pp. 69–76). New York, NY: Guilford.

  • De Boeck, P. & Wilson, M. (2004). Explanatory item response models: A generalized linear and nonlinear approach. New York, NY: Springer.

  • Fan, W. & Hancock, G. R. (2012). Robust means modeling: An alternative for hypothesis testing of independent means under variance heterogeneity and nonnormality. Journal of Educational and Behavioral Statistics, 37, 137–156. doi: 10.3102/1076998610396897

  • Finney, S. J. & DiStefano, C. (2013). Nonnormal and categorical data in structural equation modeling. In G. R. Hancock & R. O. Mueller (Eds.), Structural equation modeling: A second course (2nd ed., pp. 439–492). Charlotte, NC: Information Age Publishing.

  • Forero, C. G., Maydeu-Olivares, A. & Gallardo-Pujol, D. (2009). Factor analysis with ordinal indicators: A Monte Carlo study comparing DWLS and ULS estimation. Structural Equation Modeling: A Multidisciplinary Journal, 16, 625–641. doi: 10.1080/10705510903203573

  • Formann, A. K. (2002). Wiener Matrizen-Test [Viennese Matrices Test]. Mödling, Austria: Dr. G. Schuhfried GmbH.

  • Green, S. K. & Johnson, R. L. (2010). Assessment is essential. New York, NY: McGraw-Hill.

  • Hu, L. & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6, 1–55. doi: 10.1080/10705519909540118

  • Johnson, R. L. & Morgan, G. (2016). Item types, response formats, and consequences for statistical investigations. In K. Schweizer & C. DiStefano (Eds.), Principles and methods of test construction: Standards and recent advancements (pp. 83–103). Göttingen, Germany: Hogrefe.

  • Jöreskog, K. G. (1971). Statistical analysis of sets of congeneric tests. Psychometrika, 36, 109–133. doi: 10.1007/BF02291393

  • Jöreskog, K. G. & Sörbom, D. (2006). LISREL 8.80. Lincolnwood, IL: Scientific Software International.

  • Kline, R. B. (2005). Principles and practice of structural equation modeling (2nd ed.). New York, NY: The Guilford Press.

  • Kubinger, K. D. (2003). On artificial results due to using factor analysis for dichotomous variables. Psychology Science, 45, 106–110.

  • Maydeu-Olivares, A. (2006). Limited information estimation and testing of discretized multivariate normal structural models. Psychometrika, 71, 57–77. doi: 10.1007/s11336-005-0773-4

  • McCullagh, P. & Nelder, J. A. (1985). Generalized linear models. London, UK: Chapman and Hall.

  • McDonald, R. P. (1999). Test theory: A unified treatment. Mahwah, NJ: Erlbaum.

  • McDonald, R. P. & Ahlawat, K. S. (1974). Difficulty factors in binary data. British Journal of Mathematical and Statistical Psychology, 27, 82–99. doi: 10.1111/j.2044-8317.1974.tb00530.x

  • McGrew, K. S. (2009). CHC theory and the human cognitive abilities project: Standing on the shoulders of the giants of psychometric intelligence research. Intelligence, 37, 1–10. doi: 10.1016/j.intell.2008.08.004

  • Muthén, B. (1984). A general structural equation model with dichotomous, ordered categorical, and continuous latent variable indicators. Psychometrika, 49, 115–132. doi: 10.1007/BF02294210

  • Nelder, J. A. & Wedderburn, R. W. M. (1972). Generalized linear models. Journal of the Royal Statistical Society A, 135, 370–384.

  • Pearson, K. (1900). Mathematical contributions to the theory of evolution. VII. On the correlation of characters not quantitatively measurable. Philosophical Transactions of the Royal Society of London Series A, 195, 1–47.

  • Raven, J. C., Raven, J. & Court, J. H. (1997). Raven’s Progressive Matrices and Vocabulary Scales. Edinburgh, Scotland: J. C. Raven.

  • Reiß, S., Tillmann, A., Schreiner, M., Schweizer, K., Krömker, D. & Moosbrugger, H. (2009). Online-Self-Assessments zur Erfassung studienrelevanter Kompetenzen [Online Self-Assessments to measure study related competencies]. Zeitschrift für Hochschulentwicklung, 4, 60–71.

  • Satorra, A. & Bentler, P. M. (1994). Corrections to test statistics and standard errors in covariance structure analysis. In A. von Eye & C. C. Clogg (Eds.), Latent variable analysis (pp. 399–419). Thousand Oaks, CA: Sage.

  • Savalei, V., Bonett, D. G. & Bentler, P. M. (2014). CFA with binary variables in small samples: A comparison of two methods. Frontiers in Psychology, 5, 1515. doi: 10.3389/fpsyg.2014.01515

  • Schweizer, K. (2013). A threshold-free approach to the study of the structure of binary data. International Journal of Statistics and Probability, 2, 67–75. doi: 10.5539/ijsp.v2n2p67

  • Schweizer, K. & Reiss, S. (2014). The structural validity of the FPI Neuroticism Scale revisited in the framework of the generalized linear model. Psychological Test and Assessment Modeling, 56, 320–335.

  • Schweizer, K., Ren, X. & Wang, T. (2015). A comparison of confirmatory factor analysis of binary data on the basis of tetrachoric correlations and of probability-based covariances: A simulation study. In R. E. Millsap, D. M. Bolt, L. A. van der Ark & W.-C. Wang (Eds.), Quantitative psychology research (pp. 273–292). Heidelberg, Germany: Springer.

  • Skrondal, A. & Rabe-Hesketh, S. (2004). Generalized latent variable modelling: Multilevel, longitudinal and structural equation models. Boca Raton, FL: Chapman & Hall/CRC.

  • Torgerson, W. S. (1958). Theory and method of scaling. New York, NY: Wiley.

  • Wiley, J., Jarosz, A. F., Cushen, P. J. & Colflesh, G. J. H. (2011). New rule use drives the relation between working memory capacity and Raven’s Advanced Progressive Matrices. Journal of Experimental Psychology: Learning, Memory, and Cognition, 37, 256–263. doi: 10.1037/a0021613