Published Online: https://doi.org/10.1027/1015-5759/a000224

Abstract. Recent advances in the assessment of Complex Problem Solving (CPS) build on homogeneous tasks that enable the reliable estimation of CPS skills. The range of problems featured in established instruments such as MicroDYN is consequently limited to a specific subset of homogeneous complex problems. This restriction is problematic for domain-specific examples of complex problems, which feature characteristics absent from current assessment instruments (e.g., threshold states). We propose to use the formal framework of Finite State Automata (FSA) to extend the range of problems included in CPS assessment. An FSA-based approach, called MicroFIN, is presented, translated into specific tasks, and empirically investigated. In an empirical study (N = 576), we (1) inspected the psychometric features of MicroFIN, (2) related it to MicroDYN, and (3) investigated its relations to a measure of reasoning (i.e., the CogAT). MicroFIN (1) exhibited adequate measurement characteristics, and (2) multitrait-multimethod models indicated convergence with the latent dimensions measured by MicroDYN. (3) Relations to reasoning were moderate and comparable to those previously found for MicroDYN. Empirical results and corresponding explanations are discussed. More importantly, MicroFIN highlights the feasibility of expanding CPS assessment to a larger spectrum of complex problems.
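The FSA formalism underlying MicroFIN can be sketched as a transition table mapping (state, input) pairs to successor states. The following toy example is purely illustrative (all state and input names are invented, not an actual MicroFIN task); it shows how a threshold state produces asymmetric behavior that a linear-equation system such as MicroDYN cannot express:

```python
# Hypothetical sketch of a finite state automaton (FSA) with a
# "threshold" state. Names and transitions are invented for
# illustration; they do not reproduce any published MicroFIN item.

class FSA:
    def __init__(self, transitions, start):
        self.transitions = transitions  # {(state, input): next_state}
        self.state = start

    def step(self, symbol):
        # Undefined (state, input) pairs leave the state unchanged.
        self.state = self.transitions.get((self.state, symbol), self.state)
        return self.state

# A toy "kettle": heating twice crosses a threshold into "boiling";
# cooling from "boiling" only reaches "warm", not "cold" directly.
# This asymmetry around the threshold is what linear structural
# equations cannot represent.
transitions = {
    ("cold", "heat"): "warm",
    ("warm", "heat"): "boiling",   # threshold crossed here
    ("warm", "cool"): "cold",
    ("boiling", "cool"): "warm",   # cooling from boiling stops at "warm"
}

kettle = FSA(transitions, start="cold")
for press in ["heat", "heat", "cool", "cool"]:
    kettle.step(press)
print(kettle.state)  # -> cold
```

Because every transition is an explicit table entry, an FSA task can encode jumps, dead ends, and irreversible states with equal ease, which is the expressive gain the abstract attributes to MicroFIN.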

References

  • Ackerman, P. L. (1992). Predicting individual differences in complex skill acquisition: Dynamics of ability determinants. Journal of Applied Psychology, 77, 598–614.

  • Anderson, J. A. (2006). Automata theory with modern applications. Cambridge, MA: Cambridge University Press.

  • Beckmann, J. F. & Goode, N. (2013). The benefit of being naïve and knowing it: The unfavourable impact of perceived context familiarity on learning in complex problem solving tasks. Instructional Science, 42, 1–20.

  • Brehmer, B. (1992). Dynamic decision making: Human control of complex systems. Acta Psychologica, 81, 211–241.

  • Buchner, A. (1995). Basic topics and approaches to the study of complex problem solving. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European perspective (pp. 27–63). Hillsdale, NJ: Erlbaum.

  • Buchner, A. & Funke, J. (1993). Finite-state automata: Dynamic task environments in problem-solving research. The Quarterly Journal of Experimental Psychology, 46, 83–118.

  • Bühner, M., Kröner, S. & Ziegler, M. (2008). Working memory, visual-spatial-intelligence and their relationship to problem-solving. Intelligence, 36, 672–680.

  • Dörner, D., Kreuzig, H. W., Reither, F. & Stäudel, T. (Eds.). (1983). Lohhausen. Vom Umgang mit Unbestimmtheit und Komplexität [Lohhausen. On dealing with uncertainty and complexity]. Bern, Switzerland: Huber.

  • Eid, M., Lischetzke, T., Nussbeck, F. W. & Trierweiler, L. I. (2003). Separating trait effects from trait-specific method effects in multitrait-multimethod models: A multiple-indicator CT-C(M-1) model. Psychological Methods, 8, 38–60.

  • Fischer, A., Greiff, S. & Funke, J. (2012). The process of solving complex problems. Journal of Problem Solving, 4, 19–42.

  • Funke, J. (2001). Dynamic systems as tools for analysing human judgement. Thinking & Reasoning, 7, 69–89.

  • Funke, J. (2010). Complex problem solving: A case for complex cognition? Cognitive Processing, 11, 133–142.

  • Gonzalez, C., Vanyukov, P. & Martin, M. K. (2005). The use of microworlds to study dynamic decision making. Computers in Human Behavior, 21, 273–286.

  • Greiff, S., Fischer, A., Wüstenberg, S., Sonnleitner, P., Brunner, M. & Martin, R. (2013). A multitrait-multimethod study of assessment instruments for complex problem solving. Intelligence, 41, 579–596.

  • Greiff, S. & Funke, J. (2009). Measuring complex problem solving: The MicroDYN approach. In F. Scheuermann & J. Björnsson (Eds.), The transition to computer-based assessment: New approaches to skills assessment and implications for large-scale testing (pp. 157–163). Luxembourg: Office for Official Publications of the European Communities.

  • Greiff, S. & Wüstenberg, S. (2014). Assessment with microworlds using MicroDYN: Measurement invariance and latent mean comparisons: Psychometric properties across several student samples and blue-collar workers. European Journal of Psychological Assessment. Advance online publication. doi: 10.1027/1015-5759/a000194

  • Greiff, S., Wüstenberg, S. & Funke, J. (2012). Dynamic problem solving: A new assessment perspective. Applied Psychological Measurement, 36, 189–213.

  • Greiff, S., Wüstenberg, S., Molnár, G., Fischer, A., Funke, J. & Csapó, B. (2013). Complex problem solving in educational contexts – something beyond g: Concept, assessment, measurement invariance, and construct validity. Journal of Educational Psychology, 105, 364–379.

  • Griffin, P., McGaw, B. & Care, E. (Eds.). (2012). Assessment and teaching of 21st century skills. New York, NY: Springer.

  • Heller, K. A. & Perleth, C. (2000). Kognitiver Fähigkeitstest für 4. bis 12. Klassen, Revision [Cognitive Abilities Test (CogAT; Thorndike, L. & Hagen, E., 1954–1986) – German adapted version]. Göttingen, Germany: Beltz-Test.

  • Hu, L. & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6, 1–55.

  • Levy, S. T. & Wilensky, U. (2011). Mining students’ inquiry actions for understanding of complex systems. Computers & Education, 56, 556–573.

  • Little, T. D., Cunningham, W. A., Shahar, G. & Widaman, K. F. (2002). To parcel or not to parcel: Exploring the question, weighing the merits. Structural Equation Modeling: A Multidisciplinary Journal, 9, 151–173.

  • McElhaney, K. W. & Linn, M. C. (2011). Investigations of a complex, realistic task: Intentional, unsystematic, and exhaustive experimenters. Journal of Research in Science Teaching, 48, 745–770.

  • Muthén, B. O., du Toit, S. H. C. & Spisic, D. (1997). Robust inference using weighted least squares and quadratic estimating equations in latent variable modeling with categorical and continuous outcomes. Unpublished technical report. Retrieved from http://pages.gseis.ucla.edu/faculty/muthen/articles/Article_075.pdf

  • Muthén, L. K. & Muthén, B. O. (2012). Mplus user’s guide (7th ed.). Los Angeles, CA: Muthén & Muthén.

  • Novick, L. R. & Bassok, M. (2005). Problem solving. In K. J. Holyoak & R. G. Morrison (Eds.), The Cambridge handbook of thinking and reasoning (pp. 321–349). New York, NY: Cambridge University Press.

  • Novick, M. R. (1966). The axioms and principal results of classical test theory. Journal of Mathematical Psychology, 3, 1–18.

  • OECD. (2013). PISA 2012 assessment and analytical framework. Paris, France: Organisation for Economic Co-operation and Development.

  • OECD. (2014). PISA 2012 Results: Creative problem solving (Volume V). Paris, France: OECD Publishing.

  • Osman, M. (2010). Controlling uncertainty: A review of human behavior in complex dynamic environments. Psychological Bulletin, 136, 65–86.

  • Raven, J., Raven, J. C. & Court, J. H. (1998). Manual for Raven’s progressive matrices and vocabulary scales. Oxford, UK: Oxford Psychologists Press.

  • Rich, E. (2008). Automata, computability and complexity: Theory and applications. Upper Saddle River, NJ: Prentice Hall.

  • Rollett, W. (2008). Strategieeinsatz, erzeugte Information und Informationsnutzung bei der Exploration und Steuerung komplexer dynamischer Systeme [Use of strategy, generated information and use of information when exploring and controlling complex dynamic systems]. Berlin, Germany: Lit.

  • Scherer, R. & Tiemann, R. (2012). Factors of problem-solving competency in a virtual chemistry environment: The role of metacognitive knowledge about strategies. Computers & Education, 59, 1199–1214.

  • Schweizer, F., Wüstenberg, S. & Greiff, S. (2013). Validity of the MicroDYN approach: Complex problem solving predicts school grades beyond working memory capacity. Learning and Individual Differences, 24, 42–52.

  • Segerer, R., Marx, A. & Marx, P. (2012). Unlösbare Items im KFT 4–12+R [Unsolvable items in the KFT 4–12+R]. Diagnostica, 58, 45–50.

  • Sonnleitner, P., Brunner, M., Greiff, S., Funke, J., Keller, U., Martin, R., … Latour, T. (2012). The Genetics Lab: Acceptance and psychometric characteristics of a computer-based microworld assessing complex problem solving. Psychological Test and Assessment Modeling, 54, 54–72.

  • Sonnleitner, P., Keller, U., Martin, R. & Brunner, M. (2013). Students’ complex problem-solving abilities: Their structure and relations to reasoning ability and educational success. Intelligence, 41, 289–305.

  • Sternberg, R. J. & Frensch, P. A. (1991). Complex problem solving: Principles and mechanisms. Hillsdale, NJ: Erlbaum.

  • Tschirgi, J. E. (1980). Sensible reasoning: A hypothesis about hypotheses. Child Development, 51, 1–10.

  • Vollmeyer, R., Burns, B. D. & Holyoak, K. J. (1996). The impact of goal specificity on strategy use and the acquisition of problem structure. Cognitive Science, 20, 75–100.

  • Wittmann, W. & Süß, H.-M. (1999). Investigating the paths between working memory, intelligence, knowledge, and complex problem-solving performances via Brunswik symmetry. In P. L. Ackerman, P. C. Kyllonen & R. D. Roberts (Eds.), Learning and individual differences: Process, trait, and content determinants (pp. 77–108). Washington, DC: American Psychological Association.

  • Wüstenberg, S., Greiff, S. & Funke, J. (2012). Complex problem solving – More than reasoning? Intelligence, 40, 1–14.

  • Zinbarg, R. E., Revelle, W., Yovel, I. & Li, W. (2005). Cronbach’s α, Revelle’s β, and McDonald’s ωH: Their relations with each other and two alternative conceptualizations of reliability. Psychometrika, 70, 123–133.