Abstract
Recent advances in the assessment of Complex Problem Solving (CPS) build on homogeneous tasks that enable the reliable estimation of CPS skills. The range of problems featured in established instruments such as MicroDYN is consequently limited to a specific subset of homogeneous complex problems. This restriction is problematic for domain-specific examples of complex problems, which feature characteristics absent from current assessment instruments (e.g., threshold states). We propose to utilize the formal framework of Finite State Automata (FSA) to extend the range of problems included in CPS assessment. An approach based on FSA, called MicroFIN, is presented, translated into specific tasks, and empirically investigated. We conducted an empirical study (N = 576), (1) inspecting the psychometric features of MicroFIN, (2) relating it to MicroDYN, and (3) investigating its relations to a measure of reasoning (i.e., the CogAT). MicroFIN (1) exhibited adequate measurement characteristics, and (2) multitrait-multimethod models indicated the convergence of latent dimensions measured with MicroDYN. (3) Relations to reasoning were moderate and comparable to those previously found for MicroDYN. Empirical results and corresponding explanations are discussed. More importantly, MicroFIN highlights the feasibility of expanding CPS assessment to a larger spectrum of complex problems.
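To make the formal framework concrete: an FSA consists of a set of states, an input alphabet, and a transition function mapping (state, input) pairs to next states. The following is a minimal illustrative sketch only; the state and input names are hypothetical and are not items from MicroFIN itself. It also illustrates a "threshold state" of the kind mentioned above, i.e., a state that, once reached, no longer responds to certain inputs.

```python
# Hypothetical finite state automaton (FSA), given as a transition table.
# (current_state, input) -> next_state
TRANSITIONS = {
    ("off", "press"): "standby",
    ("standby", "press"): "on",
    ("on", "press"): "off",
    # A threshold state: once "locked" is reached, inputs no longer change it.
    ("on", "turn"): "locked",
    ("locked", "press"): "locked",
    ("locked", "turn"): "locked",
}

def run(state, inputs):
    """Apply a sequence of inputs; undefined transitions leave the state unchanged."""
    for symbol in inputs:
        state = TRANSITIONS.get((state, symbol), state)
    return state

print(run("off", ["press", "press", "turn", "press"]))  # -> "locked"
```

Exploring such a system means probing which inputs change the state and which do not, which is precisely what distinguishes FSA-based tasks from the linear-equation systems underlying MicroDYN.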
References
1992). Predicting individual differences in complex skill acquisition: Dynamics of ability determinants. Journal of Applied Psychology, 77, 598–614.
(2006). Automata theory with modern applications. Cambridge, MA: Cambridge University Press.
(2013). The benefit of being naïve and knowing it: The unfavourable impact of perceived context familiarity on learning in complex problem solving tasks. Instructional Science, 42, 1–20.
(1992). Dynamic decision making: Human control of complex systems. Acta Psychologica, 81, 211–241.
(1995). Basic topics and approaches to the study of complex problem solving. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European perspective (pp. 27–63). Hillsdale, NJ: Erlbaum.
(1993). Finite-state automata: Dynamic task environments in problem-solving research. The Quarterly Journal of Experimental Psychology, 46, 83–118.
(2008). Working memory, visual-spatial-intelligence and their relationship to problem-solving. Intelligence, 36, 672–680.
(1983). Lohhausen. Vom Umgang mit Unbestimmtheit und Komplexität [Lohhausen. On dealing with uncertainty and complexity]. Bern, Switzerland: Huber.
(2003). Separating trait effects from trait-specific method effects in multitrait-multimethod models: A multiple-indicator CT-C(M-1) model. Psychological Methods, 8, 38–60.
(2012). The process of solving complex problems. Journal of Problem Solving, 4, 19–42.
(2001). Dynamic systems as tools for analysing human judgement. Thinking & Reasoning, 7, 69–89.
(2010). Complex problem solving: A case for complex cognition? Cognitive Processing, 11, 133–142.
(2005). The use of microworlds to study dynamic decision making. Computers in Human Behavior, 21, 273–286.
(2013). A multitrait-multimethod study of assessment instruments for complex problem solving. Intelligence, 41(5), 579–596.
(2009). Measuring complex problem solving: The MicroDYN approach. In F. Scheuermann & J. Björnsson (Eds.), The transition to computer-based assessment: New approaches to skills assessment and implications for large-scale testing (pp. 157–163). Luxembourg: Office for Official Publications of the European Communities.
(2014). Assessment with microworlds using MicroDYN: Measurement invariance and latent mean comparisons: Psychometric properties across several student samples and blue-collar workers. European Journal of Psychological Assessment. Advance online publication. doi:10.1027/1015-5759/a000194
(2012). Dynamic problem solving: A new assessment perspective. Applied Psychological Measurement, 36, 189–213.
(2013). Complex problem solving in educational contexts – something beyond g: Concept, assessment, measurement invariance, and construct validity. Journal of Educational Psychology, 105, 364–379.
(2012). Assessment and teaching of 21st century skills. New York, NY: Springer.
(2000). Kognitiver Fähigkeitstest für 4. bis 12. Klassen, Revision [Cognitive Abilities Test (CogAT; Thorndike, L. & Hagen, E., 1954–1986) – German adapted version]. Göttingen, Germany: Beltz-Test.
(1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6, 1–55.
(2011). Mining students’ inquiry actions for understanding of complex systems. Computers & Education, 56, 556–573.
(2002). To parcel or not to parcel: Exploring the question, weighing the merits. Structural Equation Modeling: A Multidisciplinary Journal, 9, 151–173.
(2011). Investigations of a complex, realistic task: Intentional, unsystematic, and exhaustive experimenters. Journal of Research in Science Teaching, 48, 745–770.
(1997). Robust inference using weighted least squares and quadratic estimating equations in latent variable modeling with categorical and continuous outcomes. Unpublished technical report. Retrieved from http://pages.gseis.ucla.edu/faculty/muthen/articles/Article_075.pdf
(2012). Mplus user’s guide (7th ed.). Los Angeles, CA: Muthén & Muthén.
(2005). Problem solving. In K. J. Holyoak & R. G. Morrison (Eds.), The Cambridge handbook of thinking and reasoning (pp. 321–349). New York, NY: Cambridge University Press.
(1966). The axioms and principal results of classical test theory. Journal of Mathematical Psychology, 3, 1–18.
(2013). PISA 2012 assessment and analytical framework. Paris, France: Organisation for Economic Co-operation and Development.
(2014). PISA 2012 results: Creative problem solving (Volume V). Paris, France: OECD Publishing.
(2010). Controlling uncertainty: A review of human behavior in complex dynamic environments. Psychological Bulletin, 136, 65–86.
(1998). Manual for Raven’s progressive matrices and vocabulary scales. Oxford, UK: Oxford Psychologists Press.
(2008). Automata, computability and complexity: Theory and applications. Upper Saddle River, NJ: Prentice Hall.
(2008). Strategieeinsatz, erzeugte Information und Informationsnutzung bei der Exploration und Steuerung komplexer dynamischer Systeme [Use of strategy, generated information and use of information when exploring and controlling complex dynamic systems]. Berlin, Germany: Lit.
(2012). Factors of problem-solving competency in a virtual chemistry environment: The role of metacognitive knowledge about strategies. Computers & Education, 59, 1199–1214.
(2013). Validity of the MicroDYN approach: Complex problem solving predicts school grades beyond working memory capacity. Learning and Individual Differences, 24, 42–52.
(2012). Unlösbare Items im KFT 4–12+R [Unsolvable items in the KFT 4–12+R]. Diagnostica, 58, 45–50.
(2012). The Genetics Lab: Acceptance and psychometric characteristics of a computer-based microworld assessing complex problem solving. Psychological Test and Assessment Modeling, 54, 54–72.
(2013). Students’ complex problem-solving abilities: Their structure and relations to reasoning ability and educational success. Intelligence, 41, 289–305.
(1991). Complex problem solving: Principles and mechanisms. Hillsdale, NJ: Erlbaum.
(1980). Sensible reasoning: A hypothesis about hypotheses. Child Development, 51, 1–10.
(1996). The impact of goal specificity on strategy use and the acquisition of problem structure. Cognitive Science, 20, 75–100.
(1999). Investigating the paths between working memory, intelligence, knowledge, and complex problem-solving performances via Brunswik symmetry. In P. L. Ackerman, P. C. Kyllonen, & R. D. Roberts (Eds.), Learning and individual differences: Process, trait, and content determinants (pp. 77–108). Washington, DC: American Psychological Association.
(2012). Complex problem solving – More than reasoning? Intelligence, 40, 1–14.
(2005). Cronbach’s α, Revelle’s β, and McDonald’s ωH: Their relations with each other and two alternative conceptualizations of reliability. Psychometrika, 70, 123–133.