Current Issues in Competence Modeling and Assessment

Published Online: https://doi.org/10.1027/0044-3409.216.2.61

The goals of education and qualification in modern industrial societies can no longer be described by a fixed set of specialized skills that are transferable from one generation to the next. Today, knowledge must be applicable to different, new, and complex situations and contexts. It is against this background that the concept of competence has attracted increased research attention. Competencies are conceptualized as complex ability constructs that are context-specific, trainable, and closely related to real life. The theoretical modeling of competencies, their assessment, and the use of assessment results in practice present new challenges for psychological and educational research. This article reviews current issues in competence modeling, outlining research questions and the current state of research, and identifying the need for more interdisciplinary research. Finally, a research program recently initiated by the German Research Foundation (DFG) to address these questions and demands is presented.
