Original Article

Convergent Evidence for the Validity of a Performance-Based ICT Skills Test

Published Online: https://doi.org/10.1027/1015-5759/a000507

Abstract. The goal of this study was to investigate sources of convergent validity evidence supporting the construct interpretation of scores on a simulation-based information and communication technology (ICT) skills test. The construct definition treats ICT skills as relying on ICT-specific knowledge as well as comprehension and problem-solving skills. On this basis, a validity argument comprising three claims was formulated and tested. (1) In line with the classical nomothetic span approach, all three predictor variables positively explained task success across all ICT skills items. Because ICT tasks can vary in the extent to which they require construct-related knowledge and skills, and in how the related items are designed and implemented, the effects of the construct-related predictor variables were expected to vary across items. (2) A task-based analysis approach revealed that the item-level effects of the three predictor variables were in line with the targeted construct interpretation for most items. (3) Finally, item characteristics significantly explained the random effect of problem-solving skills, but not that of comprehension skills. Taken together, the results generally support the validity of the construct interpretation.
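The three claims map onto an explanatory item response modeling strategy: fixed effects of the person-level predictors on task success, random slopes for those predictors across items, and item characteristics as cross-level moderators of the random slopes. The sketch below shows how such models could be set up in R with the lme4 package. It is a minimal illustration only; the data frame dat and its columns (correct, knowledge, comprehension, problem_solving, item_feature, person, item) are hypothetical placeholders, not the authors' actual data or variable names.

# Minimal sketch, not the authors' analysis code. Assumes a long-format
# data frame 'dat' with one row per person-item response:
#   correct                                   - 0/1 task success
#   knowledge, comprehension, problem_solving - person-level predictor scores
#   item_feature                              - a hypothetical item characteristic
#   person, item                              - identifiers
library(lme4)

# Claim 1: all three predictors explain task success across items
# (fixed effects with crossed random intercepts for persons and items)
m1 <- glmer(correct ~ knowledge + comprehension + problem_solving +
              (1 | person) + (1 | item),
            data = dat, family = binomial)

# Claim 2: predictor effects vary across items (random slopes over items)
m2 <- glmer(correct ~ knowledge + comprehension + problem_solving +
              (1 | person) +
              (1 + comprehension + problem_solving | item),
            data = dat, family = binomial)

# Claim 3: an item characteristic moderates the item-specific slopes
# (cross-level interactions explaining the random effects)
m3 <- glmer(correct ~ knowledge +
              comprehension * item_feature +
              problem_solving * item_feature +
              (1 | person) +
              (1 + comprehension + problem_solving | item),
            data = dat, family = binomial)

# Likelihood-ratio comparisons of the nested models
anova(m1, m2, m3)

Whether random slopes for all predictors are estimable depends on the data; the stepwise model comparison simply mirrors the logic of the three claims rather than reproducing the authors' exact specification.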
