Original Articles

Computer Adaptive-Attribute Testing

A New Approach to Cognitive Diagnostic Assessment

Published Online: https://doi.org/10.1027/0044-3409.216.1.29

The influence of interdisciplinary forces stemming from developments in cognitive science, mathematical statistics, educational psychology, and computing science is beginning to appear in educational and psychological assessment. Computer adaptive-attribute testing (CA-AT) is one example. The concepts and procedures in CA-AT lie at the intersection of computer adaptive testing and cognitive diagnostic assessment. CA-AT allows us to fuse the administrative benefits of computer adaptive testing with the psychological benefits of cognitive diagnostic assessment, producing an innovative, psychologically based adaptive testing approach. We describe the concepts behind CA-AT and illustrate how it can be used to promote formative, computer-based classroom assessment.
