Abstract
Critical thinking is a broad construct encompassing core elements such as reasoning, evaluation, and metacognition that educational systems aim to impart to students. The integration of such skills into models of student success is increasing internationally. The Cornell Critical Thinking Test (CCTT) is a widely used instrument for assessing critical thinking skills; however, limited validity evidence exists for its translated versions to support inferences based on CCTT scores. This study examined the Turkish version of the CCTT. Specifically, translated items were examined for measurement equivalence by determining whether items functioned differently across students from the United States and Turkey. Differential item functioning (DIF) analysis via logistic regression was employed. Results showed that each subtest contained DIF items and that 10% of the items in the instrument exhibited DIF. These items did not influence mean differences between students in the two countries. A critical content review of the translated items offered insight into why particular items may function differently.
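The logistic-regression DIF procedure named in the abstract can be sketched as follows. For each item, a model predicting the item response from total score alone is compared against a model that adds group membership and its interaction with total score; a likelihood-ratio test on the two added terms flags DIF. The data, effect sizes, and function names below are illustrative assumptions for a minimal demonstration, not values or code from the study.

```python
import numpy as np

def fit_logit(X, y, iters=30):
    """Fit a logistic regression by Newton-Raphson; return (beta, log-likelihood)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)                      # IRLS weights
        H = X.T @ (X * W[:, None])             # observed information matrix
        beta = beta + np.linalg.solve(H + 1e-8 * np.eye(X.shape[1]),
                                      X.T @ (y - p))
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    ll = np.sum(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    return beta, ll

def dif_lr_chi2(item, total, group):
    """Likelihood-ratio chi-square (df = 2) comparing the ability-only model
    with a model adding group and group-by-ability terms (uniform +
    non-uniform DIF)."""
    ones = np.ones_like(total)
    X1 = np.column_stack([ones, total])
    X2 = np.column_stack([ones, total, group, total * group])
    _, ll1 = fit_logit(X1, item)
    _, ll2 = fit_logit(X2, item)
    return 2.0 * (ll2 - ll1)

# Illustrative simulation: one fair item and one item with uniform DIF.
rng = np.random.default_rng(0)
n = 1000
group = np.concatenate([np.zeros(n), np.ones(n)])   # 0 = reference, 1 = focal
theta = rng.normal(0.0, 1.0, 2 * n)                 # stand-in for total score
y_fair = rng.binomial(1, 1.0 / (1.0 + np.exp(-theta)))
y_dif = rng.binomial(1, 1.0 / (1.0 + np.exp(-(theta + 1.2 * group))))

chi2_fair = dif_lr_chi2(y_fair, theta, group)
chi2_dif = dif_lr_chi2(y_dif, theta, group)
print(f"fair item chi2 = {chi2_fair:.2f}, DIF item chi2 = {chi2_dif:.2f}")
```

In applied use, the chi-square would be referred to a df = 2 critical value and typically paired with an effect-size criterion (e.g. change in Nagelkerke R-squared) before flagging an item, since the statistical test alone is sensitive to sample size.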