Editorial

Noncognitive Assessment in K12 Education

New Constructs and Approaches for the Twenty-First Century

Published Online: https://doi.org/10.1027/1015-5759/a000328

Perhaps because we have been working for so long in this domain, and see personal value in acknowledging it, none of us involved in the realization of this special issue would claim to be among the highest echelons of the cognitive elite. However, we suspect that despite our generally modest range of cognitive abilities, we – like so many of our reflective, academic colleagues, many of whom contributed to this special issue – have a range of personal attributes, skills, and competencies that open up opportunities, allow us to hunker down, and ultimately help us finish the tasks we have started in a thoughtful, collaborative, and congenial manner. These qualities are complex blends of several factors with wonderful market-facing labels: grit, growth mindset, resilience, teamwork, and a plethora of other personal qualities not yet widely discussed in education, but which psychological assessment has offered, both in advance and in abundance. These qualities are the motivation for this special issue. And while we suspect we will not have all the answers, the growing value of these qualities to the educational landscape explains why we were fortunate to be asked by Matthias Ziegler to edit this special issue.

Preamble: A Wider Context for Noncognitive Assessment in K12 Education

Realizing such an inherently complex undertaking required careful deliberations, a keen attention to detail, and a clutch of scientists to offer new directions in measurement, methodologies, and practical scope. This special issue represents a call for evidence-centered accounts of the interface among personality assessment, educational applications, and policy-making that are based on decent sample sizes, best practices, and emerging (or sometimes re-emergent) theory. We delimited this further by focusing on K12 education (i.e., school-aged children and adolescents, ranging from kindergarten to the last year of high school), across theory, applications, and policy. Perhaps, we hear you say, we are making a pitch: Another set of editors might consider doing this for tertiary education (i.e., community and technical colleges and universities across the globe). It can’t hurt!

Exposition: The Call for Manuscripts

In keeping with developments and increasing interest in this field, these were the broad topics we asked contributors to address:

1. New approaches to noncognitive skills assessment. Our intention here was twofold. First, did any of our community have evidence for a new construct (or set of constructs) that might be of use in the educational context? Conceivably all such constructs are encapsulated by the Big Five and its various facets (see e.g., John & De Fruyt, 2015; MacCann & Roberts, 2010; Roberts, Martin, & Olaru, 2015), but this is not something our community should accept blindly. Second, were there new methodologies, beyond self- and peer-reports, that might offer new insights into noncognitive skills? On this score we held much higher expectations, believing that approaches such as anchoring vignettes and situational judgment tests hold much promise, though the literature on these in K12 education is still relatively sparse (Lipnevich, MacCann, & Roberts, 2013).

2. Validity evidence supporting noncognitive measurement efforts. There has been a recent backlash against noncognitive assessments in both the academic literature and the popular press. For example, a recent New York Times article directly quotes one prominent expert in the field as saying, “all measures suck, and they all suck in their own way” (see http://www.nytimes.com/2016/03/01/us/testing-for-joy-and-grit-schools-nationwide-push-to-measure-students-emotional-skills.html). We believe this is partly because due diligence has not been applied to the validation process, as laid out in the Standards for Educational and Psychological Testing (AERA/APA/NCME, 2014). We were not going to be party to a similar discordant note, and hence made validity evidence a major requirement for the contributors to consider. To this end, we asked the community to submit studies that conformed to the Standards, though we were especially interested in those designed to assess test-criterion relationships between noncognitive skill assessments and valued outcomes (e.g., academic, quality of life, or health indicators), and in those establishing the cross-cultural comparability of noncognitive skills assessments. We also made it clear we were interested in studies that explored validity threats to noncognitive assessment.

3. Data showing the development of – or changes in – noncognitive skills, as measured by validated assessments. Historically, careful attention has not been given to how noncognitive skills develop over time. Fortunately, this has changed following the publication of a landmark set of studies by Professor Brent Roberts and colleagues (e.g., Roberts & DelVecchio, 2000; Roberts, Walton, & Viechtbauer, 2006). This part of our call represented a quasi-experiment with principled intent: To what extent did the readership of the European Journal of Psychological Assessment have these kinds of data, or consider this important topic in their programs of research?

4. Interventions to improve noncognitive skills, provided these were also evaluated by validated assessments. Over the past two decades, education has witnessed the emergence of a variety of programs designed to improve noncognitive skills in K12 education, most often under the label of social and emotional learning (SEL) programs. The preceding point – that personality is subject to change – also implies that there may be systematic ways of influencing noncognitive skills, most likely at the facet level (MacCann, Duckworth, & Roberts, 2009). While meta-analytic evidence shows the value of SEL programs for a range of valued outcomes (e.g., Durlak, Weissberg, Dymnicki, Taylor, & Schellinger, 2011; Durlak, Weissberg, & Pachan, 2010), these programs still often rely excessively on self-reported assessments, which are subject to a variety of biases and validity threats (e.g., Duckworth & Yeager, 2015). Accordingly, we sought papers whose evaluation studies included assessments designed to circumvent these problems through advanced methods or statistical approaches.

Coda: Introduction to Papers Comprising the Special Issue

In Table 1 below, we provide key features of accepted articles that went through a rigorous review process (many thanks also to readers of the journal who accepted invitations to take on this important activity). In this table, we provide identifying information for each article (i.e., authors and title), some comments on the sample studied, and both predictors and outcomes examined in each respective study.

Table 1 Summary of studies included in this special issue of the European Journal of Psychological Assessment

Some comments follow on how well we were able to balance “answering the call” with the papers actually submitted. Because this is an assessment journal, all of the contributors were highly engaged in the process of establishing validity evidence for assessment scores, and a significant number explored new approaches beyond typical self-ratings (see especially articles 2–4 in Table 1). While we received no manuscripts that looked at the topic of interventions, we note that the act of taking a test is itself a form of intervention, and would encourage the reader to consider the implications of turning the approaches in several of the papers into formative (rather than summative) assessments. Less clear is why our call attracted no manuscripts tackling the prospect of personality change (though the last article in this special issue examines how personality predicts changes in grade point average). Perhaps we remain too wedded as experts to the idea that “personality is set like plaster” (James, 1890/1981). Clearly, the evidence debunking this once-established adage is now overwhelming (Lipnevich, Preckel, & Roberts, 2016; Roberts et al., 2006; Walton & Billera, 2016). Conceivably this too would make an excellent topic for a future special issue of the European Journal of Psychological Assessment!

Despite not quite meeting the lofty goals we set in our call, there is much to recommend in this special issue, testifying to the evident liveliness of the field. Contributors span four continents (Australia, Europe, North and South America); represent at least four disciplines (economics, education, psychology, and medicine); and if you care to explore ResearchGate or Google Scholar, you will find several luminaries whose h-index exceeds their chronological age (try it; this is no mean feat). As for the studies themselves, findings are based on a cumulative sample size of 159,937 students representing grades 5 through 12, with no fewer than 14 countries (Belgium, Brazil, Chile, Croatia, Germany, Hong Kong, Hungary, Italy, Korea, Luxembourg, Macao, Mexico, Portugal, and the USA) referenced throughout. And while the range of predictors is quite large, there seems to be a growing realization that the Big Five represents a meaningful organizing framework for what was once a disjointed cacophony of constructs and measures. Equally, the outcome space appears rich, extending beyond the well-trodden (i.e., grades and test scores) to include graduation rates, sense of belonging, and counterproductive school behaviors.

We reserve our closing salvo for a brief word from, or perhaps more correctly for, our sponsors: policymakers and the students whose lives this research affects. Our understanding of how noncognitive skills influence academic outcomes allows educators to identify students who are more, or in some cases less, likely to do well in specific academic programs. Further, knowledge of the relations between a range of noncognitive skills and valued outcomes can be used to develop meaningful assessment systems, suggest effective intervention strategies, and ultimately shape educational policy. Working hand in glove, interventions and policy can be used to enhance students’ noncognitive skills and, consequently, their achievement, ability to graduate, and satisfaction with life. These interventions may also help to prepare students for meeting the demands of the future workforce and assist them in navigating an ever-changing global economy (Burrus, Naemi, Mattern, & Roberts, in press). It is in service of these broad aims, and to this constituency to which we have all belonged, that the current special issue of the European Journal of Psychological Assessment is devoted. We trust that the reader finds the articles we have selected provocative, informative, and engaging. And if they stimulate further research – exploring noncognitive factors in education with still newer methodologies and better designs across still more countries – the special issue will have served its purposes especially well. Enjoy!

The views expressed in this editorial are the authors’ and do not reflect the official opinions or policies of any of the authors’ host affiliations. We would like to thank Caspian Aicher-Roberts and Gabriel Olaru for their assistance in the preparation of this manuscript. Kudos also go out to Professor Matthias Ziegler, who provided us with the opportunity to compile this special issue on a topic that is near and dear to our hearts and minds.

References

  • American Educational Research Association, American Psychological Association, & National Council on Measurement in Education (2014). The standards for educational and psychological testing. Washington, DC: AERA/APA/NCME.

  • Burrus, J., Naemi, B., Mattern, K. & Roberts, R. D. (in press). Building better students: Preparation for life into the workforce. New York, NY: Oxford University Press.

  • Duckworth, A. L. & Yeager, D. S. (2015). Measurement matters: Assessing personal qualities other than cognitive ability for educational purposes. Educational Researcher, 44, 237–251.

  • Durlak, J. A., Weissberg, R. P., Dymnicki, A. B., Taylor, R. D. & Schellinger, K. B. (2011). Enhancing students’ social and emotional development promotes success in school: Results of a meta-analysis. Child Development, 82, 405–432.

  • Durlak, J. A., Weissberg, R. P. & Pachan, M. (2010). A meta-analysis of after-school programs that seek to promote personal and social skills in children and adolescents. American Journal of Community Psychology, 45, 294–309.

  • James, W. (1890/1981). The principles of psychology (F. Burkhardt, Ed.; 2 Vols.). Cambridge, MA: Harvard University Press.

  • John, O. P. & De Fruyt, F. (2015). Framework for the Longitudinal Study of Social and Emotional Skills in Cities. Retrieved from http://www.oecd.org/officialdocuments/publicdisplaydocumentpdf/?cote=EDU/CERI/CD(2015)13&docLanguage=En

  • Lipnevich, A. A., MacCann, C. & Roberts, R. D. (2013). Assessing noncognitive constructs in education: A review of traditional and innovative approaches. In D. H. Saklofske, C. B. Reynolds & V. L. Schwean (Eds.), Oxford handbook of child psychological assessment (pp. 750–772). New York, NY: Oxford University Press.

  • Lipnevich, A. A., Preckel, F. & Roberts, R. D. (2016). Psychosocial constructs: Knowns, unknowns, and future directions. In A. A. Lipnevich, F. Preckel & R. D. Roberts (Eds.), Psychosocial skills and school systems in the 21st Century: Theory, research, and practice (pp. XX–YY). New York, NY: Springer.

  • MacCann, C., Duckworth, A. L. & Roberts, R. D. (2009). Empirical identification of the major facets of conscientiousness. Learning and Individual Differences, 19, 451–458.

  • MacCann, C. & Roberts, R. D. (2010). Prediction of academic outcomes from time management, grit, and self-control: The pervasive influence of conscientiousness. In R. E. Hicks (Ed.), Personality and individual differences: Current directions (pp. 79–90). Brisbane, Queensland: Australian Academic Press.

  • Roberts, B. W. & DelVecchio, W. F. (2000). The rank-order consistency of personality traits from childhood to old age: A quantitative review of longitudinal studies. Psychological Bulletin, 126, 3–25.

  • Roberts, B. W., Walton, K. E. & Viechtbauer, W. (2006). Patterns of mean-level change in personality traits across the life course: A meta-analysis of longitudinal studies. Psychological Bulletin, 132, 1–25.

  • Roberts, R. D., Martin, J. & Olaru, G. (2015). A Rosetta Stone for noncognitive skills: Understanding, assessing, and enhancing noncognitive skills in primary and secondary education. New York, NY: Asia Society and ProExam.

  • Walton, K. E. & Billera, K. A. (2016). Personality development during the school-aged years: Implications for theory, research and practice. In A. A. Lipnevich, F. Preckel & R. D. Roberts (Eds.), Psychosocial skills and school systems in the 21st Century: Theory, research, and practice. New York, NY: Springer.

Kevin Petway, Educational Testing Service, 660 Rosedale Road, MS 18-E, Princeton, NJ 08541, USA
Veleka Allen, Otsuka Pharmaceuticals, 508 Carnegie Center Drive, Princeton, NJ 08540, USA
Richard D. Roberts, ProExam Center for Innovative Assessments, 475 Riverside Drive, Suite 600, New York, NY 10115, USA