Editorial

Introducing New Open Science Practices at EJPA

Published Online: https://doi.org/10.1027/1015-5759/a000628

What Is Open Science and Why Is It Important?

The Open Science movement aims to make scientific research open to all levels of society (Woelfle et al., 2011). It involves practicing science in a manner that allows others to collaborate and contribute and that makes research data and materials freely available under terms that enable reuse, redistribution, and reproduction of the research and its underlying data and methods (Bezjak et al., 2018). While the movement is by no means new (Hesse, 2018), it was reignited by several (partly unsuccessful) attempts to replicate and reproduce research findings in psychological and behavioral science (e.g., Doyen et al., 2012; Hagger et al., 2016; Open Science Collaboration, 2012, 2015). In response to what is sometimes called the “replication crisis,” researchers were urged to exercise greater transparency in how they conduct their research and how they report and disseminate their results, in order to improve the reliability, efficiency, and credibility of scientific research (Chambers et al., 2014; Ioannidis, 2014; Munafò et al., 2017; Nosek et al., 2015). The successful adoption of such open science practices is, however, dependent on a coordinated effort from multiple stakeholders, including journals, funders, institutions, and researchers alike (Munafò et al., 2017). As such, we will be introducing new measures to encourage open, transparent, and reproducible science at EJPA.

Open Science at EJPA: Introducing Two New Initiatives

As one of the leading outlets in the field of psychological assessment, EJPA’s goal is to publish cutting-edge research reporting on the construction of new measures or advancements on existing measures in all domains of psychology. We are cognizant of the fact that our field is not immune to the threats posed by the replication crisis and, more importantly, that we have a responsibility to address it as best we can. In 2017, EJPA took the first steps toward increasing openness, transparency, and accountability by making the submission of the inputs and outputs of analyses from statistical software packages mandatory (at the latest upon acceptance) for all submission types (Greiff, 2017). A year later, in 2018, EJPA introduced the registered report (RR) format (see Greiff & Allen, 2018, for a discussion of the benefits of RRs).1 Now, as part of our ongoing process of improving the quality of science in an open and transparent way, we are excited to announce two new open science initiatives: (1) the adoption of Transparency and Openness Promotion (TOP) standards and (2) the awarding of badges for open practices. These practices take effect immediately and apply to all current and future manuscripts submitted to EJPA.

Transparency and Openness Promotion Guidelines at EJPA

The TOP Guidelines are a set of eight modular standards that aim to move scientific communication toward greater levels of transparency (Nosek et al., 2015). Journals can choose which of the eight transparency standards they wish to implement and also select a level of implementation for each standard (i.e., disclose, require, or verify). EJPA has, together with more than 5,000 organizations and journals across disciplines, become a TOP signatory (see https://www.cos.io/our-services/top-guidelines for a list of signatories) to express our support for the principles of openness, transparency, and reproducibility (see Table 1). In addition, we have obtained a TOP Factor (see https://www.topfactor.org/ for the list of TOP scores), which provides a metric of the steps we are taking as a journal to implement open science practices. We not only strongly encourage authors to adopt open science practices, but have also implemented the TOP Guidelines in a manner that allows authors some degree of freedom in how they embrace and incorporate these guidelines. Regardless of how authors choose to implement these standards, we require them to be transparent about their choices.

Table 1 Summary of EJPA’s TOP Requirements

Table 1 provides an overview of the TOP standards and the corresponding level adopted by EJPA. EJPA requires disclosure (Level 1) for the data transparency (TOP#2), analytic methods transparency (TOP#3), research materials transparency (TOP#4), study preregistration (TOP#6), and analysis plan preregistration (TOP#7) standards. Regarding the citation standard (TOP#1) and the design and analysis standard (TOP#5), authors will, however, be obliged (Level 2) to cite existing data and materials (if used) and to report specific information regarding the design and analysis of their study. Lastly, EJPA encourages replications and offers Registered Reports as a regular submission option, thereby representing Level 3 on the replication standard (TOP#8).

What Does the Implementation of TOP Guidelines Mean for Authors in Practice?

The implementation of the TOP Guidelines predominantly requires authors to disclose or provide mandatory information pertaining to the various standards in their manuscript. To assist authors with the implementation of the TOP standards, we have included a set of questions related to each TOP standard as part of the submission process. These questions will serve as a checklist for authors and aid compliance with the various standards. In addition, to reduce the administrative load associated with the implementation of the standards, authors will only be required to complete these questions when submitting their revised manuscript after review. More information and a detailed submission checklist can be found in the author guidelines on EJPA’s webpage.

New Open Data and Open Materials Badges and Changes to Our Electronic Supplementary Material

Together with the implementation of the TOP Guidelines, we have implemented Open Data and Open Materials badges to recognize and reward authors for the adoption of open practices. The Open Data (OD) badge is earned by making publicly available the digitally shareable raw (and processed) data (including a codebook) needed to reproduce the reported results (see the practical editorial by Horstmann et al., 2020, in this issue on how to produce reproducible data sets and codebooks). The Open Materials (OM) badge is earned by making publicly available the components of the research methodology needed to reproduce the reported procedure and analysis (e.g., questionnaires; the scripts, code, and outputs associated with the statistical analyses; any additional analyses or explanations). Apart from awarding badges, EJPA further supports open practices by now requiring all other electronic supplementary files (e.g., supplementary tables and figures) to be uploaded to a freely accessible public repository. EJPA will thus no longer publish such files online as Electronic Supplementary Material (ESM) alongside the article; instead, all supplemental information should be uploaded to a freely accessible public repository.
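Horstmann et al. (2020) discuss codebooks in depth. To illustrate the idea, here is a minimal, hypothetical sketch in Python of how a basic codebook could be generated from a dataset; the `make_codebook` helper, the variable names, and the labels are illustrative assumptions, not an EJPA requirement or a prescribed format:

```python
import csv

def make_codebook(rows, labels):
    """Build a simple codebook: one entry per variable, listing its label,
    the number of missing values, and the observed minimum and maximum."""
    variables = rows[0].keys()
    codebook = []
    for var in variables:
        values = [r[var] for r in rows]
        present = [v for v in values if v is not None]
        codebook.append({
            "variable": var,
            "label": labels.get(var, ""),
            "n_missing": len(values) - len(present),
            "min": min(present) if present else None,
            "max": max(present) if present else None,
        })
    return codebook

# Hypothetical questionnaire data: two variables, three respondents
data = [
    {"age": 25, "item_1": 4},
    {"age": 31, "item_1": 5},
    {"age": None, "item_1": 3},
]
labels = {"age": "Age in years", "item_1": "Item 1 (1 = never ... 5 = always)"}
codebook = make_codebook(data, labels)

# Write the codebook as CSV, to be deposited alongside the raw data
with open("codebook.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=codebook[0].keys())
    writer.writeheader()
    writer.writerows(codebook)
```

A codebook of this kind, deposited together with the raw data, helps an independent researcher interpret every variable without access to the original study team.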

What Do the New Open Data, Open Materials, and Electronic Supplementary Materials Mean for Authors in Practice?

Although EJPA strongly encourages the sharing of data and materials, authors ultimately decide whether and to what extent they want to share data and/or materials. Authors who decide to share data, materials, or other supplemental information are required to upload all Open Data, Open Materials, and other electronic supplementary files to a freely accessible public repository and cite them correctly (TOP#1). A preferred repository for EJPA is PsychArchives (run by a public open science institute for psychology; https://www.psycharchives.org/), but other repositories are, of course, also acceptable as long as DOIs or accession numbers are provided and the data are at least as open as CC BY (for a list of repositories see http://www.re3data.org/). Authors are encouraged to provide OD, OM, and other electronic supplementary materials at the point of submission. However, to ease the administrative burden, authors will be allowed to submit the data and/or materials after the first review. These authors must, at the very latest, submit the information when the manuscript is accepted for publication. Authors who currently have manuscripts under review are also eligible to apply for an OD/OM badge. Furthermore, since EJPA has opted for a disclosure badging system, authors are required to include a public statement confirming that there is sufficient information for an independent researcher to reproduce all of the reported results (i.e., OD) and/or all of the reported methodological procedures and analyses (i.e., OM). The Editorial Office will perform cursory checks to ensure the provided links are working, but authors will ultimately be accountable to the community for disclosure accuracy and are fully responsible for the content of the Open Data and Open Materials.
Here, it is important to emphasize that EJPA will only award an OD and/or OM badge if the complete set of data and/or materials needed to reproduce the reported results and/or procedure and analysis is made publicly available. For example, if authors of an empirical paper upload only additional tables and figures to a free, public repository (but omit the syntax and outputs used for the analyses), this would be classified as other electronic supplementary materials, as the materials needed to reproduce the reported procedure and analysis are not available in the repository.

Even though compliance with EJPA’s TOP standards will henceforth be mandatory, we realize that there may be instances where authors are unable to make the full dataset or all relevant and core materials publicly available. In these situations, we encourage authors to consider whether it might be possible to make at least parts of the data and/or materials publicly available as other electronic supplementary materials. Importantly, if authors are unable to share data and/or materials in a public repository, they nevertheless need to include a statement to this effect (as per TOP#2, #3, and #4) and, preferably, explain why the data and/or materials cannot be made publicly available. In sum, while EJPA will always require authors to disclose whether (or not) they make data and/or materials publicly available, authors will not be penalized for not providing OD, OM, or other electronic supplementary materials, and this will not affect the peer review process.

To make the OD and OM badge application process easier for authors, we have included a set of questions related to OD and OM badges as part of the submission process. Although early submission of data, materials, and other electronic supplementary material is encouraged, authors can apply for a badge up until the point of acceptance. More information and a detailed checklist can be found in the author guidelines on EJPA’s webpage.

The Next Steps for Open Science at EJPA

Toward the end of 2018, EJPA was the first assessment journal to introduce the RR format, and we recently published our first RR (Kember & Williams, 2020). Now, two years later, we join other top scientific journals in adopting TOP standards and awarding OD and OM badges. Once again, EJPA is spearheading these developments in the field of psychological assessment, and with them the editorial team joins the global movement to increase openness and transparency in science. It is important to note that while other fields of psychology have been more openly criticized for a lack of openness and transparency, these issues do not stop at the doorstep of psychological assessment. Because assessment is part of any empirical effort in psychology (for a more thorough discussion see, for example, Flake & Fried, 2019), they are prevalent and highly relevant for authors and readers of EJPA as well as for the entire community of psychological assessment. With this in mind, we expect the efforts we are now taking to lead, in the long run, to a higher overall quality of scientific knowledge in general and of psychological assessment in particular. The implementation of these practices is thus part of our continuing strategy to increase transparency and accountability at EJPA and in the broader field of psychological assessment. In the near future, we also plan to implement and award badges for preregistration. As stakeholders of science, we trust that you will embrace these open science practices with us to ensure the reliability, efficiency, and credibility of psychological assessment science and the scientific process itself.

1Keep an eye out for our upcoming special issue, Advancing the Reproducibility of Psychological Assessment Across Borders and Populations, which is dedicated to RRs aimed at developing novel tools and adapting/testing existing measures in novel settings.

References

  • Bezjak, S., Clyburne-Sherin, A., Conzett, P., Fernandes, P., Görögh, E., Helbig, K., Kramer, B., Labastida, I., Niemeyer, K., Psomopoulos, F., Ross-Hellauer, T., Schneider, R., Tennant, J., Verbakel, E., Brinken, H., & Heller, L. (2018). Open science training handbook. Zenodo. https://doi.org/10.5281/ZENODO.1212496

  • Chambers, C. D., Muthukumaraswamy, S. D., & Etchells, P. J. (2014). Instead of “playing the game” it is time to change the rules: Registered Reports at AIMS Neuroscience and beyond. AIMS Neuroscience, 1(1), 4–17. https://doi.org/10.3934/Neuroscience.2014.1.4

  • Doyen, S., Klein, O., Pichon, C.-L., & Cleeremans, A. (2012). Behavioral priming: It’s all in the mind, but whose mind? PLoS One, 7(1), Article e29081. https://doi.org/10.1371/journal.pone.0029081

  • Flake, J. K., & Fried, E. I. (2019). Measurement schmeasurement: Questionable measurement practices and how to avoid them. PsyArXiv. https://doi.org/10.31234/osf.io/hs7wm

  • Greiff, S. (2017). The field of psychological assessment: Where it stands and where it’s going – A personal analysis of foci, gaps, and implications for EJPA. European Journal of Psychological Assessment, 33(1), 1–4. https://doi.org/10.1027/1015-5759/a000412

  • Greiff, S., & Allen, M. S. (2018). EJPA introduces registered reports as new submission format. European Journal of Psychological Assessment, 34(4), 217–219. https://doi.org/10.1027/1015-5759/a000492

  • Hagger, M. S., Chatzisarantis, N. L. D., Alberts, H., Anggono, C. O., Batailler, C., Birt, A. R., Brand, R., Brandt, M. J., Brewer, G., Bruyneel, S., Calvillo, D. P., Campbell, W. K., Cannon, P. R., Carlucci, M., Carruth, N. P., Cheung, T., Crowell, A., de Ridder, D. T. D., Dewitte, S., … Zwienenberg, M. (2016). A multilab preregistered replication of the ego-depletion effect. Perspectives on Psychological Science, 11(4), 546–573. https://doi.org/10.1177/1745691616652873

  • Hesse, B. W. (2018). Can psychology walk the walk of open science? The American Psychologist, 73(2), 126–137. https://doi.org/10.1037/amp0000197

  • Horstmann, K. T., Arslan, R. C., & Greiff, S. (2020). Generating codebooks to ensure the independent use of research data: Some guidelines. European Journal of Psychological Assessment, 36(5), 721–729. https://doi.org/10.1027/1015-5759/a000620

  • Ioannidis, J. P. A. (2014). How to make more published research true. PLoS Medicine, 11(10), Article e1001747. https://doi.org/10.1371/journal.pmed.1001747

  • Kember, S. M., & Williams, M. N. (2020). Autism in Aotearoa: Is the RAADS-14 a valid tool for a New Zealand population? European Journal of Psychological Assessment, 1–11. https://doi.org/10.1027/1015-5759/a000598

  • Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Percie du Sert, N., Simonsohn, U., Wagenmakers, E.-J., Ware, J. J., & Ioannidis, J. P. A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1(1), Article 21. https://doi.org/10.1038/s41562-016-0021

  • Nosek, B. A., Alter, G., Banks, G. C., Borsboom, D., Bowman, S. D., Breckler, S. J., Buck, S., Chambers, C. D., Chin, G., Christensen, G., Contestabile, M., Dafoe, A., Eich, E., Freese, J., Glennerster, R., Goroff, D., Green, D. P., Hesse, B., Humphreys, M., … Yarkoni, T. (2015). Promoting an open research culture: Author guidelines for journals could help to promote transparency, openness, and reproducibility. Science, 348(6242), 1422–1425. https://doi.org/10.1126/science.aab2374

  • Open Science Collaboration. (2012). An open, large-scale, collaborative effort to estimate the reproducibility of psychological science. Perspectives on Psychological Science, 7(6), 657–660. https://doi.org/10.1177/1745691612462588

  • Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716. https://doi.org/10.1126/science.aac4716

  • Woelfle, M., Olliaro, P., & Todd, M. H. (2011). Open science is a research accelerator. Nature Chemistry, 3(10), 745–748. https://doi.org/10.1038/nchem.1149

Samuel Greiff, Institute of Cognitive Science and Assessment (COSA), University of Luxembourg, 2, avenue de l’Université, 4365 Esch sur Alzette, Luxembourg