Open Access Editorial

Open Science in Suicide Research Is Open for Business

Published Online: https://doi.org/10.1027/0227-5910/a000859

Suicide claims more than 700,000 lives globally every year (World Health Organization, 2021) and affects approximately 135 people per individual who dies by suicide (Cerel et al., 2019). Those affected by suicide – from people with lived experience to policy-makers – are depending on researchers to provide reliable evidence: a prerequisite of effective prevention and treatment. However, not all evidence is equal; studies with small sample sizes may produce spurious results (Carpenter & Law, 2021) and measures may be unable to capture suicidal thoughts and behaviors in a reliable and valid way (Millner et al., 2020), which can compromise the generalizability of findings.

The quality of the research methods used to generate evidence is the key to determining the credibility we afford it (Vazire et al., 2021). Although we have undoubtedly made progress over the years in our understanding of suicide, recent research does not appear to have built upon previous work to the extent it could have done – mostly because of major methodological limitations in suicide research and publication bias limiting insights into the full range of existing findings (Franklin et al., 2017; Pirkis, 2020).

To build on what has come before us, we need to be able to see what we are building on. Beyond unpublished null findings, there are many other reasons the evidence base is incomplete: journal word limits may preclude sufficiently detailed descriptions of methods and statistical analyses to enable replication; abandoned research questions and analysis plans may go unreported because they make for a messier story; or, after a long period of data collection, the original hypotheses and analysis plans may have become hazy or may have changed based on knowledge of the data.

How can we strengthen the foundations of our evidence base for the future and, in doing so, “future-proof” suicide research? We can take active steps to tackle the problematic research practices described earlier, which threaten the transparency (openness about the research process), reproducibility (obtaining the same results again using the same data), and replicability (obtaining similar results with identical methods in new studies) of research. Open science practices, including registration of hypotheses and analytic plans before data collection (preregistration) and sharing of analytic code and materials, can help to address these threats (Munafò et al., 2017).

Conversations about transparency, reproducibility, and replicability have just begun to blossom in clinical psychology and psychiatry research (Tackett et al., 2017, 2019), and have only recently begun to open up formally in suicide research (Carpenter & Law, 2021). Following a proposal by the International Association for Suicide Prevention (IASP) Early Career Group, Crisis recently adopted the Registered Reports (RRs) article format (Pirkis, 2020); Carpenter and Law (2021) published an introduction to open science for suicide researchers; and the authors of the current editorial presented a symposium on open science practices at the 2021 IASP World Congress.

In this editorial, we use examples from our own and others’ work to demonstrate the opportunities for future-proofing research by implementing open science practices, and we discuss some of the challenges and their potential solutions. We cover implementing open science practices in new, ongoing, and concluded studies, and discuss practices in order from “low” to “high” threshold to implement (based on Kathawalla et al., 2021). Space constraints preclude us from covering all open science practices, and there are undoubtedly more researchers using open science practices in suicide research than those whose work we are aware of and have included here. To highlight the open science work of as many researchers as possible, we have sometimes provided examples in Electronic Supplementary Material 1 (ESM 1) rather than in the text. We hope readers will help us add to these examples via our “living” reading list (https://osf.io/v6y3t/). Readers interested in a broad overview of open science practices are directed to the work of Carpenter and Law (2021), Kathawalla et al. (2021), and Tackett et al. (2019).

Implementing Open Science Practices Into New Studies

In this section, we describe open science practices that researchers can implement when starting new studies – an excellent time to introduce open science practices into the research workflow. When implementing an open science practice for the first time in a new study, it is helpful to weigh the learning curve of acquiring new skills inherent to the study itself (e.g., new analytic techniques) against the complexity of the open science practice to be implemented. Open science practices are skills that develop over time (Nosek et al., 2019), and we recommend introducing one practice per new project to gradually build up a full open science repertoire (Quintana, 2020a).

The first and last authors’ initial foray into open science was a full preregistration – a relatively advanced open science practice – that, combined with learning new statistical software and techniques, made for a challenging initiation into open science. However, there are also benefits to adopting multiple open science practices simultaneously, especially for new PhD students, such as the second author; using open science practices was simply how she was trained to conduct research, as opposed to them being “special” practices to adopt.

Preprints

A growing number of suicide researchers (e.g., Coppersmith et al., 2020; DelPozo-Banos et al., 2021; Kaurin, Wright, Hallquist et al., 2020; O’Connor et al., 2018) post a version of their manuscript online, prior to peer review, known as a preprint (commonly posted on PsyArXiv for psychology, SocArXiv for sociology, medRxiv for clinical articles, etc.). Preprints increase access to research beyond the barriers of publishers’ paywalls, and the successive upload of revised manuscript versions transparently shows the evolution of an article. Moreover, sharing work ahead of peer review on Twitter aids the development of professional networks (especially for early career researchers [ECRs]), facilitates collaboration, and encourages offers of support and advice from the broader scientific community. Preprints automatically receive a digital object identifier (DOI) and are therefore citable. This can be particularly useful for ECRs, who need to demonstrate output to progress committees and grant application panels, but could otherwise wait months or even years for papers to be published in a journal. Although most journals allow researchers to submit preprinted manuscripts, we advise researchers to check individual journals’ preprint policies on the journal website or via the SHERPA/RoMEO database (https://v2.sherpa.ac.uk/romeo/). For a comprehensive guide to preprinting, see Moshontz et al. (2021).

Sharing Study Materials and Code

Descriptions of the materials and analysis code provide insights into the inner workings of a study, but there are few substitutes for seeing the actual documents. Examples of materials that can be shared include questionnaires (Holman & Williams, 2019; Robinson & Wilson, 2020), interventions (Dobias et al., 2021), and analysis code (Kaurin, Wright, Dombrovski et al., 2020; Kothgassner et al., 2020). For example, researchers using experience sampling methods can make their experience sampling questionnaires open via the Experience Sampling Method Item Repository (Kirtley, Hiekkaranta et al., 2019; https://esmitemrepository.com). Beyond enabling others to gain a greater understanding of the research process, sharing study elements also enables reuse of materials and code (with permission and attribution), preventing resource wastage. It also means study materials or analytic approaches are citable. It may not be possible to share proprietary questionnaires or stimuli, but we recommend that researchers share whatever they can and use nonproprietary materials where possible. Sharing code – and data, which we discuss later – also facilitates rigorous peer review, by enabling reviewers to examine in detail or even reproduce analyses, as experienced by the first author during peer review of a recent paper (Kirtley, Hussey et al., 2021).
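A shareable analysis script need not be elaborate; what matters is that someone else can run it and obtain the same numbers. The sketch below is a minimal, hypothetical illustration in Python (all file, folder, and variable names are invented for this example, not taken from any of the studies cited above): it fixes random seeds, loads a deidentified dataset, runs a simple preregistered analysis, and writes the result to a file.

```python
# A minimal sketch of a shareable analysis script. All file, folder, and
# variable names below are hypothetical and for illustration only.
import os
import random

import numpy as np
import pandas as pd
from scipy import stats

# Fix seeds so any stochastic steps reproduce exactly on another machine.
random.seed(2021)
np.random.seed(2021)

# Load the deidentified dataset shared alongside the paper.
df = pd.read_csv("data/deidentified_survey.csv")

# Preregistered analysis: correlation between two example variables.
r, p = stats.pearsonr(df["defeat_score"], df["ideation_score"])

# Write the result to a file so reviewers can compare it with the manuscript.
os.makedirs("output", exist_ok=True)
with open("output/correlation_results.txt", "w") as f:
    f.write(f"Pearson r = {r:.3f}, p = {p:.4f}, n = {len(df)}\n")
```

Documenting the package versions used (e.g., in a requirements file archived with the script) further helps others reproduce the computational environment.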

Preregistration

In the process of designing and conducting a study, every decision is a “forking path” (Gelman & Loken, 2013) that takes the study in a different direction. Often, many of these decisions are not reported in the final manuscript, which hampers replicability. Knowledge about data may also endanger replicability when it influences our hypotheses and (unregistered) analysis plans, leading to data-dependent decision-making and, at worst, questionable research practices. Examples of questionable research practices include hypothesizing after the results are known (HARKing), running many different statistical tests until statistical significance is achieved (p-hacking), and selective reporting of results (cherry-picking).

To avoid these pitfalls and instead increase transparency and replicability, researchers can create a preregistration for their study: a time-stamped, uneditable plan for a study’s research questions, hypotheses, and analyses, made before data are collected/accessed and analyzed (Nosek et al., 2018). Some researchers may already be familiar with the concept of preregistration from the registration of clinical trials or systematic reviews/meta-analyses. Commonly, nontrial research is preregistered on the Open Science Framework (OSF) website (https://osf.io/prereg/), where researchers have the option of using a general template or one of several specific templates depending on the nature of their research, for example, using preexisting data (Mertens & Krypotos, 2019; van den Akker et al., 2019), functional magnetic resonance imaging (fMRI; Beyer et al., 2021), cognitive modeling (Crüwell & Evans, 2021), experience sampling (Kirtley, Lafit et al., 2021), or qualitative methods (Hartman et al., 2019). For examples of preregistration in the suicide and self-injury field, see Holman and Williams (2019), Knipe and Rajapakse (2021), Robinson and Wilson (2020), Warne et al. (2020), Dobias et al. (2021), and Kaurin, Wright, Hisler et al. (2020). Preregistration is also flexible: even with careful forethought, unexpected issues can arise that require changes to the recruitment procedure or analysis plan. Such changes should be documented rigorously, for example, in a transparent changes document (Mellor et al., 2019).

A common concern about preregistration is that it can slow progress when lengthy ethical approval and recruitment processes already cause time pressure. Our first preregistrations were slow, but we completed subsequent preregistrations faster as our skills improved. This “front-loading” of effort meant we considered our analysis plan in much more detail before accessing data, including writing analytic code in advance, which ultimately sped up the analysis and write-up of the manuscript because we had been able to anticipate some of the challenges we would face and to develop contingency plans. Again, a preregistration on the OSF receives a DOI, meaning that this rich and detailed plan for a study can be shared, for example, with other researchers to encourage replication or with grant review panels to demonstrate ongoing work.

Registered Reports

For RRs, researchers write the introduction and method sections of a manuscript, including a full analysis plan (Stage 1), which is peer reviewed before data collection or analysis. When a Stage 1 manuscript receives in-principle acceptance, it is stored in an online repository, and the journal commits to publishing the full manuscript based on the rationale, hypotheses, and quality of the analysis plan. After data collection and analysis, the full Stage 2 manuscript is peer reviewed. Subject to reviewers’ evaluation of whether researchers adequately adhered to the Stage 1 plan, the full manuscript is published regardless of the directionality and statistical significance of the results (Chambers & Tzavella, 2021).

More than 300 journals now offer the RR format (Chambers & Tzavella, 2021) and Crisis recently became the first specialist suicide research journal to offer RRs (Pirkis, 2020). Emerging research on RRs suggests they outperform “traditional” article types on various criteria, including quality and rigor of methodology and analysis, novelty, and creativity (Soderberg et al., 2021). In addition to the benefits of preregistration (e.g., guarding against questionable research practices and increasing transparency), greater use of RRs in suicide research could reduce publication bias. This would reduce resource wastage, as suicide researchers may otherwise spend time and funding trying to replicate published effects that were in fact spurious. Resources are also saved because peer review takes place before data collection and analysis, when suboptimal methodological and analytic choices can still be addressed.

The second author’s first paper was an RR using preexisting data (Janssens et al., 2021). Because Stage 1 RRs are accepted based on the quality and value of the research questions and analysis plan, this encouraged her to carefully consider theory when building her rationale and to devote considerable time to optimizing her analysis plan. As a novice researcher, she found the two-step approach an invaluable learning process, which increased her confidence when eventually analyzing the data and interpreting the results. The peer review process also felt collaborative rather than adversarial, and eliminated the worry that results not supporting her hypotheses would reduce the likelihood of publication.

An important concern for researchers when considering the RR format is the impact on project timelines: If data collection cannot begin until after Stage 1 acceptance – and in some cases, approval of ethical amendments due to protocol changes arising from Stage 1 peer review – this can slow down the research (Chambers & Tzavella, 2021). We suggest that an RR is something to be planned into a project from the outset, when the potential delays to data collection can be factored into the overall project timeline. More experienced researchers who are new to open science may find it easier to first attempt a full preregistration, before embarking upon their first RR, as this will enable them to build the necessary skills and to have more control over the timeline. For ECRs, an RR may be a good place to start in developing their open science repertoire – especially when supervised by a mentor with some experience in preregistration – but possible delays during the RR process should not negatively impact an ECR’s progression. In the second author’s case, she worked on the RR simultaneously with another paper that was not dependent on the results of the RR, reducing pressure for the RR process to move rapidly. For RRs using preexisting data, there are additional considerations regarding controlling data access to reduce the likelihood of data-dependent decision-making. See Kirtley (2022) for a discussion of this issue.

Sharing Data

Researchers are increasingly encouraged (and in some cases required by funders and journals) to share their data by storing them in public repositories, such as the OSF, or restricted-access repositories (e.g., the Harvard Dataverse: https://dataverse.harvard.edu/). Sharing data facilitates verification, increases the trustworthiness of results, and aids collaborative efforts such as meta-analytic work. Providing codebooks and metadata also helps to ensure transparency and reproducibility (Weston et al., 2019). Concerns about data sharing and potential solutions have been covered elsewhere (see Simons, 2018).
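To illustrate, a codebook can be as simple as a small machine-readable table archived next to the dataset. The following Python sketch writes a minimal codebook to a CSV file; the variable names, ranges, and missing-data code are hypothetical, not drawn from any study cited here.

```python
# A minimal machine-readable codebook, written as CSV so it can be archived
# alongside the dataset (e.g., on the OSF). All variables are hypothetical.
import csv

codebook = [
    # (variable, label, type, values or units, missing-data code)
    ("ideation_score", "Suicidal ideation severity, past week", "integer", "0-38 (scale total)", "-999"),
    ("defeat_score", "Perceived defeat, total score", "integer", "0-64 (scale total)", "-999"),
    ("assessment_date", "Date of assessment", "date", "YYYY-MM-DD", "blank"),
]

with open("codebook.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["variable", "label", "type", "values_or_units", "missing_code"])
    writer.writerows(codebook)
```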

Increasing numbers of suicide researchers are choosing to share their data in the interests of transparency, reproducibility, and potential reuse (Holman & Williams, 2019; Knipe et al., 2021; Millner, 2016; Robinson & Wilson, 2020), including from meta-analyses (Kothgassner et al., 2020) and narrative reviews (Kirtley, Rodham et al., 2019). Sharing data for reuse can also facilitate collaboration and research synergy, for example, the sharing of real-time suicide data from the early phase of the COVID-19 pandemic (Pirkis et al., 2021). Sharing data can also inform conceptual choices in future research. Experience sampling studies are surging, and many unknowns affect how we design such studies (Kirtley, Lafit et al., 2021) to capture the everyday lives of people at risk for suicide. For instance, little research has examined the “true” timescale of an emergent suicidal crisis, and it is unlikely that the length of a suicidal episode is the same between or even within individuals. When knowledge is scarce, collaboration is key to developing theoretically principled guidelines for designing future studies. Sharing data might aid the comparison of key parameters, reveal how these are affected by sampling choices, and facilitate multisite collaborations (as recently initiated by the last author).

Implementing Open Science Practices Into Ongoing and Concluded Studies

Although starting to implement open science practices at the outset of a study may be the optimal scenario, we urge researchers not to feel as though they have missed the open science “boat” because their study has already commenced or concluded.

Researchers can self-archive the postprint of their accepted journal article as a low-threshold way of “opening up” concluded studies. Materials and code can be shared retrospectively, although for established research groups, this may be something that occurs gradually over time. Deidentified data from concluded studies can also be shared, provided that consent for data sharing was obtained from participants. See Soderberg et al. (2019) for examples of institutional review board (IRB) and consent form text for data sharing. If such consent was not obtained, researchers may be able to share a synthetic version of the dataset to facilitate transparency and reproducibility (Kirtley, Hussey et al., 2021; Quintana, 2020b; Sandford et al., 2021; see the sketch at the end of this section). Where researchers have not accessed the data in an ongoing or concluded study, they can postregister their study (Benning et al., 2019). Additionally, some journals offer RRs for ongoing and concluded (i.e., preexisting data) studies, where researchers can prove they have not accessed the data (e.g., with a signed statement from a data manager). See Kirtley (2022), Kirtley, Lafit et al. (2021), and Weston et al. (2019) for further discussion of postregistration and RRs for preexisting data.
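To make the synthetic-data option above more concrete, the sketch below shows the core idea in deliberately simplified form: fit a distribution to the observed numeric variables and sample new “participants” from it, so that summary statistics are approximately preserved while no row corresponds to a real person. The primer cited above (Quintana, 2020b) uses the dedicated R package synthpop, which handles categorical variables and disclosure risk far more carefully; the Python code and the file and variable names here are hypothetical.

```python
# Deliberately simplified sketch of generating a synthetic dataset.
# Real tools (e.g., the R package synthpop) do this far more carefully;
# all file and variable names here are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2021)

# Fit a multivariate normal to the observed numeric variables.
real = pd.read_csv("data/deidentified_survey.csv")
numeric = real[["defeat_score", "ideation_score"]]
mean = numeric.mean().to_numpy()
cov = numeric.cov().to_numpy()

# Draw new "participants": means and covariances are approximately
# preserved, but no row corresponds to a real person.
synthetic = pd.DataFrame(
    rng.multivariate_normal(mean, cov, size=len(numeric)),
    columns=numeric.columns,
)

# Share the synthetic file instead of the real one when consent precludes
# sharing the original data.
synthetic.to_csv("data/synthetic_survey.csv", index=False)
```

In practice, any such file should be clearly labeled as synthetic, and researchers should verify that no combination of values could identify a participant before sharing.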

Conclusion

Use of open science practices in suicide research is still the exception rather than the rule, but a growing number of researchers are embracing the opportunity to future-proof their research. While other fields of research may have a head start, open science in suicide research is definitely now open for business.

Electronic Supplementary Material

The electronic supplementary material is available with the online version of the article at https://doi.org/10.1027/0227-5910/a000859

Author Biographies

Olivia J. Kirtley, PhD, is a Research Foundation Flanders senior postdoctoral research fellow at the Center for Contextual Psychiatry, KU Leuven, Belgium. Dr. Kirtley’s research uses experience sampling methods to investigate dynamic processes involved in suicidal ideation and behavior. She cochairs the IASP Early Career Group and is an editorial board member of Crisis.

Julie J. Janssens is a clinical psychologist and a doctoral researcher at the Center for Contextual Psychiatry, KU Leuven, Belgium. Her research mainly investigates predictors and correlates of self-harm thoughts and behaviors in young people using the experience sampling methodology, with a special focus on attachment relationships.

Aleksandra Kaurin, PhD, is an assistant professor of child and adolescent clinical psychology at Witten/Herdecke University in Germany and a licensed clinical psychologist. She studies developmental trajectories of socio-affective (dys)regulation, their links to developmental psychopathology, and how they manifest in the daily lives of children and adolescents.

References

  • Benning, S. D., Bachrach, R. L., Smith, E. A., Freeman, A. J., & Wright, A. G. C. (2019). The registration continuum in clinical science: A guide toward transparent practices. Journal of Abnormal Psychology, 128(6), 528–540. 10.1037/abn0000451

  • Beyer, F., Flannery, J., Gau, R., Janssen, L., Schaare, L., Hartmann, H., Nilsonne, G., Martin, S., Khalil, A., Lipp, I., Puhlmann, L., Heinrichs, H., Mohamed, A., Herholz, P., Sicorello, M., & Panagoulas, E. (2021). A fMRI pre-registration template. 10.23668/PSYCHARCHIVES.5121

  • Carpenter, T. P., & Law, K. C. (2021). Optimizing the scientific study of suicide with open and transparent research practices. Suicide and Life-Threatening Behavior, 51(1), 36–46. 10.1111/sltb.12665

  • Cerel, J., Brown, M. M., Maple, M., Singleton, M., Venne, J., Moore, M., & Flaherty, C. (2019). How many people are exposed to suicide? Not six. Suicide and Life-Threatening Behavior, 49(2), 529–534. 10.1111/sltb.12450

  • Chambers, C. D., & Tzavella, L. (2021). The past, present and future of registered reports. Nature Human Behaviour, 6, 29–42. 10.1038/s41562-021-01193-7

  • Coppersmith, D. D., Fortgang, R., Kleiman, E., Millner, A., Yeager, A., Mair, P., & Nock, M. (2020). Frequent assessment of suicidal thinking does not increase suicidal thinking: Evidence from a high-resolution real-time monitoring study. PsyArXiv Preprints. https://psyarxiv.com/6bh82

  • Crüwell, S., & Evans, N. J. (2021). Preregistration in diverse contexts: A preregistration template for the application of cognitive models. Royal Society Open Science, 8(10), Article 210155. 10.1098/rsos.210155

  • DelPozo-Banos, M., Lee, S. C., Friedmann, Y., Akbari, A., Torabi, F., Lloyd, K., Lyons, R. A., & John, A. (2021). Healthcare presentations with self-harm and the association with COVID-19: An e-cohort whole-population-based study using individual-level linked routine electronic health records in Wales, UK, 2016–March 2021. medRxiv. 10.1101/2021.08.13.21261861

  • Dobias, M., Schleider, J. L., Jans, L., & Fox, K. (2021). An online, single-session intervention for adolescent self-injurious thoughts and behaviors: Results from a randomized trial [Data, code, materials]. https://osf.io/wfdzp

  • Franklin, J. C., Ribeiro, J. D., Fox, K. R., Bentley, K. H., Kleiman, E. M., Huang, X., Musacchio, K. M., Jaroszewski, A. C., Chang, B. P., & Nock, M. K. (2017). Risk factors for suicidal thoughts and behaviors: A meta-analysis of 50 years of research. Psychological Bulletin, 143(2), 187–232. 10.1037/bul0000084

  • Gelman, A., & Loken, E. (2013). The garden of forking paths: Why multiple comparisons can be a problem, even when there is no “fishing expedition” or “p-hacking” and the research hypothesis was posited ahead of time. http://www.stat.columbia.edu/~gelman/research/unpublished/p_hacking.pdf

  • Hartman, A., Kern, F., & Mellor, D. T. (2019). Preregistration for qualitative research template. https://osf.io/j7ghv

  • Holman, M. S., & Williams, M. N. (2019). Risk and protective factors for suicide: A network analysis. https://osf.io/f395g

  • Janssens, J. J., Myin-Germeys, I., Lafit, G., Achterhof, R., Hagemann, N., Hermans, K. S. F. M., Hiekkaranta, A. P., Lecei, A., & Kirtley, O. J. (2021). Stage 1 registered report: Lifetime and current self-harm thoughts and behaviours and their relationship to parent and peer attachment [Manuscript submitted for publication].

  • Kathawalla, U.-K., Silverstein, P., & Syed, M. (2021). Easing into open science: A guide for graduate students and their advisors. Collabra: Psychology, 7(1). 10.1525/collabra.18684

  • Kaurin, A., Wright, A. G., Dombrovski, A., & Hallquist, M. (2020). Momentary interpersonal processes of suicidal surges in borderline personality disorder. https://osf.io/zpc3u

  • Kaurin, A., Wright, A. G., Hallquist, M., & Dombrovski, A. (2020). Momentary interpersonal processes of suicidal surges in borderline personality disorder. PsyArXiv Preprints. https://psyarxiv.com/es35w/

  • Kaurin, A., Wright, A. G., Hisler, G., Dombrovski, A., & Hallquist, M. (2020). Sleep and next-day suicidal ideation in persons with borderline personality disorder. 10.17605/OSF.IO/4VUGK

  • Kirtley, O. J. (2022). Advancing credibility in longitudinal research by implementing open science practices: Opportunities, practical examples, and challenges. Infant and Child Development, Article e2302. 10.1002/icd.2302

  • Kirtley, O. J., Hiekkaranta, A., Kunkels, Y. K., Eisele, G., Verhoeven, D., van Nierop, M., & Myin-Germeys, I. (2019). The experience sampling method item repository [Database]. https://osf.io/kg376

  • Kirtley, O. J., Hussey, I., & Marzano, L. (2021). Exposure to and experience of self-harm and self-harm related content: An exploratory network analysis. Psychiatry Research, 295, Article 113572. 10.1016/j.psychres.2020.113572

  • Kirtley, O. J., Lafit, G., Achterhof, R., Hiekkaranta, A. P., & Myin-Germeys, I. (2021). Making the black box transparent: A template and tutorial for registration of studies using experience-sampling methods. Advances in Methods and Practices in Psychological Science, 4(1). 10.1177/2515245920924686

  • Kirtley, O. J., Rodham, K., & Crane, C. (2019). Understanding suicidal ideation and behaviour in individuals with chronic pain: A review of the role of novel transdiagnostic psychological factors. https://osf.io/6upka

  • Knipe, D., & Rajapakse, T. (2021). Self-poisoning in Sri Lanka – COVID-19 [Analysis plan]. https://osf.io/4zfns

  • Knipe, D., Silva, T., Aroos, A., Senarathna, L., Hettiarachchi, N. M., Galappaththi, S. R., Spittal, M. J., Gunnell, D., Metcalfe, C., & Rajapakse, T. (2021). Hospital presentations for self-poisoning during COVID-19 in Sri Lanka: An interrupted time-series analysis. The Lancet Psychiatry, 8(10), 892–900. 10.1016/s2215-0366(21)00242-x

  • Kothgassner, O., Robinson, K., Goreis, A., Ougrin, D., & Plener, P. (2020). Therapeutic interventions for self-harm and suicidal ideation in adolescents [Data and analysis code]. https://osf.io/vr52s

  • Mellor, D. T., Esposito, J., Hardwicke, T. E., Nosek, B. A., Cohoon, J., Soderberg, C. K., Kidwell, M. C., Clyburne-Sherin, A., Buck, S., DeHaven, A. C., & Speidel, R. (2019). Preregistration challenge: Plan, test, discover. Transparent changes template. https://osf.io/yrvcg/

  • Mertens, G., & Krypotos, A.-M. (2019). Preregistration of analyses of preexisting data. Psychologia Belgica, 59(1), 338–352. 10.5334/pb.493

  • Millner, A. (2016). Single-item measurement of suicidal behaviors: Validity and consequences of misclassification [Dataset, analysis code, questionnaire measures]. https://osf.io/nhrt4

  • Millner, A. J., Robinaugh, D. J., & Nock, M. K. (2020). Advancing the understanding of suicide: The need for formal theory and rigorous descriptive research. Trends in Cognitive Sciences, 24(9), 704–716. 10.1016/j.tics.2020.06.007

  • Moshontz, H., Binion, G., Walton, H., Brown, B. T., & Syed, M. (2021). A guide to posting and managing preprints. Advances in Methods and Practices in Psychological Science, 4(2). 10.1177/25152459211019948

  • Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Percie du Sert, N., Simonsohn, U., Wagenmakers, E.-J., Ware, J. J., & Ioannidis, J. P. A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1, 0021. 10.1038/s41562-016-0021

  • Nosek, B. A., Beck, E. D., Campbell, L., Flake, J. K., Hardwicke, T. E., Mellor, D. T., van ’t Veer, A. E., & Vazire, S. (2019). Preregistration is hard, and worthwhile. Trends in Cognitive Sciences, 23(10), 815–818. 10.1016/j.tics.2019.07.009

  • Nosek, B. A., Ebersole, C. R., DeHaven, A. C., & Mellor, D. T. (2018). The preregistration revolution. Proceedings of the National Academy of Sciences of the United States of America, 115(11), 2600–2606. 10.1073/pnas.1708274114

  • O'Connor, D. B., Green, J., Ferguson, E., O'Carroll, R. E., & O'Connor, R. C. (2018). Effects of childhood trauma on cortisol levels in suicide attempters and ideators. Psychoneuroendocrinology, 88, 9–16. 10.1016/j.psyneuen.2017.11.004

  • Pirkis, J. (2020). Strengthening the evidence base in suicide prevention. Crisis, 41(2), 77–81. 10.1027/0227-5910/a000693

  • Pirkis, J., John, A., Shin, S., DelPozo-Banos, M., Arya, V., Analuisa-Aguilar, P., Appleby, L., Arensman, E., Bantjes, J., Baran, A., Bertolote, J. M., Borges, G., Brečić, P., Caine, E., Castelpietra, G., Chang, S.-S., Colchester, D., Crompton, D., Curkovic, M., … Spittal, M. J. (2021). Suicide trends in the early months of the COVID-19 pandemic: An interrupted time-series analysis of preliminary data from 21 countries. The Lancet Psychiatry, 8(7), 579–588. 10.1016/s2215-0366(21)00091-2

  • Quintana, D. S. (2020a, December 5). Five things about open and reproducible science that every early career researcher should know. Open Science Framework. 10.17605/OSF.IO/DZTVQ

  • Quintana, D. S. (2020b). A synthetic dataset primer for the biobehavioural sciences to promote reproducibility and hypothesis generation. eLife, 9. 10.7554/eLife.53275

  • Robinson, K., & Wilson, M. S. (2020). Measurement of non-suicidal self-injury [Dataset, analysis code, study questionnaires, manuscript preprint]. 10.17605/OSF.IO/8GWJU

  • Sandford, D. M., Kirtley, O. J., Thwaites, R., Dagnan, D., & O’Connor, R. C. (2021). The adaptation of a measure of confidence in assessing, formulating, and managing suicide risk. Crisis. Advance online publication. 10.1027/0227-5910/a000830

  • Simons, D. J. (2018). Invited forum: Challenges in making data available. Advances in Methods and Practices in Psychological Science, 1(1). 10.1177/2515245918757424

  • Soderberg, C. K., Errington, T. M., Schiavone, S. R., Bottesini, J., Thorn, F. S., Vazire, S., Esterling, K. M., & Nosek, B. A. (2021). Initial evidence of research quality of registered reports compared with the standard publishing model. Nature Human Behaviour, 5(8), 990–997. 10.1038/s41562-021-01142-4

  • Soderberg, C. K., Sallans, A., Clyburne-Sherin, A., Spitzer, M., Sullivan, I., Smith, J. F., & Mellor, D. T. (2019, December 18). IRB and consent form examples. https://osf.io/g4jfv/

  • Tackett, J. L., Brandes, C. M., & Reardon, K. W. (2019). Leveraging the Open Science Framework in clinical psychological assessment research. Psychological Assessment, 31(12), 1386–1394. 10.1037/pas0000583

  • Tackett, J. L., Lilienfeld, S. O., Patrick, C. J., Johnson, S. L., Krueger, R. F., Miller, J. D., Oltmanns, T. F., & Shrout, P. E. (2017). It's time to broaden the replicability conversation: Thoughts for and from clinical psychological science. Perspectives on Psychological Science, 12(5), 742–756. 10.1177/1745691617690042

  • van den Akker, O., Weston, S. J., Campbell, L., Chopik, W. J., Damian, R. I., Davis-Kean, P., Hall, A. N., Kosie, J. E., Kruse, E., Olsen, J., Ritchie, S. J., Valentine, K. D., van ’t Veer, A. E., & Bakker, M. (2019). Preregistration of secondary data analysis: A template and tutorial. PsyArXiv. 10.31234/osf.io/hvfmr

  • Vazire, S., Schiavone, S. R., & Bottesini, J. G. (2021). Credibility beyond replicability: Improving the four validities in psychological science. 10.31234/osf.io/bu4d3

  • Warne, N., Heron, J., Mars, B., Solmi, F., Moran, P., Stewart, A., Munafò, M., Penton-Voak, I., Biddle, L., Skinner, A., Gunnell, D., & Bould, H. (2020). Emotion dysregulation, self-harm and disordered eating: A protocol for a mechanistic investigation. 10.17605/OSF.IO/DPCJB

  • Weston, S. J., Ritchie, S. J., Rohrer, J. M., & Przybylski, A. K. (2019). Recommendations for increasing the transparency of analysis of preexisting data sets. Advances in Methods and Practices in Psychological Science, 2(3), 214–227. 10.1177/2515245919848684

  • World Health Organization. (2021). Suicide worldwide in 2019. https://www.who.int/publications/i/item/9789240026643