Open Science in German-Speaking Sport Psychology
An Empirical Analysis of Current Research Practices
Abstract
Summary: The aim of this article is to examine the prevalence of different open science practices in publications from German-speaking sport psychology. To this end, we analyzed the sport psychology literature published in 2020 and 2021 by research groups in Germany, Austria, and Switzerland. The results suggest that different open science practices are used with varying frequency: for example, while nearly 50% of the analyzed articles were published under open access models, the analysis code was published alongside only 2.4% of the articles. Future discussions of open science practices in sport psychology, and measures to promote them, could therefore benefit from addressing individual practices in a differentiated way rather than referring to the overarching concept of "open science." The present data can also serve as a basis for further discussions of open science in German-speaking sport psychology and as a benchmark for future developments.
Abstract: This paper determines the prevalence of several open science practices in publications by German-speaking sport and exercise psychologists. To this end, we analyzed the sport and exercise psychology literature published in 2020 and 2021 by research groups in Austria, Germany, and Switzerland. The results suggest that prevalence varies substantially across practices: for example, whereas nearly 50% of the papers were published under an open access model, the analysis code was made publicly available for only 2.4% of the papers. Future discussions of open science practices in sport and exercise psychology, and interventions intended to increase their usage, might benefit from targeting specific practices instead of referring broadly to the open science debate. Furthermore, the present data may serve as input for future discussions and as a benchmark for future developments.