Open Access | Original Article

Open Science in the German-Speaking Sport Psychology Research Community

An Empirical Analysis of Current Research Practices

Published Online: https://doi.org/10.1026/1612-5010/a000404

Abstract: This paper determines the prevalence of several open science practices in publications by German-speaking sport and exercise psychologists. To this end, we analyzed the sport and exercise psychology literature published in 2020 and 2021 by research groups in Austria, Germany, and Switzerland. The results suggest that prevalence varies substantially across practices: for example, whereas nearly 50 % of the analyzed papers were published under an open-access model, the analysis code was made public alongside only 2.4 % of the papers. Future discussions of open science practices in sport and exercise psychology, as well as interventions intended to increase their uptake, might therefore benefit from targeting specific practices rather than referring broadly to the overarching concept of "open science". Furthermore, the present data may serve as input for further discussions of open science in German-speaking sport psychology and as a benchmark for future developments.
