Open Access · Position Paper

Open Science in German Sport Psychology

State of the Art and Future Directions

Published Online: https://doi.org/10.1026/1612-5010/a000406

Abstract

Open Science is an important development in science, not only to overcome the replication crisis, or crisis of confidence, but also to describe research processes openly and transparently and thereby enable replication and reproduction. This study describes the current state of the art within German-speaking sport psychology regarding open science-related attitudes, behaviors, and intentions and identifies the reasons for a potential reluctance toward open science. The findings revealed a match between open science-related attitudes and intentions, although open science-related behaviors still lag behind both. Researchers named time constraints, time-allocation conflicts, and anticipated competitive disadvantages (should not all researchers adhere to open science practices) as the reasons behind this behavioral reluctance. Our findings suggest that the development of open science has clearly reached the German-speaking sport-psychological community, but that there is considerable potential for improvement – especially regarding behaviors.


Empirical research is facing a crisis of confidence (Open Science Collaboration, 2015; Pashler & Wagenmakers, 2012). As a way to overcome this crisis, it has been widely proposed that individual researchers should adhere to open science (OS) standards (e. g., Dreiskämper, 2016; Geukes et al., 2016; Schönbrodt & Scheel, 2017), and the general movement over the last couple of years has pointed in this direction (e. g., Hardwicke & Ioannidis, 2018). In this position paper, we describe the state of the art and future directions of OS within the German-speaking sport-psychology community to shed light on researchers’ current attitudes and intentions toward OS, on the behaviors with which they engage in OS, and on the perceived barriers hindering further development toward more open and transparent research practices.

Replicability and Reproducibility: A Crisis of Confidence Across Disciplines

According to Asendorpf and colleagues (2013), replicability in science means that the findings of an original study A do not differ substantially from those of a replication study B – and that any differences that do occur can be attributed to unsystematic error. Thus, replicability describes the degree to which we find consistent results when empirical studies are repeated. Importantly, replicability constitutes one of the defining characteristics of science (e. g., Bacon, 1859; Jasny et al., 2011; Kuhn, 1962; Popper, 1992) and is even considered a line of demarcation between science and nonscience (Braude, 1979). Accordingly, replicability is essential to scientific research because it not only ensures the reliability and validity of empirical findings, and thus their accuracy and generalizability, but also ensures that previous research provides a sound basis on which novel research can reliably build.

It is all the more surprising that empirical research across disciplines today faces a replicability crisis (e. g., economics or biology, see Chang & Li, 2015; Errington et al., 2014; for overviews across disciplines, see Hoffmann et al., 2021; Munafò, 2016; Open Science Collaboration, 2015; Pashler & Wagenmakers, 2012), as does (sport) psychology (cf. this Special Issue and the preceding one from 2017 in the Zeitschrift für Sportpsychologie; Tamminen & Poucher, 2018), even though the reproducibility debate was recently termed an opportunity rather than a crisis (Munafò et al., 2022). One historic milestone was the so-called Reproducibility Project: Psychology (Open Science Collaboration, 2012, 2015), which selected 100 experimental and correlational studies from three of the most important psychological journals and repeated them as closely as possible. While 97 % of the original studies reported statistically significant effects, only 36 % of the replication studies confirmed these results. With the publication of these alarming figures, at the latest, the existence of a replicability crisis could no longer be denied within the psychological community, and this insight also reached the broader public.

“Crisis” as an Opportunity: A Driving Force Toward Open Science

Transparency, verifiability, replicability, and openness are central values of science (Nosek et al., 2015). As early as 1896, Wilhelm Wundt defined open methodology as one of the core principles of sound experimentation. Thus, it stands to reason that one way out of this so-called crisis is the commitment of individual researchers to OS (Hicks, 2021; Renkewitz & Heene, 2019). OS typically involves practices such as sharing the data, analytic code, and materials behind publications and projects, preregistering studies or using registered reports as a publication format, and publishing preprints and open access. Using these practices is thought to increase replicability on average. It has been proposed, for example, that opening up research processes through open data, analytic code, and materials as well as through preregistrations of studies could boost replicability by minimizing researchers’ degrees of freedom a priori and thereby the prevalence of dubious research practices like “HARKing” (i. e., hypothesizing after the results are known) and “p-hacking” (i. e., abusing statistical analyses to find and report statistically significant effects) (Foster & Deardorff, 2017; Munafò, 2016; Powers & Hampton, 2019). Because registered reports are peer-reviewed before data collection, they are likely to increase the quality of the conducted studies (e. g., regarding sample size, power, design, and measurement considerations) and offer a timely alternative to conventional publication formats. For these reasons, the OS movement can be regarded as one of the most important developments in scientific research over the last few years – and it is still on the rise.

The “Evolution” of Open Science Across Disciplines and the “Modus Operandi” in Sport Psychology

At the individual level, the quantity of preregistered research published, for example, increased considerably between 2014 and 2018 (Hardwicke & Ioannidis, 2018), demonstrating a growing understanding of the benefits of this practice. At the university level, applications for professorships, for example, often need to include an OS statement (see, e. g., https://www.nicebread.de/open-science-hiring-practices/), and the increasing number of newly formed OS initiatives (e. g., for Germany, see osf.io/tbkzh/) reflects universities’ commitment to this development. At the institutional level, the German Psychological Society (Deutsche Gesellschaft für Psychologie, DGPs), for example, has developed and published recommendations for data-management practices in psychological science (Schönbrodt et al., 2017), and the German Research Foundation (Deutsche Forschungsgemeinschaft, DFG) asks for explicit statements on the “handling of research data” in grant proposals, indicating that grant institutions value the provision of data for secondary usage (Deutsche Forschungsgemeinschaft, 2015).

The OS movement is represented in many (also sports-related) fields of study, for example, sociology (Breznau, 2021), nutrition (Burke et al., 2021), medicine (Bullock et al., 2022), and informatics (Brinkhaus et al., 2023). Within sport and exercise psychology specifically, researchers have already called for the adoption of more transparent research practices regarding OS (e. g., Caldwell et al., 2020; Geukes et al., 2016), and respective journal editors have started to demand OS adherence in manuscript submissions (e. g., Sport, Exercise and Performance Psychology, International Journal of Behavioral Nutrition and Physical Activity, Psychology of Sport and Exercise, Journal of Sport and Exercise Psychology; see the “Data Availability Statements” on the respective journals’ homepages or Beauchamp, 2023; Jago & van der Ploeg, 2018). A review of OS practices across quantitative and qualitative articles in 11 sport and exercise psychology journals revealed that open-access (OA) articles were cited slightly more often than non-OA articles. However, the authors also found that researchers do not appear to consistently and openly share the methods and data of their studies and that no articles had been published as registered reports (Tamminen & Poucher, 2018). Another study assessed OS practices in physical activity interventions (Norris et al., 2022) and found that open data, code, and materials as well as replication attempts are currently rare in physical activity behavior-change intervention reports: In only 4 % of the studies did the authors provide accessible open data, in only 8 % open materials, and in only 1 % open analysis scripts.

Why We Do Not Engage in OS Practices Yet: The Theory of Planned Behavior

Previous research showed that researchers generally agree that OS might serve as a way out of the replicability crisis, but they also declared being somewhat reluctant toward this movement (Stürmer et al., 2017), suggesting gaps between their attitudes and intentions, on the one hand, and their behaviors, on the other. The theory of planned behavior (TPB; Ajzen, 1985) is a useful theoretical framework for understanding and promoting engagement in OS principles. Applied to OS, the TPB posits that individuals’ OS behaviors depend on their intentions, which in turn depend on their attitudes, subjective norms, and perceived behavioral control. In the context of OS, attitudes toward OS principles may first depend on factors such as the subjectively perceived benefits vs. costs of openness and transparency. Second, subjective norms may be shaped by the explicit and implicit expectations as well as the standard practices within one’s own work group (e. g., colleagues, supervisors), within the scientific community (e. g., fellow scientists, coauthors, mentors), and within organizations and institutions (e. g., funding agencies, scientific journals, scientific societies, universities). Third, perceived behavioral control might be affected by factors such as access to resources (e. g., student assistants, payment of open-access publication fees) and the perceived skills required for sharing data, code, and materials as well as for publishing preregistrations and registered reports. As such, attitudes, subjective norms, and perceived behavioral control toward OS vary across individuals – and so should OS intentions and behaviors.
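To make the structure of these assumptions explicit, the TPB logic applied to OS can be written schematically as two regression-style equations. This is only an illustrative rendering of the general TPB structure (Ajzen, 1985), not a model estimated from the survey data reported below; the symbols A (attitudes), SN (subjective norms), PBC (perceived behavioral control), I (intentions), and B (behaviors) are our shorthand.

```latex
% Schematic TPB structure applied to OS (illustrative only, not estimated from the survey data)
\begin{align}
  I_{\text{OS}} &= \beta_0 + \beta_1 A_{\text{OS}} + \beta_2 SN_{\text{OS}} + \beta_3 PBC_{\text{OS}} + \varepsilon_I \\
  B_{\text{OS}} &= \gamma_0 + \gamma_1 I_{\text{OS}} + \gamma_2 PBC_{\text{OS}} + \varepsilon_B
\end{align}
```

Read this way, an attitude/intention-behavior gap corresponds to high values of A, SN, and I that do not (yet) translate into B, for example, because PBC or the perceived norms are low.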

The Status Quo in German Sport Psychology

Based on these notions, OS is undoubtedly an important movement in sport psychology. We conducted an exploratory survey among the German-speaking sport-psychology community to provide a first overview of the current state of the art regarding OS practices. Participants were eligible for inclusion if they reported currently working in a German-speaking country (i. e., Germany, Austria, Switzerland) in an area related to sport psychology. In the end, 61 researchers aged 25 to 67 years (M = 38.21, SD = 10.94) completed the questionnaire, stemming – per self-reported affiliation – from a broad range of subdisciplines within sport science and psychology. Specifically, we designed a questionnaire that included demographic aspects (6 items) as well as the assessment of five domains involving the key elements of the TPB: (1) attitudes toward OS practices (9 items), (2) intentions to adhere to OS standards in the future (9 items), (3) behavior regarding OS practices (10 items), (4) potential explanations of an intention-behavior gap (if existent, 10 items), and (5) more general attitudes toward OS (14 items). All materials, including detailed methods and participant data, the final questionnaire, a codebook, anonymized data, and code, can be retrieved from osf.io/w3bj6/. The Electronic Supplementary Material (ESM 1) contains more detailed information about the methods.
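As a minimal sketch of how such descriptive item statistics per status group could be computed from the anonymized data, consider the following Python snippet. The file name, the item prefixes, and the "status" column are hypothetical placeholders introduced here for illustration; the actual data, codebook, and analysis code are available at osf.io/w3bj6/.

```python
# Illustrative sketch (hypothetical file and column names; see osf.io/w3bj6/ for the real data and code).
import pandas as pd

# Load the anonymized survey responses (assumed to be a flat CSV export).
df = pd.read_csv("open_science_survey.csv")

# Items are assumed to be rated on a 1-5 scale and grouped by prefix,
# e.g., att_1..att_9 (attitudes), int_1..int_9 (intentions), beh_1..beh_10 (behaviors).
domains = {
    "attitudes": [c for c in df.columns if c.startswith("att_")],
    "intentions": [c for c in df.columns if c.startswith("int_")],
    "behaviors": [c for c in df.columns if c.startswith("beh_")],
}

# Descriptive statistics per status group (predoc, postdoc, professor).
for name, items in domains.items():
    group_means = df.groupby("status")[items].mean().mean(axis=1)
    print(f"{name}: mean rating per status group\n{group_means}\n")
```

Such purely descriptive summaries correspond to the kind of mean-level comparisons reported below; no inferential tests are implied.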

Based on these data, we compared individuals of different status groups (i. e., predocs, postdocs, professors) to determine whether their status affects their commitment to open science. In this respect, we refrained from stating a directed hypothesis because, from our perspective, two directions appeared plausible: Early career researchers might (a) quickly adapt to new developments and, thus, adhere to OS standards early on or (b) be hesitant to adapt to new developments and, thus, not yet adhere to OS standards. Describing and understanding attitudes, intentions, behaviors, and potential explanations of an intention-behavior gap can help to identify barriers and facilitators of engaging in OS practices in the German-speaking sport-psychology community and can inform future interventions and strategies to promote transparency, reproducibility, and openness in the field.

Our aim was to describe the current state of affairs regarding OS in the German-speaking sport-psychological community. We want to emphasize that the analyses presented are solely exploratory and that, given the small sample size, the results presented are only descriptive in nature. The conclusions drawn are, therefore, necessarily preliminary and selective. ESM 2 contains the item statistics as well as the absolute and relative responses for all variables (in the original language). Table 1 shows the item statistics for the whole sample as well as separately for predocs, postdocs, and (junior) professors.

Table 1 Open science-related attitudes, behaviors, and intentions

The findings indicate that researchers within the sample hold a positive attitude toward OS, intend to (increasingly) engage in OS in the future, and, on average, already show OS-related behaviors. While mean levels of reported attitudes and intentions generally matched each other (ranging between 3.16 and 4.51), the mean levels of reported behaviors comparatively still fell behind both (ranging between 1.72 and 4.03), indicating room for researchers to more strongly incorporate OS practices into their research processes. In line with the comparatively lower ratings of actual behaviors, only 29 (47 %) researchers reported having an account on the Open Science Framework, 6 (10 %) on GitHub, and only 4 (7 %) on PsyArXiv. ResearchGate, considered more of a science-related social-exchange platform than an OS platform, was more widely used, with 58 (95 %) researchers reporting having an account.

Correspondingly, rather general questions regarding attitudes, intentions, and behaviors (e. g., “I design my research processes openly and transparently”) received comparatively greater approval than questions targeting more specific OS-related measures. Within the latter group of items, there was comparatively greater agreement on the publication of data, analytic code, and materials as well as on the use of preregistrations than on the use of registered reports as a publication format. In contrast, we found the lowest approval for publishing the complete anonymized data of research projects. For an illustration of these findings, please refer to Figure 1. Descriptively comparing attitudes, intentions, and behaviors across status groups did not reveal any substantial mean-level differences between predocs, postdocs, and (junior) professors. ESM 3 presents these results in a more detailed table format.

Figure 1. Illustration of attitudes, behaviors, and intentions for three selected variables. Note. These bar plots portray the mean responses on each of the three attitude, intention, and behavior items for general evaluations of open science (first row), the use of preregistrations (second row), and the use of registered reports (third row), respectively, for the whole sample (N = 61) as well as for the considered status groups: predocs (n = 23), postdocs (n = 17), and (junior) professors (n = 21). The response format ranged from 1 = does not apply at all to 5 = applies completely.

Figure 2 shows item statistics for further OS-related attitudes in the domains of general and specific attitudes, for the whole sample as well as separately for participants who reported being comparatively strongly engaged in OS-related behaviors (n = 19) and those who reported being comparatively weakly engaged in OS-related behaviors (n = 37). The assignment of participants to one of the two groups (low vs. high OS behavior) was based on two variables: the report on general OS-related behaviors (> 3) and the report on using preregistrations within research processes (> 3). In our view, these two should, to some degree, validly approximate researchers’ general behavioral tendencies toward OS.
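The following short Python sketch illustrates this grouping rule; the column names are hypothetical placeholders and do not necessarily correspond to the actual codebook at osf.io/w3bj6/.

```python
# Illustrative sketch of the grouping rule described above (column names are hypothetical).
import pandas as pd

df = pd.read_csv("open_science_survey.csv")

# "High OS behavior" if BOTH the general OS-behavior item and the
# preregistration-use item were rated above 3; "low OS behavior" otherwise.
high = (df["beh_general"] > 3) & (df["beh_preregistration"] > 3)
df["os_group"] = high.map({True: "high OS behavior", False: "low OS behavior"})

# The paper reports n = 19 (high) vs. n = 37 (low) for this split.
print(df["os_group"].value_counts())
```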

Figure 2. Illustration of selected general and specific attitudes. Note. This bar plot portrays the mean responses on selected items reflecting general and specific OS-related attitudes for the whole sample (N = 61) as well as for individuals assigned to one of two groups: High vs. low open science behavior was defined based on two behavior variables (general OS behavior and preregistrations); if both responses were greater than 3, participants were included in the high OS behavior group (n = 19); otherwise, they were included in the low OS behavior group (n = 37). The response format ranged from 1 = does not apply at all to 5 = applies completely.

Corresponding to the descriptions of OS attitudes and intentions reported above, researchers within the sample generally agreed that science should be open and transparent and that OS practices are good for science, are not superfluous, and should be the norm. Interestingly, they agreed only to a comparatively lesser extent that OS practices will become the norm in the future. Participants reported greater trust in study results if research processes are open and transparent; to a lesser extent, this trust also extended to the scientists who adhere (or do not adhere) to open and transparent research processes. Regarding more specific aspects of OS, participants generally appreciated the opportunity to submit registered reports and welcomed grant institutions as well as journals making the publication of data along with publications mandatory. Although to a slightly lesser extent, they also appreciated that adherence to OS should play a role in selection decisions for academic positions. They welcomed journals’ preference for preregistered studies over nonpreregistered ones as well as journals making the submission of analytic code mandatory. Approval was comparatively lowest for grant institutions making the publication of anonymized project data mandatory.

A descriptive comparison of the group of participants who reported being strongly engaged in OS-related behaviors with those who reported being comparatively weakly engaged indicated higher mean levels for the former on all variables, though most differences were relatively small. The most remarkable differences, in our view, were identified within the domain of specific attitudes for variables closely connected to individual (dis)advantages, namely, journals’ preferences for preregistered studies over nonpreregistered ones and the consideration of OS behaviors in selection decisions for academic positions.

Figure 3 presents the putative factors contributing to an OS-related intention-behavior gap (i. e., barriers). It summarizes the responses to items on potential reasons hindering stronger engagement in OS practices for the whole sample and separately for the two groups considered above.

Figure 3. Illustration of selected potential barriers. Note. This bar plot portrays the mean responses on six selected items reflecting potential barriers for the whole sample (N = 61) as well as for individuals assigned to one of two groups: High vs. low open science behavior was defined based on two behavior variables (general OS behavior and preregistrations); if both responses were greater than 3, participants were included in the high open science behavior group (n = 19); otherwise, they were included in the low open science behavior group (n = 37). The response format ranged from 1 = does not apply at all to 5 = applies completely.

Among the top three reasons (with mean levels of 3 and higher) why researchers do not engage more strongly in OS practices were (1) that researchers feel they do not have the time for it, (2) that they consider their time better invested in writing papers, and (3) that they would indeed do so if all others were forced to do so as well. The latter reason might directly correspond to the subjective norm in the TPB. These reasons were followed by variables with mean levels lower than 3: a lack of training opportunities, a poor cost-benefit ratio, and not having a concrete idea of how to do it. Here, the perceived lack of training opportunities and of concrete ideas on how to do it might reflect a lack of perceived behavioral control in light of the TPB. Only then did the fear of making mistakes emerge as relevant, followed by not seeing any benefits, worrying about competitive disadvantages, and regarding OS as detrimental to one’s scientific career.

A descriptive comparison of the group of participants who reported comparatively strong engagement in OS-related behaviors with those who reported less engagement indicated that the former group had lower mean levels on all variables. This agrees with the assumption that researchers already showing higher levels of OS behavior had either already overcome the individually perceived barriers or had never experienced OS as entailing any costs.

Discussion of Findings and Future Directions

The OS movement is important and, in our view, indispensable to overcoming the replication crisis currently confronting the behavioral sciences. Over the last years, considerable progress toward more open and transparent research and publication processes has been observed in several scientific disciplines and communities (Schneider, 2022). To describe the current state of the art regarding OS in the German-speaking sport-psychology community, we presented data from a survey study conducted in 2022. It provides a first overview of OS attitudes, intentions, behaviors, and putative barriers toward OS across different status groups (i. e., predocs, postdocs, (junior) professors). Although based on a limited sample size, the descriptive findings indicated that OS already plays an important role in the community: Researchers generally reported that they deem it important to design their research processes openly and transparently, that they are willing to (increasingly) do so in the future, and that they already engage in relevant OS behaviors, overall painting a promising picture.

Although our survey revealed interesting findings, note that they come with important limitations. First, the sample was unfortunately limited in size and, thus, allowed only descriptive analyses. Second, the sample composition may be rather selective: Because the general agreement on OS was relatively high, the sample might overrepresent those scientists who value OS and already incorporate such practices into their research processes over those who do not. This may make the picture painted more promising than a broader sampling would have revealed. Third, although the participants’ distribution across status groups was relatively equal, predocs and postdocs were rather underrepresented relative to their absolute numbers in the population of German-speaking sport psychologists. Fourth, note that our survey did not incorporate open-access (OA) publishing as part of OS. In fact, OA may be seen as a crucial aspect of OS, which aspires to the public disclosure of all facets of information as early as feasible in the scientific process (Nielsen, 2011) to all fellow researchers and the broad public without any limitations. However, we did not include OA publishing because it is often not an active choice of the authors but rather depends on journals offering this option and on the financial resources available to the author group. Accordingly, OA publishing is not per se an actively chosen OS practice. Given these limitations, we would welcome future studies addressing them and providing a more comprehensive and representative overview of OS in the German-speaking sport-psychology community.

Attitudes, Intentions, and Behavior: Existent Gaps in German Sport Psychology

Although the picture our data painted was overall promising, we identified four aspects that we consider important. First, researchers generally hold positive attitudes and intentions toward OS. Items regarding global OS attitudes and intentions (e. g., OS is good for science), however, received comparatively greater approval than items regarding more specific attitudes and intentions (e. g., registered reports as a publication format) and more individual consequences of OS (e. g., OS should be considered as a criterion in selection and publishing processes). These initial findings indicate that, although OS is generally valued, these ideas have not yet translated into specific OS attitudes and intentions, which, according to the TPB, are more likely to shape specific behaviors. Therefore, changing specific attitudes might be a promising avenue for strengthening researchers’ intentions toward OS and, in turn, increasing OS behavior in their research practices in the future.

Second, on the one hand, researchers reported a strong congruence between attitudes and intentions toward OS, whereas, on the other hand, their OS-related behaviors fell comparatively behind. Descriptively, average attitudes and intentions consistently exceeded researchers’ reports of their current behavior, indicating room for improvement in current research practices, especially at the individual behavioral level. Thus, our initial results speak for the existence of an actual attitude/intention-behavior gap. This conclusion is supported not only by the fact that less than half of all participants reported having an account on the OSF or other OS-related repositories, but also by the fact that researchers currently tend to adopt those OS-related practices that can be incorporated more easily into what we call the classic research process (e. g., the publication of data, code, and materials accompanying papers). By contrast, they reported a comparatively greater reluctance toward those OS practices that change the temporal flow of the classic process (e. g., registered reports). Hence, we assume that actions associated with higher anticipated extra costs or with deviations from traditions and routines constitute a greater barrier (e. g., time costs because of extra work changing the workflow): A necessary change of thinking and habits might thus push researchers outside their scientific comfort zones.

Researchers should be encouraged to simply try out different OS methods. Here, workshops or coffee lectures (i. e., short informative (online) formats for knowledge transfer) might be a good choice to introduce different OS platforms or strategies and to get researchers to try them. Chairholders could offer such informative workshops and make attendance mandatory for their research staff, or could organize them and/or attend themselves. This might reduce initial fears, and guided instructions might ease initial practice. Further, a reward system might help establish OS practices, for example, if they were integrated as a requirement in Ph.D. or Habilitation regulations, in applications for (tenured) professorships, etc. Taken together, reducing barriers to the use of OS requires efforts at multiple levels – at the individual level, the departmental level, the institutional level, and by journals and funding agencies (Gownaris et al., 2022; Robson et al., 2021). Once researchers are educated in these practices, they can help promote OS through low-effort sharing in seminars, journal clubs, or discussions with colleagues.

Third, aiming to explain the revealed attitude/intention-behavior gap, our results highlight that researchers mainly perceive time constraints as a limiting factor. This finding agrees with other recent investigations, revealing that criticism of OS relates primarily to the increase in work-related stress and in the overall duration of a project (Sarafoglou et al., 2022). These barriers may reflect the (erroneous) conviction that OS takes immense time – which is not necessarily the case. Engaging in OS does shift effort and invested time from the end of the classic research process (e. g., deciding what to analyze and how) to its initial phases. Often, however, the later phases of a research process are sped up considerably by predefined hypotheses and a predefined set of analyses to conduct. These barriers may also reflect the researchers’ partly valid conviction that scientific success largely depends on the quantity of their scientific output. Typically, scientific output is judged by the number of publications and by the monetary volume of grants but not necessarily (or, unfortunately, only to a lesser extent) by qualitative criteria incorporating scientific rigor and adherence to OS standards. Moreover, researchers reported that they would be more willing to incorporate OS into their research processes only if all others were forced to do so as well, suggesting the lack of a consistent social norm within the field. Here, we anticipate a sensitivity to individual disadvantages arising from adhering to OS standards, which is further underlined by the researchers’ responses to the questions of whether scientists adhering to OS standards should be preferred in academic job selection processes, whether data and code sharing should be mandatory for publishing and funding, and whether preregistered studies should be preferred over nonpreregistered ones. By contrast, when asked directly, researchers did not report a strong fear of competitive disadvantages because of doing research more openly and transparently, nor because of the detection of potential mistakes in their analysis scripts after publishing data and code on repositories. Although researchers generally valued OS principles, they assumed that they personally benefit from them to a comparatively lesser extent. Finally, note that researchers reported lacking a concrete idea of how to engage in OS and lacking respective training opportunities; remarkably, this was similarly the case irrespective of whether or not researchers had already engaged in OS practices.

Fourth, it is worth noting that differences between status groups were generally low, indicating that academic age and the respective career stage may not be critical explanatory variables for differences in OS attitudes, intentions, and behaviors. However, even this comparison must be interpreted in light of the limited and somewhat selective sample and, therefore, warrants further investigation.

A View Beyond the End of Our Nose

Sport psychology is, of course, not the only field of study in which OS still lags behind. Martin Hagger (2022) proposed that researchers adopt an open science “mindset,” that is, “a ‘farm to fork’ approach,” in health psychology and behavioral medicine. Similar intention- (or knowledge-)action gaps have been reported in a variety of scientific fields, for example, biology (Roche et al., 2022), sociology (Breznau, 2021), and informatics (Brinkhaus et al., 2023). A recent systematic investigation among early career researchers supports our findings, indicating that “the most frequently discussed barriers across the OS life cycle were a lack of awareness and training, prohibitively high time commitments, and restrictions and/or a lack of incentives by supervisors” (Gownaris et al., 2022, p. 1). Hence, the following suggestions and call to action are relevant not only to (German) sport psychology but may also transfer to many other research fields.

What to Do and Where to Go from Here?

Based on the current developments and these descriptive findings, we conclude that, from our perspective, OS has to and will become the norm in future empirical sport-psychological research, not only to overcome the replication crisis or crisis of confidence but also as a means of increasing overall scientific quality. If this movement is indispensable and irreversible, we should provide the field with a reliable and fair normative and reward system – and we advocate: the earlier, the better.

So, what to do and where to go from here? Because the system is defined at four levels, these may serve as starting points for change toward OS: the individual level, the work group and departmental level, the institutional level, and journals and funding agencies (Robson et al., 2021). All actors at these levels have the opportunity and the responsibility to push the field forward. We deem those actors in leading positions, those with power, especially responsible. That is why we would like to close our appeal for incorporating more OS into our daily work as members of the sport-psychology community with a famous citation: “With great power comes great responsibility” (cf. Lee & Ditko, 1962, Amazing Fantasy No. 15: “Spider-Man,” p. 13). Accordingly, we invite everyone, especially those with administrative authority, to define the system in a way that accelerates the process toward OS and incorporates OS as a core value. This concerns us as actors and peers in the scientific field, such as editors, reviewers, authors, coauthors, supervisors, cooperation partners, members of (selection) committees, and lecturers. In these responsible and powerful roles, we should increasingly promote OS, support the acquisition of OS expertise, and value the respective efforts. Only by changing the field together into one in which OS is the valued and established norm can we resolve uncertainties regarding time allocation (“Everyone does OS, so I do it, too.”), time conflicts (“OS is important and valued, so the engagement is worthwhile.”), and uncertainties regarding the current reward system (“OS will pay off.”). According to the TPB, such changes will translate into more positive attitudes and stronger intentions as well as into researchers seeking out and booking more training opportunities, increasing their OS-related skills and behaviors, becoming role models, and working together to improve the overall quality of our science.

Some final notes: First, we are fully aware that adherence to OS may not necessarily increase the quality of the science within a field directly; but it will do so at least indirectly. The publication of data, code, and materials and the anticipated evaluation of these files will encourage conscientious preparation and documentation and might also trigger critical reflection within the research process. The quality of the research process may also be improved directly, for example, through more salient considerations of statistical power and measurement procedures and through the publication of registered reports, which place the peer-review process before data collection. The valuable feedback of reviewers can still be incorporated into the planned study design, and replicable findings present a more thorough foundation for future research.

Second, sport psychology represents a heterogeneous field in terms of examined samples, study designs, research methods, and considered data sources, so that a “one-size-fits-all” OS solution is unlikely to do the trick. That is why we would like to highlight that we expect this heterogeneity also to be reflected in heterogeneous approaches to adhering to OS standards.

Third, the move toward more OS will not come overnight. Changing habits, leaving one’s comfort zone, and implementing new routines takes time. Therefore, we encourage researchers to simply proceed step by step: Define a first OS research project, write a first preregistration, or try out a registered report for the next data collection – these are all valid starting points. This way, we hope, researchers will quickly learn that the benefits of OS outweigh the (anticipated) costs and barriers.

Conclusion and a Call to Action

This sport-psychological position paper advocates a greater investment in OS practices. Based on data from the German-speaking sport-psychological community and in line with the TPB, our survey results indicate that researchers in the field hold positive OS attitudes and intentions, but that their current OS behavior still lags behind. Time allocation currently seems to be the greatest barrier, which the field can overcome only by providing researchers with a reliable and fair reward system that values scientific rigor, quality, and OS over merely quantitative indicators. This will help OS become the norm in the field, as suggested, for example, by Schönbrodt and colleagues (2022), who present four principles of responsible research assessment for hiring and promotion in line with the DORA framework. We should use our inherent power and accept our responsibility to advance OS within the field, even if the advancement starts with a small step.

Electronic Supplementary Material

The electronic supplementary material is available with the online version of the article at https://doi.org/10.1026/1612-5010/a000406

References

  • Ajzen, I. (1985). From intentions to actions: A theory of planned behavior. In J. Kuhl & J. Beckmann (Eds.), Action control (pp. 11–39). Springer. https://doi.org/10.1007/978-3-642-69746-3_2

  • Asendorpf, J. B., Conner, M., Fruyt, F. de, Houwer, J. de, Denissen, J., Fiedler, K., Fiedler, S., Funder, D. C., Kliegl, R., Nosek, B. A., Perugini, M., Roberts, B. W., Schmitt, M., van Aken, M., Weber, H., & Wicherts, J. M. (2013). Replication is more than hitting the lottery twice. Retrieved from https://biblio.ugent.be/publication/3106937/file/3106962

  • Bacon, R. (1859). Fr. Rogeri Bacon Opera quædam hactenus inedita. Vol. I. containing I. – Opus tertium. II. – Opus minus. III. – Compendium philosophiæ. Longman, Green, Longman and Roberts. http://books.google.com/books?id=wMUKAAAAYAAJ

  • Beauchamp, M. R. (2023). Advancing open science and methodological rigor in sport, exercise, and performance psychology. Sport, Exercise, and Performance Psychology, 12(1), 1–8. https://doi.org/10.1037/spy0000317

  • Braude, S. E. (1979). ESP and psychokinesis: A philosophical examination. Temple University Press.

  • Breznau, N. (2021). Does sociology need open science? Societies, 11(1), 9. https://doi.org/10.3390/soc11010009

  • Brinkhaus, H. O., Rajan, K., Schaub, J., Zielesny, A., & Steinbeck, C. (2023). Open data and algorithms for open science in AI-driven molecular informatics. Current Opinion in Structural Biology, 79, 102542. https://doi.org/10.1016/j.sbi.2023.102542

  • Bullock, G. S., Ward, P., Peters, S., Arundale, A. J. H., Murray, A., Impellizzeri, F. M., & Kluzek, S. (2022). Call for open science in sports medicine. British Journal of Sports Medicine, 56(20), 1143–1144. https://doi.org/10.1136/bjsports-2022-105719

  • Burke, N. L., Frank, G. K. W., Hilbert, A., Hildebrandt, T., Klump, K. L., Thomas, J. J., Wade, T. D., Walsh, B. T., Wang, S. B., & Weissman, R. S. (2021). Open science practices for eating disorders research. The International Journal of Eating Disorders, 54(10), 1719–1729. https://doi.org/10.1002/eat.23607

  • Caldwell, A. R., Vigotsky, A. D., Tenan, M. S., Radel, R., Mellor, D. T., Kreutzer, A., Lahart, I. M., Mills, J. P., & Boisgontier, M. P. (2020). Moving sport and exercise science forward: A call for the adoption of more transparent research practices. Sports Medicine, 50(3), 449–459. https://doi.org/10.1007/s40279-019-01227-1

  • Chang, A. C., & Li, P. (2015). Is economics research replicable? Sixty published papers from thirteen journals say “usually not.” Finance and Economics Discussion Series, 2015(83), 1–26. https://doi.org/10.17016/FEDS.2015.083

  • Deutsche Forschungsgemeinschaft (DFG). (2022). Proposal preparation instructions: Project proposals. https://www.dfg.de/formulare/54_01/54_01_en.pdf

  • Dreiskämper, D. (2016). Die „Vertrauenskrise“ der empirischen Sozialwissenschaften und deren Bedeutung für die Sportpsychologie [The “trust crisis” of empirical social science and its meaning for sport and exercise psychology: A comment from the trust research perspective]. Zeitschrift für Sportpsychologie, 23(3), 92–98. https://doi.org/10.1026/1612-5010/a000168

  • Errington, T. M., Iorns, E., Gunn, W., Tan, F. E., Lomax, J., & Nosek, B. A. (2014). An open investigation of the reproducibility of cancer biology research. eLife, 3, e04333. https://doi.org/10.7554/eLife.04333

  • Foster, E. D., & Deardorff, A. (2017). Open Science Framework (OSF). Journal of the Medical Library Association, 105(2). https://doi.org/10.5195/jmla.2017.88

  • Geukes, K., Schönbrodt, F. D., Utesch, T., Geukes, S., & Back, M. D. (2016). Wege aus der Vertrauenskrise [Ways out of the crisis of confidence: Individual steps toward a reliable and open science]. Zeitschrift für Sportpsychologie, 23(3), 99–109. https://doi.org/10.1026/1612-5010/a000167

  • Gownaris, N. J., Vermeir, K., Bittner, M.-I., Gunawardena, L., Kaur-Ghumaan, S., Lepenies, R., Ntsefong, G. N., & Zakari, I. S. (2022). Barriers to full participation in the open science life cycle among early career researchers. Data Science Journal, 21(1), 2. https://doi.org/10.5334/dsj-2022-002

  • Hagger, M. S. (2022). Developing an open science “mindset.” Health Psychology and Behavioral Medicine, 10(1), 1–21. https://doi.org/10.1080/21642850.2021.2012474

  • Hardwicke, T. E., & Ioannidis, J. P. A. (2018). Mapping the universe of registered reports. Nature Human Behaviour, 2(11), 793–796. https://doi.org/10.1038/s41562-018-0444-y

  • Hicks, D. J. (2021). Open science, the replication crisis, and environmental public health. Accountability in Research, 30(1), 34–62. https://doi.org/10.1080/08989621.2021.1962713

  • Hoffmann, S., Schönbrodt, F., Elsas, R., Wilson, R., Strasser, U., & Boulesteix, A.-L. (2021). The multiplicity of analysis strategies jeopardizes replicability: Lessons learned across disciplines. Royal Society Open Science, 8(4), 201925. https://doi.org/10.1098/rsos.201925

  • Jago, R., & van der Ploeg, H. (2018). Open science for nutrition and physical activity research: A new challenge and lots of opportunities for IJBNPA. The International Journal of Behavioral Nutrition and Physical Activity, 15(1), Article 105. https://doi.org/10.1186/s12966-018-0739-4

  • Jasny, B. R., Chin, G., Chong, L., & Vignieri, S. (2011). Data replication & reproducibility: Again, and again, and again … Introduction. Science, 334(6060), 1225. https://doi.org/10.1126/science.334.6060.1225

  • Kuhn, T. (1962). The structure of scientific revolutions. University of Chicago Press.

  • Lee, S., & Ditko, S. (1962). Amazing Fantasy #15: Spider-Man! Marvel.

  • Munafò, M. R. (2016). Open science and research reproducibility. Ecancermedicalscience, 10, ed56. https://doi.org/10.3332/ecancer.2016.ed56

  • Munafò, M. R., Chambers, C., Collins, A., Fortunato, L., & Macleod, M. (2022). The reproducibility debate is an opportunity, not a crisis. BMC Research Notes, 15(1), Article 43. https://doi.org/10.1186/s13104-022-05942-3

  • Nielsen, M. (2011). Reinventing discovery. Princeton University Press. https://doi.org/10.2307/j.ctt7s4vx

  • Norris, E., Sulevani, I., Finnerty, A. N., & Castro, O. (2022). Assessing open science practices in physical activity behaviour change intervention evaluations. BMJ Open Sport & Exercise Medicine, 8(2), e001282. https://doi.org/10.1136/bmjsem-2021-001282

  • Nosek, B. A., Alter, G., Banks, G. C., Borsboom, D., Bowman, S. D., Breckler, S. J., Buck, S., Chambers, C. D., Chin, G., Christensen, G., Contestabile, M., Dafoe, A., Eich, E., Freese, J., Glennerster, R., Goroff, D., Green, D. P., Hesse, B., Humphreys, M., … Yarkoni, T. (2015). Scientific standards: Promoting an open research culture. Science, 348(6242), 1422–1425. https://doi.org/10.1126/science.aab2374

  • Open Science Collaboration. (2012). An open, large-scale, collaborative effort to estimate the reproducibility of psychological science. Perspectives on Psychological Science, 7(6), 657–660. https://doi.org/10.1177/1745691612462588

  • Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716. https://doi.org/10.1126/science.aac4716

  • Pashler, H., & Wagenmakers, E.-J. (2012). Editors’ introduction to the special section on replicability in psychological science: A crisis of confidence? Perspectives on Psychological Science, 7(6), 528–530. https://doi.org/10.1177/1745691612465253

  • Popper, K. (1992). The logic of scientific discovery. Routledge.

  • Powers, S. M., & Hampton, S. E. (2019). Open science, reproducibility, and transparency in ecology. Ecological Applications, 29(1), e01822. https://doi.org/10.1002/eap.1822

  • Renkewitz, F., & Heene, M. (2019). The replication crisis and open science in psychology. Zeitschrift für Psychologie, 227(4), 233–236. https://doi.org/10.1027/2151-2604/a000389

  • Robson, S. G., Baum, M. A., Beaudry, J. L., Beitner, J., Brohmer, H., Chin, J. M., Jasko, K., Kouros, C. D., Laukkonen, R. E., Moreau, D., Searston, R. A., Slagter, H. A., Steffens, N. K., Tangen, J. M., & Thomas, A. (2021). Promoting open science: A holistic approach to changing behaviour. Collabra: Psychology, 7(1), 30137. https://doi.org/10.1525/collabra.30137

  • Roche, D. G., O’Dea, R. E., Kerr, K. A., Rytwinski, T., Schuster, R., Nguyen, V. M., Young, N., Bennett, J. R., & Cooke, S. J. (2022). Closing the knowledge-action gap in conservation with open science. Conservation Biology, 36(3), e13835. https://doi.org/10.1111/cobi.13835

  • Sarafoglou, A., Kovacs, M., Bakos, B., Wagenmakers, E.-J., & Aczel, B. (2022). A survey on how preregistration affects the research workflow: Better science but more work. Royal Society Open Science, 9(7), 211997. https://doi.org/10.1098/rsos.211997

  • Schneider, J. (2022). Applicability of open science practices to completed research projects from different disciplines and research paradigms. F1000Research, 11, 408. https://doi.org/10.12688/f1000research.111383.2

  • Schönbrodt, F., Gärtner, A., Frank, M., Gollwitzer, M., Ihle, M., Mischkowski, D., Le Phan, V., Schmitt, M., Scheel, A. M., Schubert, A.-L., Steinberg, U., & Leising, D. (2022). Responsible research assessment I: Implementing DORA for hiring and promotion in psychology. https://doi.org/10.23668/PSYCHARCHIVES.8162

  • Schönbrodt, F. D., Gollwitzer, M., & Abele-Brehm, A. (2017). Der Umgang mit Forschungsdaten im Fach Psychologie: Konkretisierung der DFG-Leitlinien [Dealing with research data in psychology: Concretization of the DFG guidelines]. Psychologische Rundschau, 68(1), 20–35. https://doi.org/10.1026/0033-3042/a000341

  • Schönbrodt, F. D., & Scheel, A. (2017). FAQ zu Open Data und Open Science in der Sportpsychologie [FAQ on open data and open science in sport psychology]. Zeitschrift für Sportpsychologie, 24(4), 134–139. https://doi.org/10.1026/1612-5010/a000217

  • Stürmer, S., Oeberst, A., Trötschel, R., & Decker, O. (2017). Early-career researchers’ perceptions of the prevalence of questionable research practices, potential causes, and open science. Social Psychology, 48(6), 365–371. https://doi.org/10.1027/1864-9335/a000324

  • Tamminen, K. A., & Poucher, Z. A. (2018). Open science in sport and exercise psychology: Review of current approaches and considerations for qualitative inquiry. Psychology of Sport and Exercise, 36, 17–28. https://doi.org/10.1016/j.psychsport.2017.12.010