The Crisis of Confidence in Empirical Psychological Research
Selected Causes and Exemplary Proposed Solutions for Research in Sport Psychology
Abstract
Summary. The aim of this article is to explain why a crisis of confidence in psychological research is currently being postulated. We describe the signs of this crisis before turning to its presumed causes: researcher degrees of freedom, small sample sizes, and data-analysis and publication practices. Taken together, these factors have given rise to doubts about the reproducibility of psychological research. We then present selected proposals for how the reproducibility of psychological research can be increased. With this article, we hope to contribute to a research culture within sport psychology that is characterized by reproducibility, transparency, and trust.
Abstract. We present the discussion on the crisis of confidence in psychological science. We first briefly describe what has led to this so-called crisis of confidence before outlining the assumed causes for the crisis: researcher degrees of freedom, small sample sizes, data analysis methods, and publication practices. In combination, these factors have led to justified doubts about the reproducibility of published research findings in psychology. Subsequently, we present suggestions to increase the reproducibility of research findings. With this article, we hope to contribute to a research culture of reproducibility, transparency, and trust within the field of sport and exercise psychology.
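The abstract names small sample sizes as one presumed cause of the crisis: underpowered studies rarely detect true effects, and the effects they do declare significant are systematically overestimated (the "Type M" error discussed in this literature). A minimal Monte Carlo sketch can make this concrete. The simulation below is illustrative only and uses simplifying assumptions not taken from the article: a true standardized effect of d = 0.50, known unit standard deviations, and a two-sided z-test at alpha = .05 in place of a t-test, so that it runs with the Python standard library alone.

```python
import math
import random

random.seed(1)

def simulate_study(n, true_d):
    """Draw two groups of size n (SD = 1) whose means differ by true_d."""
    g1 = [random.gauss(0.0, 1.0) for _ in range(n)]
    g2 = [random.gauss(true_d, 1.0) for _ in range(n)]
    m1 = sum(g1) / n
    m2 = sum(g2) / n
    # z-test with known SD = 1 (a simplification to stay stdlib-only)
    z = (m2 - m1) / math.sqrt(2.0 / n)
    significant = abs(z) > 1.96          # two-sided alpha = .05
    observed_d = m2 - m1                 # observed effect in SD units
    return significant, observed_d

def run(n, true_d=0.5, n_studies=10_000):
    """Return empirical power and the mean effect among significant studies."""
    hits = 0
    sig_effects = []
    for _ in range(n_studies):
        sig, d = simulate_study(n, true_d)
        if sig:
            hits += 1
            sig_effects.append(d)
    power = hits / n_studies
    mean_sig_d = sum(sig_effects) / len(sig_effects)
    return power, mean_sig_d

for n in (10, 20, 100):
    power, mean_sig_d = run(n)
    print(f"n = {n:3d} per group: power ~ {power:.2f}, "
          f"mean significant effect ~ {mean_sig_d:.2f} (true d = 0.50)")
```

With small n, only a minority of simulated studies reach significance, and those that do report effects well above the true d = 0.50; with n = 100 per group, power is high and the significant effects cluster near the true value. This is the mechanism by which small samples, combined with selective publication of significant results, inflate the published literature.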