Strengthening Research Transparency as a High Scientific Value
Concrete Starting Points for Psychology Departments
Abstract
Abstract. Large-scale replication projects in recent years suggest a concerning proportion of non-replicable findings in the scientific literature, in psychology as well as in other disciplines. Based on an analysis of some causes of this situation, we argue that a shift toward greater research transparency ("open science") must be one consequence drawn from the credibility crisis. We call for concrete and feasible changes in local research units and departments, and illustrate the steps that have been taken at the Department of Psychology of the Ludwig-Maximilians-Universität München. These steps concern incentive structures, research culture, teaching, and close integration with the local ethics committee. Their goal is to foster more credible and more reproducible research without creating unnecessary bureaucratic overhead.