Published online: https://doi.org/10.1026/0033-3042/a000384

Abstract. The replication crisis within psychology has triggered a discussion about common practices in the research process and the institutions involved. We present measures that can help make research more efficient and research results more robust by initiating a strategic alignment and a closer interlocking of research processes. The necessary changes apply at the level of the individual as well as of institutions and concern the areas of theory, empirical work, and the accumulation of evidence. Beyond the familiar, rather specific measures for improving reproducibility, our proposals aim to improve the conditions for efficient scientific work by, among other things, (i) returning the exact specification as well as the critical testing and revision of theories to the center of research, (ii) establishing a culture of transparency, accepted fallibility, and openness to revising empirical results and, in particular, theories, (iii) anchoring this culture and the associated methodology consistently as integral components of academic teaching, and (iv) bringing together evidence and theories in continuously and decentrally updated, interlinked databases of theories and empirical findings.


Sound and Efficient Science: A Strategic Alignment of Research Processes as a Way Out of the Replication Crisis

Abstract. The replication crisis in psychology has led to a fruitful discussion about common research practices and research institutions. We present a set of measures that aim at making science more efficient and research results more reliable by fostering a strategic alignment and the interlocking of all parts of the research process. The recommended changes address individuals as well as institutions and concern theory, empirical methodology, and accumulation of evidence. Beyond the by-now well-established and rather specific measures to improve reproducibility, the ideas put forward in this paper aim to improve the foundation for efficient research by fostering: (a) precise theory specification, critical theory testing, and theory revision; (b) a culture of transparency and acceptance of mistakes as well as openness to evidence and subsequent theory revision; (c) an integration of this culture and the respective methodology into academic education; and (d) the establishment of interconnected databases for theories and empirical results, which are continuously updated in a decentralized manner.
