Editorial

Hotspots in Psychology

A New Format for Special Issues of the Zeitschrift für Psychologie

Published Online: https://doi.org/10.1027/2151-2604/a000249

The Zeitschrift für Psychologie, originally founded as the Zeitschrift für Psychologie und Physiologie der Sinnesorgane in 1890, was a German-language psychology journal rich in tradition for more than a century before it became an English-language journal in 2007 (Holling & Erdfelder, 2007). This change in language was accompanied by a change in format: All issues of the Zeitschrift für Psychologie (ZfP) that have appeared since 2007 were topical issues focusing on a particular theme of psychological science from one of its subdisciplines, covering all basic and applied fields of psychology as well as quantitative psychological methods. Starting with Issue 3, 2016, the editors will extend the scope of the ZfP by introducing a new format entitled “Hotspots in Psychology.” While the topical issues will continue in the future, the “Hotspots in Psychology” format, to appear once annually, is devoted to systematic reviews and meta-analyses in research-active (i.e., hotspot) fields that have generated a considerable number of primary studies. Thus, the common denominator of future “hotspots” issues is the research synthesis nature of the articles included, not a specific psychological topic or theme that all articles must address.

Why do we need an additional format for the ZfP special issues? One could argue that contributions using research synthesis methods have always been a prominent part of topical issues and have thus been well represented in the ZfP since 2007. In fact, in past years at least one article per ZfP issue has typically included a review. For example, Ogden and Fixsen (2014) reviewed the emerging field of implementation research, Küpper-Tetzel (2014) recent results on the distributed-practice benefit in human memory, Kirsch (2014) the role of placebo effects in antidepressants, and Pundt (2014) the literature on possible links between charismatic and destructive leadership. More recently, Blömeke, Gustafsson, and Shavelson (2015) summarized research and controversies on the assessment of competencies in higher education, Kuhn (2015) the interdisciplinary research field of developmental dyscalculia, and Steindl, Jonas, Sittenthaler, Traut-Mattausch, and Greenberg (2015) recent research trends and results on psychological reactance. Notably, this is only a selection of the reviews that appeared in the past few years. So why is there a need for even more reviews?

The inclusion of the “hotspots” format was mainly motivated by four reasons. First and most importantly, there is a growing awareness in our discipline that at least parts of psychology suffer from a “replicability crisis,” that is, a large number of false positive results in the published literature (see, e.g., Eklund, Nichols, & Knutsson, 2016; Open Science Collaboration, 2015; Ulrich et al., 2016). Thus, there is a need for methodologically sound research synthesis approaches that allow us to discriminate between true effects and false positives in various branches of psychology.

Second, most of the reviews that appeared in the ZfP not only summarized recent research results but also provided conceptual introductions and integrated the literature into theoretical frameworks proposed by the authors. In other words, these papers were designed first and foremost as conceptual and theoretical reviews promoting certain theoretical positions. Although such contributions are definitely helpful and necessary in the context of topical issues, they cannot replace the quantitative research syntheses required to answer whether certain hypothesized effects actually exist, how large they are, or whether certain psychological treatments are effective. In particular, none of the recent ZfP reviews – like the vast majority of reviews that previously appeared in the ZfP – counts as a systematic review that specifies the selection rules and analysis criteria for the reviewed literature explicitly and on a priori grounds. Thus, despite the regular inclusion of conceptual and theoretical reviews in topical issues, there is a lack of systematic reviews that take the most recent approaches to conducting and reporting such studies into account.

Third, a central family of research synthesis approaches, statistical meta-analysis, has not been well represented in the ZfP, the only exceptions being the topical issue on meta-analysis that appeared almost a decade ago (Schulze, 2007) and one recent meta-analysis on intervention effects for children with math difficulties (Chodura, Kuhn, & Holling, 2015). Modern statistical meta-analysis techniques have become particularly important in the context of the replicability crisis because such analyses provide precise quantitative estimates of overall effect sizes and of their variability across studies. Moreover, elaborated meta-analytic techniques allow us to at least partly “decontaminate” aggregated effect sizes from distortions by publication bias, that is, the selective publication of significant results and suppression of nonsignificant ones (see, e.g., Rothstein, Sutton, & Borenstein, 2005, for an overview), and to correct for specific study imperfections such as effect size attenuation due to reliability and validity restrictions (Schmidt & Hunter, 2014).
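The inverse-variance weighting that underlies such overall effect-size estimates can be sketched in a few lines. The snippet below shows a fixed-effect model, the simplest aggregation scheme; the effect sizes and standard errors are invented purely for illustration, and the meta-analyses discussed in this issue rely on more elaborate models (e.g., random-effects and multilevel approaches) and dedicated software.

```python
# Illustrative fixed-effect meta-analysis via inverse-variance weighting.
# NOTE: the effect sizes and standard errors below are hypothetical.
import math

def fixed_effect_meta(effects, std_errors):
    """Pool study effect sizes into a weighted mean and its standard error."""
    weights = [1.0 / se ** 2 for se in std_errors]  # precision = 1 / variance
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))       # SE of the pooled estimate
    return pooled, pooled_se

# Hypothetical standardized effect sizes (Cohen's d) from five primary studies
effects = [0.30, 0.45, 0.12, 0.50, 0.28]
std_errors = [0.10, 0.15, 0.08, 0.20, 0.12]

d_pooled, se_pooled = fixed_effect_meta(effects, std_errors)
ci = (d_pooled - 1.96 * se_pooled, d_pooled + 1.96 * se_pooled)  # 95% CI
```

Because precise studies receive larger weights, the pooled estimate is always more precise (smaller standard error) than any single study entering the analysis; assessing between-study variability and publication bias requires the more elaborate techniques cited above.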

Fourth, in Europe at least, there is a lack of scientific psychology journals that publish systematic reviews and meta-analyses from all parts of psychology on a regular basis. We believe that the ZfP with its broad perspective on both basic and applied psychological research would be an ideal outlet for closing this gap.

Taken together, these four reasons suggest extending the scope of the ZfP by introducing a new special issue format devoted exclusively to systematic reviews and quantitative meta-analyses from all branches of psychology. This is the core idea of the “Hotspots in Psychology” issues, and we hope they will attract attention and be influential in the future.

In our call for papers for the first “hotspots” issue, we emphasized that

“ideally, research synthesis approaches are transparent procedures, to find, evaluate, and aggregate the results of relevant research. Procedures are explicitly defined in advance to ensure that the exercise is transparent and replicable. This practice is designed to minimize bias and increase the trustworthiness of findings” (Erdfelder & Bošnjak, 2015, p. 146).

Contributors were therefore advised to adhere to the meta-analytic reporting standards recommended by the APA (APA Publications and Communications Board Working Group on Journal Article Reporting Standards, 2008). We invited four types of papers, namely (1) systematic reviews on currently debated psychological topics, (2) meta-analyses on hotspot topics in both basic and applied fields of psychology, (3) meta-analytic replications and extensions of previously published research syntheses, and (4) papers on methodological advances in research synthesis methodology. Thus, in addition to systematic reviews and meta-analyses on substantive psychological topics, contributions to methodological innovations and discussions of modern research synthesis approaches were also highly welcome.

The six papers that were accepted for publication in our first “hotspots” issue relate to all of these areas. The first two articles address methodological aspects of research synthesis approaches. Specifically, Kühberger, Scherndl, Ludwig, and Simon (2016) evaluated one type of nonsystematic review, the so-called narrative review, by comparing it with meta-analytic results on the same topic. In a case study, they performed a meta-analysis on all studies covered by a narrative review of the behavioral priming literature (i.e., Bargh, Schwader, Hailey, Dyer, & Boothby, 2012) and observed much larger effect sizes than typically found in thoroughly conducted meta-analyses of behavioral priming research. The authors conclude that narrative reviews may “(…) run the risk of drawing a picture that tends to be too good to be true (…),” thus overestimating the size of the effect of interest. This strong conclusion of course awaits validation by additional evaluations of narrative reviews, and we encourage authors to conduct such work. The second methodological paper by Kaufmann, Reips, and Merki (2016) discusses so-called individual participant data (IPD) meta-analyses and applies them in the context of educational psychology. IPD meta-analyses can and should be performed whenever the full data sets of participants contributing to a set of studies are available, not just aggregate statistics such as means, standard deviations, or effect sizes that typically enter into meta-analyses. It goes without saying that IPD meta-analyses are powerful alternatives to standard meta-analyses based on aggregate statistics, provided they are analyzed appropriately. As emphasized by Kaufmann et al. (2016), IPD meta-analyses certainly will be a field of growing importance in the future.

The second part of the current issue includes research syntheses in basic and applied fields of psychology, more specifically, social psychology and applied cognitive psychology. Rennung and Göritz (2016) report a meta-analysis on effects of interpersonal synchrony, that is, effects of movements or sensations that overlap in time and form between different individuals. Their results indicate that interpersonal synchrony experiences tend to foster prosocial behavior in general, albeit moderated by intentionality and possible experimenter effects. Furthermore, in an update of previous research syntheses, Giroux, Coburn, Harley, Connolly, and Bernstein (2016) present a systematic review of hindsight bias effects in the legal system. Hindsight bias is the tendency to overestimate the foreseeability of events in retrospect, that is, after the outcome is known. Giroux et al. (2016) identified five law-related fields in which hindsight bias has been shown to distort judgments. They also discuss possible ways to overcome or at least reduce hindsight bias effects in the legal system.

The third and final part of the current issue includes two research syntheses addressing psychological assessment and intervention research. Lazarević et al. (2016) provide a meta-analysis on correlations between the dimensions of the Psychobiological Model of Personality (PBMP; Cloninger, Przybeck, & Svrakic, 1991) and disintegration-like psychotic symptoms. They show that these correlations generally do not exceed .25, suggesting that disintegration-like symptoms are not adequately captured by the PBMP. The authors conclude that the PBMP requires a disintegration measure as an extension to provide a full assessment of clinically relevant personality aspects. Last but not least, Steinmetz, Knappstein, Ajzen, Schmidt, and Kabst (2016) present a multilevel meta-analysis on the effectiveness of behavior change interventions based on the theory of planned behavior (TPB; Ajzen, 2012). Their results suggest that the TPB indeed provides a successful framework for behavior change interventions.

We very much hope that the contributions to this first “hotspots” issue stimulate further research and contribute to scientific discussions. If readers disagree with the conclusions of one or several of the contributions, we explicitly invite them to submit critical reassessments based on state-of-the-art research synthesis methods to the ZfP. In any case, future “hotspots” issues will be open to systematic reviews and meta-analyses in basic and applied fields of psychology as well as in psychological assessment and intervention research. The same holds for innovative contributions to, and discussions of, the methodology of research syntheses within psychology.

References

  • Ajzen, I. (2012). The theory of planned behavior. In P. A. M. Van Lange, A. W. Kruglanski & E. T. Higgins (Eds.), Handbook of theories of social psychology (pp. 438–459). London, UK: Sage.

  • APA Publications and Communications Board Working Group on Journal Article Reporting Standards. (2008). Reporting standards for research in psychology. Why do we need them? What might they be? American Psychologist, 63, 839–851. doi: 10.1037/0003-066X.63.9.839

  • Bargh, J. A., Schwader, K. L., Hailey, S. E., Dyer, R. L. & Boothby, E. J. (2012). Automaticity in social-cognitive processes. Trends in Cognitive Sciences, 16, 593–605. doi: 10.1016/j.tics.2012.10.002

  • Blömeke, S., Gustafsson, J.-E. & Shavelson, R. J. (2015). Beyond dichotomies. Competence viewed as a continuum. Zeitschrift für Psychologie, 223, 3–13. doi: 10.1027/2151-2604/a000194

  • Chodura, S., Kuhn, J.-T. & Holling, H. (2015). Interventions for children with mathematical difficulties. A meta-analysis. Zeitschrift für Psychologie, 223, 129–144. doi: 10.1027/2151-2604/a000211

  • Cloninger, C. R., Przybeck, T. R. & Svrakic, D. M. (1991). The Tridimensional Personality Questionnaire: US normative data. Psychological Reports, 69, 1047–1057. doi: 10.2466/pr0.1991.69.3.1047

  • Eklund, A., Nichols, T. E. & Knutsson, H. (2016). Cluster failure: Why fMRI inferences for spatial extent have inflated false-positive rates. Proceedings of the National Academy of Sciences, 113, 7900–7905. doi: 10.1073/pnas.1602413113

  • Erdfelder, E. & Bošnjak, M. (2015). “Hotspots in Psychology”: A special issue of the Zeitschrift für Psychologie. Zeitschrift für Psychologie, 223, 146–147. doi: 10.1027/2151-2604/a000212

  • Giroux, M. E., Coburn, P. I., Harley, E. M., Connolly, D. A. & Bernstein, D. M. (2016). Hindsight bias and law. Zeitschrift für Psychologie, 224, 190–203. doi: 10.1027/2151-2604/a000253

  • Holling, H. & Erdfelder, E. (2007). The new Zeitschrift für Psychologie/Journal of Psychology. Zeitschrift für Psychologie, 215, 1–3. doi: 10.1027/0044-3409.215.1.1

  • Kaufmann, E., Reips, U.-D. & Merki, K. M. (2016). Use of offline versus online individual participant data (IPD) meta-analysis in educational psychology. Zeitschrift für Psychologie, 224, 157–167. doi: 10.1027/2151-2604/a000251

  • Kirsch, I. (2014). Antidepressants and the placebo effect. Zeitschrift für Psychologie, 222, 128–134. doi: 10.1027/2151-2604/a000176

  • Kühberger, A., Scherndl, T., Ludwig, B. & Simon, D. M. (2016). Comparative evaluation of narrative reviews and meta-analyses: A case study. Zeitschrift für Psychologie, 224, 145–156. doi: 10.1027/2151-2604/a000250

  • Küpper-Tetzel, C. E. (2014). Understanding the distributed practice effect. Strong effects on weak theoretical grounds. Zeitschrift für Psychologie, 222, 71–88. doi: 10.1027/2151-2604/a000168

  • Kuhn, T. (2015). Developmental dyscalculia. Neurobiological, cognitive, and developmental perspectives. Zeitschrift für Psychologie, 223, 69–82. doi: 10.1027/2151-2604/a000205

  • Lazarević, L. B., Bošnjak, M., Knežević, G., Petrović, B., Purić, D., Teovanović, P., … Bodroža, B. (2016). Disintegration as an additional trait in the psychobiological model of personality: Assessing discriminant validity via meta-analysis. Zeitschrift für Psychologie, 224, 204–215. doi: 10.1027/2151-2604/a000254

  • Ogden, T. & Fixsen, D. L. (2014). Implementation science. A brief overview and a look ahead. Zeitschrift für Psychologie, 222, 4–11. doi: 10.1027/2151-2604/a000160

  • Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349, aac4716. doi: 10.1126/science.aac4716

  • Pundt, A. (2014). A multiple pathway model linking charismatic leadership attempts and abusive supervision. Zeitschrift für Psychologie, 222, 190–202. doi: 10.1027/2151-2604/a000186

  • Rennung, M. & Göritz, A. S. (2016). Prosocial consequences of interpersonal synchrony: A meta-analysis. Zeitschrift für Psychologie, 224, 168–189. doi: 10.1027/2151-2604/a000252

  • Rothstein, H. R., Sutton, A. J. & Borenstein, M. (2005). Publication bias in meta-analysis: Prevention, assessment and adjustments. Chichester, UK: Wiley.

  • Schmidt, F. L. & Hunter, J. E. (2014). Methods of meta-analysis: Correcting error and bias in research findings. Thousand Oaks, CA: Sage.

  • Schulze, R. (2007). The state and the art of meta-analysis. Zeitschrift für Psychologie, 215, 87–89. doi: 10.1027/0044-3409.215.2.87

  • Steindl, C., Jonas, E., Sittenthaler, S., Traut-Mattausch, E. & Greenberg, J. (2015). Understanding psychological reactance. New developments and findings. Zeitschrift für Psychologie, 223, 205–214. doi: 10.1027/2151-2604/a000222

  • Steinmetz, H., Knappstein, M., Ajzen, I., Schmidt, P. & Kabst, R. (2016). How effective are behavior change interventions based on the theory of planned behavior? A three-level meta-analysis. Zeitschrift für Psychologie, 224, 216–233. doi: 10.1027/2151-2604/a000255

  • Ulrich, R., Erdfelder, E., Deutsch, R., Strauß, B., Brüggemann, A., Hannover, B., … Rief, W. (2016). Inflation von falsch-positiven Befunden in der psychologischen Forschung: Mögliche Ursachen und Gegenmaßnahmen [Inflation of false positive results in psychological research: Possible causes and countermeasures]. Psychologische Rundschau, 67, 163–174. doi: 10.1026/0033-3042/a000296

Edgar Erdfelder, Cognition and Individual Differences Lab, University of Mannheim, 68131 Mannheim, Germany,
Michael Bošnjak, Department of Survey Design and Methodology, GESIS – Leibniz Institute for the Social Sciences, B 2, 1, 68159 Mannheim, Germany,