Open Access Editorial

Hotspots in Psychology – 2023 Edition

Published Online: https://doi.org/10.1027/2151-2604/a000508

The aim of the seventh edition of the Hotspots in Psychology series is to address hotspot topics in psychology and related areas with the help of research synthesis and Big Data methods for large-scale applications. Beyond this, methodological advances in research synthesis are presented that are relevant to any subfield of psychology.

The following two papers address substantive issues using large-scale data or meta-analysis:

An overview of a practical aspect relevant to every research synthesis is given by Burgard and Bittermann (2023, this issue). Their systematic review introduces literature screening tools that help reduce the workload of research syntheses. The tools have in common that machine learning and natural language processing are used to semiautomate decision-making in the screening process. The review summarizes the evidence on the performance of screening automation tools from 21 previous applications. Compared to screening records in random order, active screening with prioritization approximately halves the screening workload.
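The core idea behind prioritized screening can be illustrated with a minimal sketch. Note that this is not the implementation of any tool reviewed by Burgard and Bittermann (2023); the toy term-weight model and all record texts below are hypothetical. In practice the relevance scores come from a classifier that is retrained as screeners label more records:

```python
# Illustrative sketch of prioritized screening: rank unscreened records by a
# model's predicted relevance so that likely hits surface early and manual
# screening can stop once few new relevant records appear.

def score(record, term_weights):
    """Toy relevance model: sum the weights of known terms in the title.
    A real tool would use a trained ML/NLP classifier instead."""
    return sum(term_weights.get(word, 0.0) for word in record.lower().split())

def prioritized_order(records, term_weights):
    """Return records sorted from most to least likely relevant."""
    return sorted(records, key=lambda r: score(r, term_weights), reverse=True)

# Hypothetical candidate records for a meditation meta-analysis.
records = [
    "effects of exercise on mood",
    "meditation intervention reduces stress",
    "survey of consumer preferences",
    "mindfulness meditation and well-being",
]
# In practice, these weights would be learned from already-screened seed records.
weights = {"meditation": 2.0, "stress": 1.0, "mindfulness": 2.0, "well-being": 1.0}

ranked = prioritized_order(records, weights)
```

With prioritization, the two topically relevant records are screened first; a random order would, on average, interleave them with irrelevant ones, which is the source of the reported workload savings.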

Seekircher and colleagues (2023, this issue) investigate the effects of meditation interventions on stress and well-being in RCTs with clinical populations. In line with a previous meta-analysis on the topic, the authors find that participants randomized to meditation interventions showed lower levels of perceived stress. Beyond this, and in contrast to the earlier study, the meta-analysis reveals higher levels of subjective well-being for participants in meditation interventions compared to active comparison groups. Overall, the meta-analysis provides evidence for the benefit of using meditation interventions as a complementary treatment for chronically ill patients.

Methodological advances in the area of research synthesis methods are the focus of the following four papers:

Web data, such as social media or digital trace data, are a valuable data source for many Big Data applications. Whether conclusions drawn from these sources are comparable to those from traditional survey data is tested in four studies presented by Speckmann and Wingen (2023, this issue). Overall, the researchers find that for most research questions, conclusions are independent of the data type. Using web data is thus unlikely to cause considerable problems for research synthesis or replicability. Instead, web data have enormous potential for answering psychological research questions, as they lead to similar findings as traditional methods while offering many additional benefits.

Many applications of meta-analytic structural equation modeling (MASEM) involve latent constructs. Measurement error can thus distort the effect estimates if the unreliability of the study variables is not acknowledged. In the simulation study of Gnambs and Sengewald (2023, this issue), the impact of measurement error on MASEM results is explored using different types of mediation models. The simulation results indicate that MASEMs with fallible measurements often yield biased estimates. Adjustments for attenuation should therefore be adopted in MASEM routinely.
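The classical adjustment at issue is Spearman's correction for attenuation, which rescales an observed correlation by the reliabilities of the two measures before it enters the pooled correlation matrix. The numbers below are a hypothetical illustration, not results from Gnambs and Sengewald (2023):

```python
import math

def disattenuate(r_xy, rel_x, rel_y):
    """Spearman's correction for attenuation: estimate the true-score
    correlation from the observed correlation r_xy and the reliabilities
    (e.g., coefficient alpha) of the two measures."""
    return r_xy / math.sqrt(rel_x * rel_y)

# Hypothetical example: an observed r of .30 with reliabilities of .80 and
# .70 corresponds to a true-score correlation of about .40. Pooling the
# uncorrected correlations in a MASEM would understate the structural paths.
r_true = disattenuate(0.30, 0.80, 0.70)
```

Because MASEM pools such correlations across studies before fitting the structural model, applying the correction study by study (or modeling unreliability explicitly) prevents the downward bias the simulation documents.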

An application with large-scale panel data in educational psychology is presented by Kiefer and colleagues (2023, this issue). In the study, the development of students' commitment to their study program over the course of 3 years is examined. The trajectories are compared for interdisciplinary, monodisciplinary, and multidisciplinary programs. Beyond this, the moderating effect of interest profiles on trajectories of study success is investigated using exceptional model mining. Whereas in traditional two-step approaches exceptional structural relations may be missed because clustering is performed independently of the moderator analysis, exceptional model mining can identify interesting combinations of covariates and thus provides a novel framework for modeling structural heterogeneity.

Van Assen and colleagues (2023, this issue) present a new descriptive visual tool for meta-analyses: the metaplot. It provides information on the precision and statistical power of the underlying primary studies and is designed to provide some indication of whether (and how strongly) the meta-analytic estimate may be affected by publication bias. The metaplot is applied to 12 meta-analyses in the field of psychology and recommended as a superior alternative to the funnel plot.
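One ingredient of such a display is the statistical power each primary study would have had if the true effect equaled the meta-analytic estimate. The sketch below uses the standard power formula for a two-sided z-test; it is a generic illustration with hypothetical numbers, not the metaplot's actual construction, which is defined in Van Assen et al. (2023):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function (stdlib only)."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def study_power(effect, se, crit=1.96):
    """Power of a two-sided z-test in a primary study with standard error
    `se`, assuming the true effect equals `effect` (e.g., the meta-analytic
    estimate)."""
    z = effect / se
    return (1.0 - norm_cdf(crit - z)) + norm_cdf(-crit - z)

# Hypothetical example: a meta-analytic estimate of d = 0.40 and three
# primary studies of decreasing precision (increasing standard error).
powers = [study_power(0.40, se) for se in (0.10, 0.20, 0.40)]
```

Plotting such power values against study precision makes visible whether only the imprecise, underpowered studies carry the effect, which is one pattern suggestive of publication bias.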

Finally, we hope that the contributions to this Hotspots in Psychology issue stimulate further research and contribute to new and ongoing scientific discussions in the field. We would like to announce the upcoming 4th Symposium on Big Data and Research Syntheses in Psychology to be held May 8–10, 2023, in Frankfurt (Germany): http://ressyn-bigdata.org/.

We would also like to point readers to the comprehensive sets of supplemental material facilitating reproduction and replication in PsychArchives: https://tinyurl.com/2p8z2m8x.

References

  • Burgard, T., & Bittermann, A. (2023). Reducing literature screening workload with machine learning: Systematic review of tools and their performance. Zeitschrift für Psychologie, 231(1), 3–15. https://doi.org/10.1027/2151-2604/a000509

  • Gnambs, T., & Sengewald, M.-A. (2023). Meta-analytic structural equation modeling with fallible measurements. Zeitschrift für Psychologie, 231(1), 39–52. https://doi.org/10.1027/2151-2604/a000511

  • Kiefer, C., Claus, A., Jung, A., Wiese, B., & Mayer, A. (2023). Discovering exceptional development of commitment in interdisciplinary study programs: An illustration of the SubgroupSEM approach. Zeitschrift für Psychologie, 231(1), 53–64. https://doi.org/10.1027/2151-2604/a000512

  • Seekircher, J., Burgard, T., & Bosnjak, M. (2023). The effects of clinical meditation programs on stress and well-being: An updated rapid review and meta-analysis of randomized controlled trials (RCTs) with active comparison groups. Zeitschrift für Psychologie, 231(1), 16–29. https://doi.org/10.1027/2151-2604/a000510

  • Speckmann, F., & Wingen, T. (2023). Same question, different answers? An empirical comparison of web data and traditional data. Zeitschrift für Psychologie, 231(1), 30–38. https://doi.org/10.1027/2151-2604/a000515

  • Van Assen, M. A. L. M., van den Akker, O. R., Augusteijn, H. E. M., Bakker, M., Nuijten, M. B., Olsson-Collentine, A., Stoevenbelt, A. H., Wicherts, J. M., & van Aert, R. C. M. (2023). The meta-plot: A graphical tool for interpreting the results of a meta-analysis. Zeitschrift für Psychologie, 231(1), 65–78. https://doi.org/10.1027/2151-2604/a000513