Open Access Editorial

8th Congress of the European Association of Methodology (EAM)

Published Online: https://doi.org/10.1027/1614-2241/a000172

This is a supplement to Methodology, the official journal of the European Association of Methodology (EAM), an international society devoted to the advancement of methodology in the empirical sciences. EAM activities include biennial conferences, the most recent of which was held in Jena, Germany, from July 25 to 27, 2018. The scientific program and videos of many presentations from this conference are still available to EAM members at http://www.metheval.uni-jena.de/eam2018/ (see http://www.eam-online.org/membership.php if you want to apply for membership). In publishing videos of many presentations we followed a tradition established at the first EAM conference in 2004, which also took place in Jena. Those videos are still available at http://www.metheval.uni-jena.de/projekte/smabs2004/. Previous conferences were held in Palma de Mallorca (2016), Utrecht (2014), Santiago de Compostela (2012), Potsdam (2010), Oviedo (2008), and Budapest (2006). The next one will take place in Valencia (July 22–24, 2020; see https://esdeveniments.uv.es/22691/detail/european-congress-of-methodology.html).

As the local organizer, I have the honor of editing this supplement and presenting four of the conference’s highlights. Of course, selecting four papers out of so many is to some extent arbitrary, although not random. The Executive Committee of the EAM decided to publish two of the keynote presentations, one given by Larry Hedges and the other by Axel Mayer, and two papers by young scientists who were honored with the Best Presentation Award, Leonie Vogelsmeier and Florian Scharf. (Unfortunately, the keynote by Sophia Rabe-Hesketh on missing data was not available for publication in this supplement, but it is available as a video.)

The Fallacy of Facts and Findings

Science is about facts, and findings should be replicable. Almost all of us would tend to subscribe to this statement. However, once we think about it for a while and look at the world from a probabilistic point of view – and this is what we do as soon as we apply inferential statistics – the statement turns out to be highly questionable. From a probabilistic point of view, facts are not at all what science is about, and findings are not necessarily expected to replicate. This claim is certainly provocative, but it may motivate us to rethink what science, and in particular empirical research, is about. The first paper in this issue, “The statistics of replication” by Larry Hedges (2019), reminds us of what we learned in our first course on inferential statistics. Scientific theories and hypotheses do not refer to facts and findings, that is, to data and to significant or nonsignificant results; instead, they refer to expectations, conditional expectations and their differences, true variances, true covariances or correlations, distributions and conditional distributions of random variables, or other probabilistic terms. Scientific theories and hypotheses, at least those that can be investigated empirically via inferential statistics, refer to random experiments and the parameters that describe the distributions of the random variables considered in such a random experiment. For example, they do not refer to sample means and their differences, but to the underlying conditional expectations and their differences. These concepts are not facts at all; instead, they are of a purely theoretical nature, and they are not observable. Their general definition requires the theory of integrals with respect to a probability measure (see, e.g., chapter 3 in Steyer & Nagel, 2017). Similarly, a finding such as “the sample mean difference is significant at the .05 level” is not really what we are interested in. From a scientific point of view, such a finding is just a random event, which occurs with a certain probability if a specific hypothesis is true. What is of interest instead is the difference between conditional expectation values. Larry Hedges’ paper explores the consequences of this point of view for the so-called replication crisis and the methodology of replication studies.
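The point that a “significant finding” is a random event can be illustrated with a small simulation. This is a minimal sketch of my own, not taken from Hedges’ paper; the effect size, sample sizes, and significance level are assumed purely for illustration. Even when the hypothesized effect is real, an exact replication of a significant study is itself significant only with a probability equal to its statistical power.

```python
# Illustrative sketch (assumed settings, not from Hedges, 2019): the event
# "p < .05" is a random event, and an exact replication of a significant
# study is again significant only with probability equal to its power.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2018)
d, n, alpha, sims = 0.5, 32, 0.05, 5000  # assumed illustrative values

def study_is_significant():
    """One two-group study with true standardized effect d; True if p < alpha."""
    control = rng.normal(0.0, 1.0, n)
    treatment = rng.normal(d, 1.0, n)
    return stats.ttest_ind(treatment, control).pvalue < alpha

# Simulate pairs consisting of an original study and an exact replication.
pairs = [(study_is_significant(), study_is_significant()) for _ in range(sims)]
power = np.mean([orig for orig, _ in pairs])
replicated = [rep for orig, rep in pairs if orig]  # replications of "successes"
print(f"proportion significant overall:         {power:.2f}")
print(f"significant replications given success: {np.mean(replicated):.2f}")
# Both proportions are roughly the study's power (about .5 with these settings).
```

Under these assumed settings the power is only about 50%, so roughly half of the exact replications of a “successful” study come out nonsignificant, even though the true effect exists and nothing went wrong: a failure to replicate is not in itself evidence of a false original finding.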

In the second paper, “Causal effects based on latent variable models,” Axel Mayer (2019) amplifies this point. Neither causal effects nor latent variables, nor causal effects on latent variables, are entities that refer to something beyond a random experiment. Instead, causal effects are parameters defined within, or with explicit reference to, a random experiment, and the same applies to the values of latent variables. They are defined as specific functions of conditional expectations. However, their definition rests on probabilistic concepts that require a sophistication not acquired in a single course on probability theory. This applies not only to the definition of causal effects but also to the definition of latent variables. Nevertheless, Axel Mayer’s program EffectLiteR is a great example of how highly sophisticated concepts such as causal effects and latent variables can lead to data-analysis programs that are easy to handle, even for applied researchers with a minimal background in probability theory.

In the third paper, “Continuous-time latent Markov factor analysis to explore longitudinal measurement invariance in experience sampling data,” Leonie Vogelsmeier and her coauthors (2019) lead us to another frontier of research methodology: continuous-time processes of latent variables. Again, we have to rethink our ways of thinking, our constructs, and the way we do research. Clearly, depression is not a value on a scale such as the Beck Depression Inventory. But is it a latent variable? Or is it rather a continuous-time process of latent variables? If the latter is true – and this is what I believe – then things are even more sophisticated, and researchers will need years to learn the necessary background, including continuous-time stochastic processes.

In the fourth paper, “A comparison of simple structure rotation criteria in temporal exploratory factor analysis for event-related potential data,” Scharf and Nestler (2019) deal with the problem of reducing the continuous-time data assessed in EEG research, comparing different rotation methods in exploratory factor analysis. Developing the methodology in this field seems important, given the enormous complexity of these data structures.

The papers in this special issue, and the complexity of the problems they address, clearly indicate that we need to rethink the organization of study programs and of empirical research. The methodology of empirical research is far too complex to be taught in a few courses within substantive sciences such as psychology, sociology, or economics. Instead, methodology is a discipline of its own, requiring dedicated study programs if even only the essentials are to be learned. Similarly, in many cases the methodological background of an empirical study is far too complex for most substantive researchers to fully grasp the meaning and implications of their findings. This necessitates increased cooperation between substantive scientists and methodologists in conducting and communicating empirical research.

References

  • Hedges, L. (2019). The statistics of replication. Methodology, 15(Suppl.), 3–15. https://doi.org/10.1027/1614-2241/a000173

  • Mayer, A. (2019). Causal effects based on latent variable models. Methodology, 15(Suppl.), 16–29. https://doi.org/10.1027/1614-2241/a000174

  • Scharf, F., & Nestler, S. (2019). A comparison of simple structure rotation criteria in temporal exploratory factor analysis for event-related potential data. Methodology, 15(Suppl.), 44–61. https://doi.org/10.1027/1614-2241/a000175

  • Steyer, R., & Nagel, W. (2017). Probability and conditional expectation: Fundamentals for the empirical sciences. Chichester, UK: Wiley.

  • Vogelsmeier, L., Vermunt, J. K., Böing-Messing, F., & De Roover, K. (2019). Continuous-time latent Markov factor analysis to explore longitudinal measurement invariance in experience sampling data. Methodology, 15(Suppl.), 30–43. https://doi.org/10.1027/1614-2241/a000176

Rolf Steyer, Institute of Psychology, Friedrich Schiller University, Am Steiger 3, Haus 1, 07743 Jena, Germany