Open Access | Original Article

Where salience goes beyond authenticity

A meta-analysis on simulation-based learning in higher education

Published Online: https://doi.org/10.1024/1010-0652/a000357

Abstract

Abstract: This meta-analysis builds on and extends Chernikova, Heitzmann, Stadler, et al.’s (2020) meta-analysis on simulation-based learning in higher education by summarizing the findings of 214 empirical studies. The main focus of the current meta-analysis was to investigate the role of prior knowledge and of two design features, namely (1) the authenticity of simulated scenarios (as both functional correspondence with a real task and physical resemblance of the environment) and (2) the salience of relevant information, in moderating the effects of simulation-based learning on facilitating complex skills in higher education. The analysis showed that the salience of relevant information contributes significantly, even beyond authenticity, to explaining the variation in the effectiveness of simulation-based learning environments. We conclude that authenticity and salience can be assessed independently of each other, and both can be considered important design features in creating effective simulation-based learning environments to meet the needs of learners with different levels of prior knowledge.

Where salience goes beyond authenticity: A meta-analysis on simulation-based learning in higher education

Summary: This meta-analysis builds on and extends the earlier meta-analysis on simulation-based learning in higher education (Chernikova, Heitzmann, Stadler, et al., 2020) by summarizing the findings of 214 empirical studies. The main focus of the current meta-analysis was to investigate the moderating role of prior knowledge and of the learning-environment features authenticity of simulated scenarios (as both functional correspondence with the real task and physical resemblance) and salience of relevant information in fostering complex skills in higher education. The analyses showed that salience of relevant information contributes substantially to explaining the differences in the effectiveness of simulation-based learning environments, even beyond authenticity. We conclude that authenticity and salience can be assessed independently of each other and can be considered important design features in creating effective simulation-based learning environments.

Problem statement

Empirical research provides supportive evidence of the effectiveness of learning with simulations in different domains of higher education (e.g., Chernikova, Heitzmann, Stadler, et al., 2020; Cook et al., 2013; Hegland et al., 2017; Theelen et al., 2019). Simulations can facilitate a wide range of cognitive processes (e.g., Szulewski et al., 2019) and the motivational components of learning (e.g., Helle et al., 2011). However, research also shows that there is no single solution for making simulation-based learning environments effective or for fitting the needs of different learners. First, trained skills can be very different across and within disciplines (e.g., resource management, identifying learning difficulties or misconceptions, history taking, presenting bad news, diagnosing, interpreting radiology images, or calculating drug dosage). Second, a broad range of factors might influence the effectiveness of learning environments (e.g., Heitzmann et al., 2019), including instructional support measures (e.g., scaffolding), learner prerequisites (e.g., prior knowledge, motivation), type of trained skills (e.g., diagnosing), and features of the learning environment itself (e.g., technology use). Many of these factors were explored in a recent meta-analysis by Chernikova, Heitzmann, Stadler, et al. (2020). This research indicated that although tasks from different disciplines seem different, there are important commonalities in the ways in which the complex cognitive skills underlying these tasks (e.g., problem solving) can be trained. This, in turn, makes the transfer of good practices and effective solutions across domains promising. However, much of the variance in the effectiveness of simulations remained unexplained. Therefore, this meta-analysis investigated some further design factors in more detail.

The current study focuses on the salience of information as a design feature, which, together with the authenticity of learning situations, can be helpful in further explaining the effects of simulations in higher education. This assumption relates to one of the key features of simulations: the opportunity to approximate learning settings for practice purposes (e.g., Grossman et al., 2009; Codreanu et al., 2020). This can be done by upholding the complexity of the real situation while highlighting particular parts to make them easier to understand and learn. The correspondence of learning settings and actual tasks/practice scenarios is conjectured to be central (Morris et al., 1977), but supporting learners with different means of instructional support is also seen as crucial for effective learning (e.g., Belland et al., 2017). One of the main functions of instructional support is to direct learners' attention to the particular information or procedures needed to deal with a learning scenario (i.e., to make specific, relevant information salient for the learner). However, it remains unclear whether additional instruction can harm the authenticity of a learning situation. In this case, increasing the salience of information in a simulation can be associated with decreased authenticity and effectiveness. Alternatively, the salience of information and authenticity can contribute to effective learning independently of each other. A list of primary studies can be found in the electronic supplementary material (ESM) 1.

Theoretical background

Simulation-based learning in higher education

Simulations are increasingly often and effectively used in many different higher education settings, such as STEM education (e.g., D’Angelo et al., 2016), medicine and nursing (e.g., Cook et al., 2013; Hegland et al., 2017; White et al., 2018), teacher education (e.g., Codreanu et al., 2021; Kramer et al., 2020; Theelen et al., 2019), and engineering and management (e.g., Alfred & Chung, 2011). The average effect of simulations on facilitating complex skills in higher education is large (Chernikova, Heitzmann, Stadler, et al., 2020). However, little is known about the factors (e.g., design principles) that contribute to these highly positive effects. Therefore, systematic research about the effective design of simulation-based learning environments with a focus on particularly effective scenarios and design features can contribute greatly to the interdisciplinary and cross-domain implementations of simulation-based learning and the transfer of discoveries and good practices from one field of research to another (Heitzmann et al., 2019).

Authenticity of learning environments

Simulation-based learning can be viewed as an approximation of practice (Grossman et al., 2009). It is therefore important to consider the extent to which a simulation represents real-world practice in relation to the requirements placed on the learner, the type of simulated situation, and the environment as well as the participants involved (e.g., Allen et al., 1991). With regard to academic professions, practice opportunities, including authentic problems like those in a professional field, are considered beneficial (e.g., Barab et al., 2000). Likewise, authenticity can be defined as the “degree of resemblance” (Gulikers et al., 2004, p. 67) or correspondence between the characteristics of a learning environment and the characteristics of the actual tasks in higher education and professional life (Hamstra et al., 2014). These characteristics and, consequently, their authenticity, can be visible on a physical level (i.e., physical resemblance) – for example, if a training room looks like a school classroom for pre-service teachers or like a physician's office or operating room for medical students – or remain at the task level (i.e., functional correspondence) – for example, when the simulated tasks correspond to the actual ones, like grading student tests or looking for specific symptoms in a patient's medical history.

Empirical research has provided mixed evidence regarding the ways in which authenticity can affect learning. On one hand, full authenticity has not always been shown to be beneficial to learning (e.g., Henninger & Mandl, 2000). On the other hand, increased authenticity is associated with increased knowledge gains in learners with both more and less prior knowledge (e.g., Chernikova, Heitzmann, Stadler, et al., 2020).

There are several possible reasons for this ambiguity in the findings. First, there is disagreement about how to define the correspondence between real practice and learning environments, and different terms – such as fidelity, authenticity, and realistic design of simulations – are sometimes used interchangeably. Some researchers (e.g., Hamstra et al., 2014) have claimed that functional correspondence is more important than physical resemblance for facilitating complex skills; however, systematic research is scarce. Second, simulations designed to be highly authentic are not always perceived as such by learners. One possible reason is that novices do not know the relevant aspects of practice as well as experts do.

To address the complexity of the authenticity construct, this meta-analysis includes both aspects of authenticity: functional correspondence between a simulated scenario and the context of real situations, and the physical resemblance of the elements of the learning environment. We also explicitly distinguish between these two aspects of authenticity to determine their role in designing simulation-based learning environments.

Salience of relevant information

Although we admit that the salience of information can hardly be viewed independently of learners' prerequisites (e.g., prior knowledge), the current study focuses on the salience of information as a design feature and defines it as the visibility of relevant information to the learner in the learning situation (Machts et al., 2023). Relevant information, in turn, is needed to make the decision or solve the simulated scenario. For the purpose of this meta-analysis, we utilized the concept of salience as a feature of the stimuli (i.e., relevant information for a particular task), which could be generalized across participating individuals (Higgins, 1996). In other words, we assume that designers of learning environments can make relevant information more or less visible to learners, independently of their prior knowledge or experience, by using design principles within simulation-based learning environments.

The salience of information, in turn, can vary naturally (e.g., due to the nature of the situation) or be manipulated (enhanced or decreased) via instructional support. Enhancing salience is typically associated with reducing the complexity of a task or scenario and making relevant information more visible to facilitate learning [e.g., the decomposition and approximation of practice in Grossman et al. (2009)]. We assume that an artificial increase or decrease in salience might, at some point, lead to a reduction in the authenticity of learning situations.

Role of prior knowledge

Professional knowledge can be regarded as a fundamental prerequisite for the further development of skills and competences (e.g., Sweller, 2005) as well as one of the most relevant predictors for the effectiveness of particular instructional support measures (e.g., Chernikova, Heitzmann, Fink, et al., 2020). The current meta-analysis study follows the approach described in previous meta-analyses by Chernikova, Heitzmann, Fink, et al. (2020) and Chernikova, Heitzmann, Stadler, et al. (2020) to define prior knowledge through the learners' familiarity with the context, as reported in the primary studies.

We assume that prior knowledge can moderate the effects of authenticity and salience within simulation-based learning environments on learning gains (e.g., the development of complex skills). Learners with superior prior knowledge can better deal with the complexities of authentic scenarios (e.g., Chernikova, Heitzmann, Stadler, et al., 2020); their prior knowledge can also compensate for the lack of salience of relevant information (i.e., enable them to find information hidden within the learning environment). Learners with little prior knowledge, in turn, would be more dependent on the direct visibility of relevant information and might be overwhelmed by abstract scenarios (i.e., those with low authenticity and/or little physical resemblance).

Instructional support

Instructional support can address many different goals. It can be implemented to facilitate engagement (e.g., Casey et al., 2014), reduce the complexity of a task by modeling target behavior or providing worked-out examples (e.g., Renkl, 2014), prompt different processes (e.g., Quintana et al., 2004) or provide the opportunity to reflect on past or intended steps (e.g., Mann et al., 2009), and give feedback on different features of performance (e.g., Narciss, 2013). Empirical research also provides evidence of the positive effects of scaffolding on learning (e.g., Belland et al., 2017; Gegenfurtner et al., 2014). The meta-analysis by Chernikova, Heitzmann, Fink, et al. (2020) suggested that hardly any interventions in higher education aimed at fostering complex skills and competences were performed without at least some kind of instructional support (e.g., scaffolding, feedback, or explicit presentation of additional information).

We assume that making relevant information more salient with instructional support can compensate for a lack of prior knowledge in learners and enhance learning gains. However, simplifying or oversimplifying the task as a result of such instructional support could reduce the authenticity of the learning scenario. Thus, since both authenticity and scaffolding seem to be relevant for learning, there remain open questions about how these two characteristics of simulated learning environments interact with each other and whether and how prior knowledge moderates those interactions.

Research questions

RQ1: How do authenticity features (functional correspondence and physical resemblance) contribute to the effectiveness of a simulation-based learning environment?

In line with the initial findings of Chernikova, Heitzmann, Stadler, et al.’s (2020) meta-analysis, we expect that a higher degree of authenticity will be associated with better learning gains (i.e., advancement of complex skills). Furthermore, we assume that functional correspondence will play a greater role than physical resemblance in explaining the effectiveness of simulations (e.g., Hamstra et al., 2014).

RQ2: To what extent does the salience of information contribute to learning gains?

We expect that increased salience of relevant information (originating from features of the real task or artificially increased through instructional support) will be associated with increased learning gains.

RQ3: To what extent does prior knowledge moderate the effects of salience and authenticity on learning gains?

In line with prior research (Chernikova, Heitzmann, Stadler, et al., 2020), more authenticity is expected to be beneficial, independent of the level of prior knowledge. Less authentic scenarios might be more challenging for learners with little prior knowledge than for those with extensive prior knowledge. We assume that the level of salience will have different effects on learners with different levels of prior knowledge. For example, if salience is low (i.e., relevant information requires specific knowledge and skills to be noticed), learners with higher prior knowledge will be able to benefit more from such simulations than learners with low prior knowledge. Furthermore, in line with previous research on simulation-based learning (e.g., Chernikova, Heitzmann, Stadler, et al., 2020), we assume that learners with low prior knowledge will benefit the most from enhanced salience of relevant information (i.e., higher levels of instructional support aimed at increasing the salience of relevant information).

Method

Literature search and screening: Inclusion and exclusion criteria

The current meta-analysis updates the findings of Chernikova, Heitzmann, Stadler, et al. (2020) by fully adopting the previous study's search strategies and eligibility criteria and performing a search for studies that appeared in selected databases (PsycINFO, PsycARTICLES, ERIC, and MEDLINE) from April 2018 to 20 December 2020. The keywords used were simulat*, competenc*, skill*, teach*, and medic*.

This meta-analysis included studies that focused on complex skills, such as diagnosis, decision-making, problem solving, and planning. Studies that focused solely on manual/motor skills were excluded. Only studies that used simulations to facilitate knowledge, skills, or competencies and reported objective measures of learning outcomes were included. Studies also had to report the “no simulation” control conditions (e.g., pre-test or control group) and relevant statistical values (Figure 1).

Figure 1 Study selection procedure

Coding procedures

Coding quality was ensured through multiple iterations, including coder training (N = 6 coders) and estimation of inter-rater agreement. During coder training, 50% of the primary studies were double-coded (Cohen's kappa above 0.85). All discrepancies were discussed until full (100%) agreement was reached, after which all studies (including the training material) were coded independently by the same coders (an author and student research assistants). Studies that did not provide enough information for an unambiguous decision were excluded from the respective parts of the analysis.
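The agreement statistic used during coder training can be computed directly. Below is a minimal sketch in Python; the two coders' ratings of ten double-coded studies are invented for illustration and are not taken from the actual coding data:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: chance-corrected agreement between two coders."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed proportion of agreement
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement, from each coder's marginal frequencies
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] / n * freq_b[c] / n for c in set(freq_a) | set(freq_b))
    return (observed - expected) / (1 - expected)

# Hypothetical double-coding of the salience feature for ten studies
a = ["high", "low", "enhanced", "high", "low", "low", "high", "enhanced", "low", "high"]
b = ["high", "low", "enhanced", "high", "low", "high", "high", "enhanced", "low", "high"]
print(round(cohens_kappa(a, b), 3))  # → 0.844
```

Because kappa corrects raw agreement for the agreement expected by chance, the threshold of 0.85 reported above is considerably stricter than 85% raw agreement.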

To address the research questions of the current study, the following features were coded:

Authenticity. Functional correspondence was coded as low if the simulation (task, scenario) resembled reality (real tasks and activities) to a low degree and as high if the tasks were identical or very similar to real tasks or activities (e.g., teaching in a simulated classroom, using high-fidelity simulators in medicine, or whole-facility simulations with different professionals involved in real time). Examples of high and low functional correspondence were taken from the previous meta-analysis and integrated into the coding scheme. Physical resemblance was coded as high if the simulation used real or realistic equipment, was performed in a room similar to the real environment, or involved real people/actors. Physical resemblance was coded as low if the simulation settings did not reflect realistic situations to a high degree (e.g., using virtual tools or interacting with objects or avatars that one does not usually come across in reality).

Salience of information was coded high if all the relevant information (for performing tasks) was easily accessible, or low if relevant information was hidden within the learning context but could be accessed with specific actions or if it required specific knowledge to become accessible. If studies explicitly mentioned that instructional support was provided to make relevant information visible (prompts, feedback, etc.), salience was coded as enhanced. Note that enhanced salience refers to initially low salience that was artificially enhanced with instructional support to become high, and not to naturally occurring high salience (due to the nature of the situation).

Prior knowledge (familiarity of context) was coded as high if studies mentioned that learners were trained in a familiar context (e.g., they knew the relevant concepts and procedures or had performed similar tasks before), as low if learners were trained in an unfamiliar context (e.g., new situations or tasks that had never been performed before), or as mixed if some learners in the group were familiar with the context and some were not (e.g., if students and professionals were trained in one simulation).

Additionally, the following control variables were coded: type and year of publication, domain, and study design (e.g., experimental, quasi-experimental, or one-group design). Type of control was coded as baseline if the effects of the simulations were controlled by pre-tests only, pure if control conditions had no instructions (e.g., waiting control), or instructed if control conditions received other types of instructional support on the same topic but had no simulations.

Statistical analysis

A random-effects model and Hedges g estimation were applied (Borenstein et al., 2009; Schmidt & Hunter, 2014), with corrections for sample size, correlated samples (Tanner-Smith et al., 2016), and pre-test differences between groups. Procedures to detect and correct for publication bias (e.g., funnel plot-based methods; see Figure 3) were performed as part of the preliminary analysis (Carter et al., 2017). We used the “metafor” and “robumeta” packages in R (RStudio Version 1.3.1093, 2020) to perform the analysis.
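For readers unfamiliar with the effect size metric, the core of Hedges g is Cohen's d with a small-sample correction factor J (Borenstein et al., 2009). The following Python sketch illustrates only this basic formula with invented group statistics; it does not reproduce the authors' full pipeline, which additionally corrects for correlated samples and pre-test differences:

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges g: standardized mean difference with small-sample correction."""
    df = n1 + n2 - 2
    s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df)
    d = (m1 - m2) / s_pooled          # Cohen's d
    j = 1 - 3 / (4 * df - 1)          # correction factor J
    return j * d

# Hypothetical simulation vs. no-simulation comparison (invented numbers)
print(round(hedges_g(m1=78.0, sd1=10.0, n1=30, m2=68.0, sd2=10.0, n2=30), 2))  # → 0.99
```

With two groups of 30, the correction shrinks d only slightly (J ≈ 0.987); its influence grows as samples get smaller.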

Figure 3 Funnel plot with (a) and without (b) outliers (N = 74)

Results

After deleting duplicates, the search resulted in 2,373 articles about medical and teacher education, counseling, engineering, management, and nursing. During abstract and full-text screening, studies that did not meet the inclusion criteria were excluded; all other studies were included in the next step of screening and analysis. In total, 79 studies met the eligibility criteria (Figure 1). No publication bias was detected, but five outlier studies reporting effect sizes above g = 4 were removed from the analysis (see Figure 3). This resulted in 74 studies (see Figure 2 for an overview) and 246 comparisons (Table E1 in ESM 1). In addition, the studies used in the meta-analysis by Chernikova, Heitzmann, Stadler, et al. (2020) were re-analyzed to address our research questions (N = 140; five studies with g > 4 were excluded to maintain consistency). The total sample comprised 214 studies (654 comparisons).

Figure 2 Forest plot without outliers (N = 74)

The meta-regression on control variables (year of publication, publication type, study design, type of control, and domain) showed that these factors did not explain a statistically significant amount of variance between study effects. The model that included prior knowledge as the only moderator explained 0.91% of the variance. Adding authenticity increased the explained variance to 1.65%, and the model that included prior knowledge, authenticity, and salience of relevant information accounted for 8.5% of the variance. The summary effect of simulation-based learning on complex skills compared to no-simulation conditions with RVE correction was Hedges g = 0.94, SE = 0.07, p < 0.001, N = 214 (a large effect). Effects were highly heterogeneous [Q(df = 653) = 7751.74; p < 0.001; I² = 95.67%], suggesting an analysis of potential moderators.
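The reported heterogeneity statistics (Cochran's Q and I²) follow from the per-comparison effect sizes and their sampling variances. A minimal Python sketch, using three invented comparisons rather than the actual 654, is:

```python
def heterogeneity(effects, variances):
    """Cochran's Q and I² for effect sizes with known sampling variances."""
    weights = [1 / v for v in variances]  # inverse-variance weights
    g_bar = sum(w * g for w, g in zip(weights, effects)) / sum(weights)
    q = sum(w * (g - g_bar) ** 2 for w, g in zip(weights, effects))
    df = len(effects) - 1
    # I²: share of observed variation attributable to true heterogeneity
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, i2

# Three hypothetical comparisons with equal sampling variance
q, i2 = heterogeneity([0.2, 0.9, 1.6], [0.04, 0.04, 0.04])
print(round(q, 1), round(i2, 1))  # → 24.5 91.8
```

An I² near 96%, as in the present sample, means almost all observed variation between comparisons reflects true heterogeneity rather than sampling error, which is what motivates the moderator analyses that follow.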

RQ1: Effects of authenticity

The effects of the simulations associated with both high functional correspondence (g = 1.01; SE = 0.07; p < 0.001; N = 173) and low functional correspondence (g = 0.58; SE = 0.19; p < 0.001; N = 23) reached statistical significance. The difference between the effects in favor of high authenticity was statistically significant (p = 0.028). The level of physical resemblance of the learning scenario did not have a significant impact on the effects of the simulation; the difference in the effects was non-significant (p = 0.152). High resemblance yielded g = 1.04, SE = 0.08, p < 0.001, and N = 112; low resemblance yielded g = 0.84, SE = 0.11, p < 0.001, and N = 11.

RQ2: Effects of salience

Simulations with low salience of information reached significantly lower effects than those with high salience of information (the difference was significant at p = 0.001). The effects of enhanced salience point to the possibly high benefits of instructional support. The effects of high salience were g = 1.09, SE = 0.11, p < 0.001, and N = 77; of low salience, g = 0.60, SE = 0.07, p < 0.001, and N = 71; and of enhanced salience, g = 1.56, SE = 0.15, p < 0.001, and N = 31. The difference between the effects of high and enhanced salience also reached statistical significance (p = 0.014).

RQ3: Role of prior knowledge and interaction effects

In line with previous research, simulation-based learning was found to be effective for all groups of learners, independent of their prior knowledge (differences between the groups were statistically non-significant). Simulations had average effects of g = 0.99, SE = 0.09, p < 0.001, and N = 92 on learners with high prior knowledge and of g = 0.87, SE = 0.10, p < 0.001, and N = 104 on learners with low prior knowledge. The results for the mixed groups were inconclusive. Therefore, mixed groups were excluded from further analyses.

The interaction between the effects of prior knowledge and simulation features can be found in Tables 1 and 2. The results emphasize the importance of high physical resemblance for learners with high prior knowledge and the positive effects of salience enhanced through scaffolding for all groups of learners.

Table 1 Effects of authenticity on the learning gains of learners with different levels of prior knowledge
Table 2 Added value of salience for learners with different levels of prior knowledge

Discussion

This meta-analysis explored the role of the simulation design features authenticity and salience, together with the learner prerequisite prior knowledge, in facilitating complex skills in higher education. The results indicate that a model with the above-mentioned features can explain around 8.5% of the variance in the effectiveness of simulations and can provide insights for the design of future effective simulations.

In line with previous research (e.g., Chernikova, Heitzmann, Stadler, et al., 2020), the current meta-analysis found that high authenticity of simulations is associated with higher learning gains. Additionally, although only a limited number of studies with low authenticity were found, the results indicate that simulations with low authenticity can also be effective for learners with high prior knowledge. The proportion of variance explained by authenticity, however, is not impressive. Authenticity on its own can account for only about 2% of the variance in the effectiveness of simulations, which stands in some contrast to the role it plays in the discussion around designing learning environments (e.g., Andersson & Andersson, 2005; Strobel et al., 2013) – simulation-based learning in particular. Several factors explain much more variance [e.g., scaffolding (Belland et al., 2017), type of simulation (Chernikova, Heitzmann, Stadler, et al., 2020), and feedback (Wisniewski et al., 2020)].

What contributes beyond authenticity to explaining the variability of acquired skills in simulation-based learning is the salience of relevant information. Higher salience, understood as higher visibility of relevant information, is associated with higher learning gains for both low and high authenticity, independent of prior knowledge. This finding illustrates a positive use of saliency bias (e.g., Bordalo et al., 2012): the most visible information is simultaneously the information relevant for decision-making and problem solving. Salience enhanced through instructional support has the potential to at least double learning gains for complex skills such as diagnosing or problem solving (compared with low salience) in highly authentic simulations, for learners with both low and high levels of prior knowledge.

The instructional dilemma between maintaining authenticity and making simulated scenarios accessible enough to learn from seems resolvable: the cognitive overload caused by authentic information can be reduced by artificially (i.e., instructionally) increasing the salience of relevant information. Therefore, given the design choice between reducing authenticity and increasing information salience, the results of this study suggest increasing salience and keeping authenticity at a high level.

This meta-analysis also contributes to the discussion initiated by Hamstra et al. (2014), who claim that functional correspondence might be a more important feature of authenticity than physical resemblance. Our analysis is in line with this claim. Holding the level of functional correspondence constant, physical resemblance does not contribute significantly to differentiating between effects if the level of prior knowledge is low. However, we also found that low physical resemblance can be critical for learners with high prior knowledge. No clear recommendation is possible at this point due to insufficient data.

Limitations and further research

Several limitations must be considered. First, the categorization of authenticity and salience as high and low is relatively coarse-grained and may be confounded with some implicit variables within the simulation-based learning environments used in primary studies, such as the sequencing of scenarios of different typicality or the multi-media design of the studies. It is also possible that we underestimated the variance explained by authenticity because the sample of primary studies did not contain much variance. Measuring authenticity in meta-analyses is also a challenging task and may have validity problems. Second, the descriptions of the learning environments in the primary studies were often quite scarce and did not allow for conducting some parts of the analysis (e.g., exploring effects of salience for mixed groups). Third, this research addresses only cognitive measures of learning outcomes, not motivational-affective outcomes or process measures, which might also benefit from high authenticity (e.g., Casey et al., 2014).

Furthermore, more insights for theory and practice can be obtained with a finer-grained analysis of different fields and disciplines (e.g., simulations in the fields of radiology, nursing, special education, etc.) and specific skills trained (e.g., communication skills, technical performance, etc.). In our view, it is promising to collect and aggregate good practices and evidence from different fields to facilitate further transfer.

Conclusion

The results of the current meta-analysis show that authenticity and salience can be assessed independently of each other and both can be considered important design features in creating effective simulation-based learning environments.

Although more research is necessary for a finer-grained differentiation of the different aspects of authenticity and salience for designing learning scenarios [e.g., task, activity, and information levels; see Machts et al. (2023)], this study can already offer some insights for the effective design of simulation-based learning environments. For learners with high prior knowledge, physical resemblance of the learning settings is crucial, but they seem able to manage low salience of relevant information and apply their knowledge and expertise to get the information they need. Learners with low prior knowledge are less influenced by the physical resemblance of the learning settings if the tasks are relevant and similar to the actual ones, and the relevant information is highlighted.

Electronic supplementary material

The electronic supplementary material (ESM) is available with the online version of the article at https://doi.org/10.1024/1010-0652/a000357

References

  • Alfred, M. & Chung, C. (2011). Design, development, and evaluation of a second generation interactive Simulator for Engineering Ethics Education (SEEE2). Science and Engineering Ethics, 18(4), 689–697. https://doi.org/10.1007/s11948-011-9284-0

  • Allen, J. A., Buffardi, L. C. & Hays, R. T. (1991). The relationship of simulator fidelity to task and performance variables (Report No. ARI-91-58). Army Research Institute for the Behavioral and Social Sciences. https://doi.org/10.21236/ADA238941

  • Andersson, S. & Andersson, I. (2005). Authentic learning in a sociocultural framework: A case study on non-formal learning. Scandinavian Journal of Educational Research, 49(4), 419–436. https://doi.org/10.1080/00313830500203015

  • Barab, S. A., Squire, K. D. & Dueber, W. (2000). A co-evolutionary model for supporting the emergence of authenticity. Educational Technology Research and Development, 48(2), 37–62. https://doi.org/10.1007/BF02313400

  • Belland, B. R., Walker, A. E., Kim, N. J. & Lefler, M. (2017). Synthesizing results from empirical research on computer-based scaffolding in STEM education: A meta-analysis. Review of Educational Research, 87(2), 309–344. https://doi.org/10.3102/0034654316670999

  • Bordalo, P., Gennaioli, N. & Shleifer, A. (2012). Salience theory of choice under risk. The Quarterly Journal of Economics, 127(3), 1243–1285. https://doi.org/10.1093/qje/qjs018

  • Borenstein, M., Hedges, L. V., Higgins, J. P. & Rothstein, H. R. (2009). Introduction to meta-analysis. Wiley. https://doi.org/10.1002/9780470743386

  • Carter, E., Schönbrodt, F., Gervais, W. M. & Hilgard, J. (2017). Correcting for bias in psychology: A comparison of meta-analytic methods. Advances in Methods and Practices in Psychological Science, 2(2), 115–144. https://doi.org/10.1177/2515245919847196

  • Casey, M. M., Bates, S. P., Galloway, K. W., Galloway, R. K., Hardy, J. A., Kay, A. E., Kirsop, P. & McQueen, H. A. (2014). Scaffolding student engagement via online peer learning. European Journal of Physics, 35(4), 045002. https://doi.org/10.1088/0143-0807/35/4/045002

  • Chernikova, O. , Heitzmann, N. , Fink, M. C. , Timothy, V. , Seidel, T. & Fischer, F. (2020). Facilitating diagnostic competences in higher education – A meta-analysis in medical and teacher education. Educational Psychology Review, 32(1), 157–196. https://doi.org/10.1007/s10648-019-09492-2 First citation in articleCrossrefGoogle Scholar

  • Chernikova, O. , Heitzmann, N. , Stadler, M. , Holzberger, D. , Seidel, T. & Fischer, F. (2020). Simulation-based learning in higher education: A meta-analysis. Review of Educational Research, 90(4), 499–541. https://doi.org/10.3102/0034654320933544 First citation in articleCrossrefGoogle Scholar

  • Codreanu, E. , Sommerhoff, D. , Huber, S. , Ufer, S. & Seidel, T. (2020). Between authenticity and cognitive demand: Finding a balance in designing a video-based simulation in the context of mathematics teacher education. Teaching and Teacher Education, 95, 103146. https://doi.org/10.1016/j.tate.2020.103146 First citation in articleCrossrefGoogle Scholar

  • Codreanu, E. , Sommerhoff, D. , Huber, S. , Ufer, S. & Seidel, T. (2021). Exploring the process of preservice teachers' diagnostic activities in a video-based simulation. Frontiers in Education, 6(133). https://doi.org/10.3389/feduc.2021.626666 First citation in articleCrossrefGoogle Scholar

  • Cook, D. , Hamstra, S. , Brydges, R. , Zendejas, B. , Szostek, J. , Wang, A. , Erwin, P. & Hatala, R. (2013). Comparative effectiveness of instructional design features in simulation-based education: Systematic review and meta-analysis. Medical Teacher, 35(1), 872–883. https://doi.org/10.3109/0142159X.2012.714886 First citation in articleCrossrefGoogle Scholar

  • D'Angelo, C. M. , Rutstein, D. W. & Harris, C. J. (2016). Learning with STEM Simulations in the Classroom: Findings and Trends from a Meta-Analysis. Educational Technology archive, 56(3), 58–61. https://www.learntechlib.org/p/175765/ First citation in articleGoogle Scholar

  • Gegenfurtner, A. , Quesada-Pallarès, C. & Knogler, M. (2014). Digital simulation-based training: A meta-analysis. British Journal of Educational Technology, 45(6), 1097–1114. https://doi.org/10.1111/bjet.12188 First citation in articleCrossrefGoogle Scholar

  • Grossman, P. , Compton, C. , Igra, D. , Ronfeldt, M. , Shahan, E. & Williamson, P. (2009). Teaching practice: A cross-professional perspective. Teachers College Record, 111(9), 2055–2100. https://doi.org/10.1177/016146810911100905 First citation in articleCrossrefGoogle Scholar

  • Gulikers, J. T. M. , Bastiaens, T. J. & Kirschner, P. A. (2004). A five-dimensional framework for authentic assessment. Educational Technology Research and Development, 52(1), 67–85. https://doi.org/10.1007/BF02504676 First citation in articleCrossrefGoogle Scholar

  • Hamstra, S. , Brydges, R. , Hatala, R. , Zendejas, B. & Cook, D. (2014). Reconsidering fidelity in simulation-based training. Academic Medicine, 89(3), 387–392. https://doi.org/10.1097/ACM.0000000000000130 First citation in articleCrossrefGoogle Scholar

  • Hegland, P. A. , Aarlie, H. , Strømme, H. & Jamtvedt, G. (2017). Simulation-based training for nurses: Systematic review and meta-analysis. Nurse Education Today, 54(1), 6–20. https://doi.org/10.1016/j.nedt.2017.04.004 First citation in articleCrossrefGoogle Scholar

  • Heitzmann, N. , Seidel, T. , Opitz, A. , Hetmanek, A. , Wecker, C. , Fischer, M. , Ufer, S. , Schmidmaier, R. , Neuhaus, B. , Siebeck, M. , Stürmer, K. , Obersteiner, A. , Reiss, K. , Girwidz, R. & Fischer, F. (2019). Facilitating diagnostic competences in simulations: A conceptual framework and a research agenda for medical and teacher education. Frontline Learning Research, 7(4), 1–24. https://doi.org/10.14786/flr.v7i4.384 First citation in articleCrossrefGoogle Scholar

  • Helle, L. , Nivala, M. , Kronqvist, P. , Gegenfurtner, A. , Björk, P. & Säljö, R. (2011). Traditional microscopy instruction versus process-oriented virtual microscopy instruction: A naturalistic experiment with control group. Diagnostic Pathology, 6(1), 81–89. https://doi.org/10.1186/1746-1596-6-S1-S8 First citation in articleCrossrefGoogle Scholar

  • Henninger, M. & Mandl, H. (2000). Vom Wissen zum Handeln – ein Ansatz zur Förderung kommunikativen Handelns [From knowledge to action – An approach for fostering communicative behavior]. In H. Mandl J. Gerstenmaier (Eds.), Die Kluft zwischen Wissen und Handeln (pp. 197–219). Hogrefe. First citation in articleGoogle Scholar

  • Higgins, E. T. (1996). Knowledge activation: Accessibility, and salience. In E. T. Higgins A. W. Kruglanski (Eds.), Social psychology: Handbook of basic principles (pp. 133–168). Guilford Press. First citation in articleGoogle Scholar

  • Kramer, M. , Förtsch, C. , Stürmer, J. , Förtsch, S. , Seidel, T. & Neuhaus, B. J. (2020). Measuring biology teachers' professional vision: Development and validation of a video-based assessment tool. Cogent Education, 7(1). https://doi.org/10.1080/2331186X.2020.1823155 First citation in articleCrossrefGoogle Scholar

  • Machts, N. , Chernikova, O. , Jansen, T. , Weidenbusch, M. , Fischer, F. & Möller, J. (2023). Categorization of simulated diagnostic situations and the salience of diagnostic information – Conceptual framework. Zeitschrift für Pädagogische Psychologie. https://doi.org/10.1024/1010-0652/a000364 First citation in articleLinkGoogle Scholar

  • Mann, K. , Gordon, J. & MacLeod, A. (2009). Reflection and reflective practice in health professions education: A systematic review. Advances in Health Sciences Education, 14(4), 595–621. https://doi.org/10.1007/s10459-007-9090-2 First citation in articleCrossrefGoogle Scholar

  • Morris, C. D. , Bransford, J. D. & Franks, J. J. (1977). Levels of processing versus transfer appropriate processing. Journal of Verbal Learning and Verbal Behavior, 16(5), 519–533. First citation in articleCrossrefGoogle Scholar

  • Narciss, S. (2013). Designing and evaluating tutoring feedback strategies for digital learning. Digital Education Review, 23(1), 7–26. First citation in articleGoogle Scholar

  • Quintana, C. , Reiser, B. J. , Davis, E. A. , Krajcik, J. , Fretz, E. , Duncan, R. G. , Kyza, E. , Edelson, D. , Soloway, E. (2004). A scaffolding design framework for software to support science inquiry. Journal of the Learning Sciences, 13(3), 337–386. https://www.tandfonline.com/doi/abs/10.1207/s15327809jls1303_4 First citation in articleCrossrefGoogle Scholar

  • Renkl, A. (2014). Toward an instructionally oriented theory of example-based learning. Cognitive Science, 38(1), 1–37. https://onlinelibrary.wiley.com/doi/full/10.1111/cogs.12086 First citation in articleCrossrefGoogle Scholar

  • Schmidt, F. L. & Hunter, J. E. (2014). Methods of meta-analysis: Correcting error and bias in research findings (3rd ed.). Sage. First citation in articleGoogle Scholar

  • Strobel, J. , Wang, J. , Weber, N. R. & Dyehouse, M. (2013). The role of authenticity in design-based learning environments: The case of engineering education. Computers in Education, 64(5), 143–152. https://doi.org/10.1016/j.compedu.2012.11.026 First citation in articleCrossrefGoogle Scholar

  • Sweller, J. (2005). Implications of cognitive load theory for multimedia learning. The Cambridge handbook of multimedia learning, 3(2), 19–30. First citation in articleCrossrefGoogle Scholar

  • Szulewski, A. , Braund, H. , Egan, R. , Gegenfurtner, A. , Hall, A. K. , Howes, D. , Dagnone, J. D. & Van Merriënboer, J. J. G. (2019). Starting to think like an expert: An analysis of resident cognitive processes during simulation-based resuscitation examinations. Annals of Emergency Medicine, 74(5), 647–659. https://doi.org/10.1016/j.annemergmed.2019.04.002 First citation in articleCrossrefGoogle Scholar

  • Tanner-Smith, E. , Tipton, E. & Polanin, J. (2016). Handling complex meta-analytic data structures using robust variance estimates: A tutorial, Journal of Developmental and Life-Course Criminology, 2(1), 85–112. h ttps://link.springer.com/article/10.1007/s40865-016-0026-5 First citation in articleCrossrefGoogle Scholar

  • Theelen, H. , Beemt, A. V. & Brok, P. D. (2019). Classroom simulations in teacher education to support preservice teachers' interpersonal competence: A systematic literature review. Computers and Education, 129, 14–26. https://doi.org/10.1016/j.compedu.2018.10.015 First citation in articleCrossrefGoogle Scholar

  • White, M. R. , Braund, H. , Howes, D. , Egan, R. , Gegenfurtner, A. , Van Merriënboer, J. J. G. & Szulewski, A. (2018). Getting inside the expert's head: An analysis of physician cognitive processes during trauma resuscitations. Annals of Emergency Medicine, 72(3), 289–298. https://doi.org/10.1016/j.annemergmed.2018.03.005 First citation in articleCrossrefGoogle Scholar

  • Wisniewski, B. , Zierer, K. & Hattie, J. (2020). The power of feedback revisited: A meta-analysis of educational feedback research. Frontiers in Psychology, 10, 3087. https://doi.org/10.3389/fpsyg.2019.03087 First citation in articleCrossrefGoogle Scholar