Theoretical Article (Open Access)

Infusing Context Into Emotion Perception Impacts Emotion Decoding Accuracy

A Truth and Bias Model

Published online: https://doi.org/10.1027/1618-3169/a000531

Abstract

The accurate decoding of facial emotion expressions lies at the center of many research traditions in psychology. Much of this research, while paying lip service to the importance of context in emotion perception, has used stimuli that were carefully created to be devoid of contextual information. The participants' task is to associate the expression shown in the face with a correct label, essentially turning a social perception task into a cognitive task. In fact, in many cases, the task can be carried out correctly without engaging emotion recognition at all. The present article argues that infusing context into emotion perception not only adds a source of information but also changes the way that participants approach the task, rendering it a social perception task rather than a cognitive task. Importantly, distinguishing between accuracy (perceiving the intended emotions) and bias (perceiving emotions in addition to those intended) leads to a more nuanced understanding of social emotion perception. Results from several studies that use the Assessment of Contextualized Emotion demonstrate the significance and social functionality of simultaneously considering emotion decoding accuracy and bias for social interaction in different cultures, their key personality and societal correlates, and their function for close relationship processes.

Most interactions – even trivial ones – are colored by emotion. Be it the salesperson trying to sound enthusiastic about a product or a loved one complaining about their problems, emotions play a central role in everyday human communication. As such, the ability to perceive and understand the emotions of others is a central social skill (Salovey & Mayer, 1989–1990). The accurate recognition of emotions (Emotion Decoding Accuracy, EDA)1 is central to the regulation of social and personal relationships (Manstead et al., 1999) because it facilitates coordination with others and communication in general, and provides the necessary affective glue in dyadic interactions (Feldman et al., 1991; Niedenthal & Brauer, 2012). Emotions can be expressed via voice, body postures, gestures (e.g., Bänziger et al., 2009), and even through touch (Hertenstein et al., 2006), yet one of the most prominent sources of emotion communication is the face. In what follows, we focus on facial expressions of emotions and their accurate decoding. However, it should be noted that the basic principle of what we propose is not in any way limited to facial affect but applies to emotion communication in general.

Although the central role of EDA is acknowledged in theoretical thinking, the available evidence regarding relationships between EDA and key psychological outcomes/correlates such as social relations or personality variables is limited (Elfenbein et al., 2002). In the present article, we argue that to reveal relationships between EDA and social relations outcomes such as interaction quality, conflict, negotiation tactics, or personality variables, EDA has to be conceived of as a social perception task rather than as a cognitive task, and that this is achieved by measuring EDA in a social context. Specifically, we will argue that facial expressions can be decoded in two different ways, through pattern matching and through perspective taking, but that the commonly used approach to EDA engages only the cognitive route, in tasks that are less likely to reflect social competencies. We propose that a task that infuses context into EDA better captures how observers perceive emotion expressions. Furthermore, such an approach can distinguish between EDA (perceiving the intended emotions) and bias (perceiving emotions in addition to those expressed) and thus is better suited to capture social emotion perception as it occurs in everyday life.

Results from several studies that assessed EDA using the Assessment of Contextualized Emotion – a model and method that infuses context into emotion perception and distinguishes between EDA and bias – demonstrate the utility of this approach for understanding the role of EDA in social interaction in different cultures, its key personality and societal correlates, and its function for close relationship processes.

EDA Research: The Tradition

Research on EDA has traditionally focused on questions related to the universality of emotion expressions (see, e.g., Hess, 2017) or on how factors such as sex (Hall et al., 2000), cultural background (Elfenbein & Ambady, 2002), or psychiatric status (Penton-Voak et al., 2017) may influence accuracy. The goal of this line of research was to assess whether some groups make more errors in decoding than others, where errors are defined as selecting the wrong label for the expression shown. In these studies, participants were therefore presented with prototypical facial expressions drawn from standardized sets of facial expressions such as the Pictures of Facial Affect (Ekman & Friesen, 1976) or with (facial) recognition tests such as the Diagnostic Analysis of Nonverbal Accuracy (Nowicki & Duke, 2001) or the Reading the Mind in the Eyes Test (Baron-Cohen et al., 2001). Participants were then asked to select from a list of emotion labels the one that best describes the depicted emotional expression. That is, accuracy is defined as the ability to associate one (correct) label with a single emotion expression shown without social context.

Notably, there is limited evidence for links between EDA assessed using these traditional tasks and the social relations or personality outcomes that have been postulated to rely on EDA. Toner and Gates (1985) initially noted the scant success in finding evidence for a link between EDA and personality. Elfenbein et al. (2002) and Matsumoto et al. (2000) also emphasized the scarcity of findings, which often also did not replicate across studies. Even more discouraging, given that EDA is a central aspect of emotional intelligence (EI), is the null or inconsistent evidence for a relationship between EDA and EI measured either as a trait (Matthews et al., 2015) or as an ability (Farrelly & Austin, 2007). Yet more concerning is the almost total lack of established EDA models that predict well-being. This is troubling because, as outlined in the introduction, EDA is considered a key part of social interaction, and the quality of social interaction is a key correlate of well-being (Kawachi & Berkman, 2001).

More recently, evidence for a link between emotion recognition ability and alexithymia (Ihme et al., 2014; Jongen et al., 2014), as well as narcissism (Martins et al., 2019), and attachment style (Chris Fraley et al., 2006) has emerged, especially when considering extreme groups, but often only for some emotions and not for others. Thus, overall, the evidence for a link between EDA on the one hand and personality or social outcomes on the other is considerably less strong than theory (e.g., on emotional or social intelligence) would predict.

The present paper addresses this paradox. In particular, we outline two problems with the way EDA is traditionally measured and conceived of. First, the underlying definition of what constitutes accuracy in decoding emotion is limited. Second, the lack of social context in emotion decoding transforms classic EDA tasks into cognitive rather than social perception tasks. We will then introduce a different approach, one that understands emotion perception as a contextualized social process in which accuracy can take different forms, and show evidence that EDA as measured by a social perception accuracy task such as the Assessment of Contextualized Emotion (ACE) can be meaningfully linked to both personality and social outcomes.

The Problem With Traditional EDA Models

A first question to address is the definition of accuracy. This question seems simple and straightforward at first glance, but in fact, scholars who have reflected on social perception accuracy have long recognized the difficulties surrounding an exact definition of accuracy (Funder, 1989; Kruglanski, 1989; Zaki & Ochsner, 2011a). The classic emotion recognition literature typically starts out with a judgment accuracy approach: A rating is considered accurate when the chosen label corresponds to the criterion label established by the researcher; otherwise, it is considered inaccurate. Yet, this definition of accuracy assumes that there is one and only one correct answer. That is, an emotion expression is presumed to reflect a single "pure" emotion, and decoders are accurate when they are able to label this one pure emotion.

We argue that the assumption that a single emotion label adequately describes an emotion expression is problematic. The general problem with any performance-based measure is the establishment of the correct answer or ground truth (Funder, 1995). That is, to decide whether a judgment is accurate, one needs to decide on a suitable criterion. For emotion expressions, there are several options. For example, the label can be derived from the expressive parameters for a given prototypical emotion described by Ekman and Friesen (1978). Alternatively, the label could be derived from the emotion the expresser felt during the expression (Levenson et al., 1991). The typical solution to the ground truth problem is to start with either one of these, that is, to either ask actors to pose a specific configuration or to induce a specific emotional state and select expressions according to these criteria. Since this approach usually leads to a range of expressive materials, a second validation phase is added where observers are asked to rate the expressions, and only those expressions for which a majority choose the same desired label are retained. That is, the recognition test will, in fact, be scored based on a consensus scoring procedure (see also Mayer et al., 2003).

This procedure raises two problems. First, it is not clear that the portrayed expressions, whether posed to be prototypical or captured while a person reports feeling a specific emotion, are in fact pure representations of a given emotional state. For instance, people who experience emotions often report blends (Watson & Stanton, 2017). Second, even if one assumes that the researchers who created the test succeeded in capturing pure emotions, there is good evidence that these would not be perceived as such. In fact, observers tend to see multiple emotions even when judging emotional expressions considered to be pure (Russell et al., 1993; Russell & Fehr, 1987; Yrizarry et al., 1998). This is especially the case in naturally occurring social interactions, where people are likely to show subtle expressions that are more open to different interpretations (Ekman, 2003; Motley & Camden, 1988). Thus, focusing on a single label does not provide a realistic measure of the actual perception.

These key limitations lead to a further problem associated with the response-rating method. In existing methods, participants can typically choose only one label out of several; hence, only one form of inaccuracy can be assessed: mistaking one emotion for another. Lyusin and Ovsyannikova (2016) criticize this approach and suggest the use of a multidimensional response format or scalar rating scales (see also Matsumoto, 2005), in which participants are asked to indicate all the emotions they can discern in an expression. This procedure captures the actual perception process much better and thus reveals a different form of inaccuracy: the common tendency, mentioned above, to perceive more than one emotion in a given expression – in other words, to see emotions as mixed rather than as pure. However, this form of inaccuracy – contrary to the mislabeling of emotions in a forced-choice task – does not necessarily result in a trade-off such that more accuracy automatically entails less inaccuracy.

In fact, the tendency to inaccurately perceive such "secondary" emotions is, arguably, theoretically independent of accuracy for the target emotion. That is, the fact that someone perceives some level of sadness in an expression that is primarily considered angry does not have to impinge on the perception of anger. Yet, the fact that sadness is also perceived in this example is highly relevant, as there are good reasons, outlined below, why this tendency should be linked to individual differences.

Two Ways to Decode Emotion Expressions

Specifically, there are two ways to decode emotion in facial emotion expressions. The first is pattern matching, where specific features of the expression are associated with specific emotions (Buck, 1984). For example, upturned corners of the mouth or lowered eyebrows are recognized as smiles or frowns, respectively, and a perceiver can thus conclude that the individual is happy or angry. This process can be conceived of as a cognitive task that does not rely on the perceiver's wider social knowledge. The perceiver only has to match a label to a perceived constellation of features. In fact, not even that is always required, as participants can often narrow down the candidate emotions based on a single cue. For example, many positive emotions include a smile element (Shiota et al., 2003). On seeing the teeth, an observer may conclude that the target showed a positive emotion and then select the only emotion in the list that qualifies (Bänziger et al., 2009). As such, the classic EDA tasks listed above may assess emotion discrimination rather than emotion recognition, especially when the list of labels is short (Bänziger et al., 2009; Nelson & Russell, 2016).

However, there is a second process, which is based specifically on the perceiver's social knowledge: perspective taking. Knowing about the event that elicited an emotion allows people to use their naïve emotion theories to predict the most likely emotion that would follow such an event. When the event is unknown, as in classic emotion recognition tasks, any social category that the perceiver is aware of, and for which expectations regarding emotional reactions exist, can influence emotion identification (Kirouac & Hess, 1999): The perceiver is more likely to attribute the expected emotion to an ambiguous expression. For example, knowing that a (male) expresser is black or of high status leads observers to more readily label that target's expression as angry (Hugenberg & Bodenhausen, 2003; Ratcliff et al., 2012). This second process engages participants' mental state attribution system (North et al., 2010; Zaki & Ochsner, 2011b). As this process is not solely based on the expression shown but also involves the implicit social information provided by the image, the result is more likely to vary between participants and to lead to the differential attribution of secondary emotions (emotions that are not directly communicated by the target), in line with the observer's social knowledge and personality.

Importantly, however, to properly tap the effect of social knowledge on EDA, it does not suffice to simply assess secondary emotions. As noted above, secondary emotions are more likely to be perceived when participants use perspective taking in their efforts to understand others. Yet, this process depends on the availability of a social context.

Although it is widely acknowledged that emotion perception in real life rarely operates devoid of context (Barrett & Kensinger, 2010; Hess & Hareli, 2016), emotion perception research has typically used context-free facial expressions as stimuli. Even more surprising is that emotion research has largely ignored the most common form of context we experience in everyday life – other people. As emotions usually occur in (real or imagined) interactions, the presence of other people is a feature that is common to many emotion eliciting contexts. Nonetheless, the presence of others has only been considered from a cultural perspective (e.g., Kafetsios & Hess, 2015; Hess, Blaison, & Kafetsios, 2016; Masuda et al., 2008), when, in fact, it is a pervasive element of everyday interaction. The facial expressions of bystanders to an event may influence how the event itself is perceived (Hess et al., 2018), and the facial response of recipients of an expression can influence the meaning attributed to the expression (Hareli & David, 2017).

We argue that presenting participants with emotion expressions shown by a group of individuals provides a highly relevant social framing for the task. Social framing fosters the use of perspective taking, which, in turn, infuses the perception process with "biases" that reflect the personality and values of the perceiver. In this sense, biases are not so much errors as expressions of the perceiver's social cognition and personality. We will revisit this point below.

In sum, we have argued that perceiving expressions in social context changes the EDA task from a cognitive puzzle into a social task. Notably, we do not claim that people never use the cognitive puzzle approach in real life – they certainly do, for example, when they point out expressive features in a picture, such as a pleasant smile or an ironic look. We maintain, however, that everyday emotion perception predominantly involves perspective taking. We further propose that classic EDA tasks fail to measure an important aspect of emotion perception, namely the attribution of secondary emotions. Yet, secondary emotions are an integral part of what people really perceive.

The ACE Model

Based on the considerations outlined above, we have proposed the ACE (Hess et al., 2016; Kafetsios & Hess, 2013, 2015) as a new model and method of EDA. The ACE unites these considerations by infusing context into the process of decoding emotion and, by doing so, allows for a more holistic conception of accuracy, one that simultaneously assesses accuracy and bias.

The ACE inserts context into EDA through the presentation of pictorial stimuli of naturalistic facial expressions of a group of three persons: The two surrounding persons express emotions that are either congruent or incongruent with the expression of the central person, which is the one to be decoded (see Figure 1 for an example). Specifically, while many possible contextual elements can be imagined,2 a typical feature of most of these is the presence of others, which can serve to prime social processing modes. Observers rate these expressions on an emotion profile by indicating the intensity of a series of emotions on dimensional scales. Hence, the method permits the distinction between the accurate evaluation of the presented focal emotions (accuracy) and the simultaneous evaluation of nonpresented, secondary, emotions (bias). Accuracy and bias are hypothesized, and found, to be largely independent EDA dimensions. Further information about the ACE can be found in Hess et al. (2016) and Kafetsios and Hess (in press), and the ACE stimuli are available on request from the authors.

Figure 1 Example stimulus from the ACE faces. ACE = Assessment of Contextualized Emotion.

Accuracy and Bias

As noted above, in classic EDA research, the decoder is either right or wrong. Hence, inaccuracy is simply the proportion of responses that are not accurate. Yet, dominant models of social perception (Funder, 1995; West & Kenny, 2011) strongly maintain that accuracy and inaccuracy/bias in social perception are theoretically distinct processes and that bias is distinct from error. In their seminal Truth and Bias model, West and Kenny (2011, p. 358) specifically argue that "certain psychological mechanisms lead perceivers to be both accurate and biased, other mechanisms lead to more accuracy and less bias." Much of the literature that inspired the Truth and Bias model, and to which it has since been applied, focuses on the social perception of personality traits (e.g., Overall et al., 2015). Yet, the model is very applicable to the perception of physical characteristics (cues), which also points to likely bridges between the physical and the social worlds in social perception (Zaki, 2013). We return to this point at the end of the paper.
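In schematic form (our paraphrase for illustration; see West & Kenny, 2011, for the exact specification), the model regresses each judgment on the truth criterion and a bias variable, with all terms centered on the mean of the truth:

\[ J_{ij} - M_T = b_{0j} + t_j\,(T_{ij} - M_T) + b_j\,(B_{ij} - M_T) + E_{ij} \]

where \(J_{ij}\) is perceiver \(j\)'s judgment of target \(i\), \(T_{ij}\) is the truth criterion, \(B_{ij}\) is the bias variable, and \(M_T\) is the mean of the truth. Because the truth force \(t_j\) and the bias force \(b_j\) are estimated as separate parameters, a perceiver can in principle be high on both, which is the sense in which accuracy and bias are nonexclusive dimensions.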

The ACE embraces this view by considering both accuracy and bias in the perception of facial expression cues. Accuracy is defined as the rating of the target emotion within the profile. Thus, for an expression that is considered an exemplar of sadness, the sadness rating on the emotion profile is the index of accuracy. By contrast, the mean of all other emotion ratings on the profile is considered bias. Notably, bias does not mean that the rating is absolutely wrong. It may well be that a given emotion expression in context conveys more than one message. However, we assume that facial expression cues – like any message – have a main theme that they are intended to convey, and this is what the accuracy rating captures.
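To make this scoring rule concrete, here is a minimal Python sketch of how accuracy and bias could be computed from a single emotion profile under this definition. The emotion list, the rating scale, and the function name are illustrative assumptions, not the published ACE scoring key.

```python
# Illustrative emotion list; the actual ACE profile items are not reproduced here.
EMOTIONS = ["anger", "sadness", "fear", "disgust", "happiness", "surprise"]

def score_profile(ratings: dict[str, float], target: str) -> tuple[float, float]:
    """Return (accuracy, bias) for one stimulus.

    accuracy: the rating given to the target (intended) emotion
    bias:     the mean rating given to all non-target emotions
    """
    accuracy = ratings[target]
    others = [ratings[e] for e in EMOTIONS if e != target]
    bias = sum(others) / len(others)
    return accuracy, bias

# Example: a stimulus whose target emotion is sadness, rated on 0-6 intensity scales
ratings = {"anger": 1, "sadness": 5, "fear": 2,
           "disgust": 0, "happiness": 0, "surprise": 1}
accuracy, bias = score_profile(ratings, target="sadness")
print(accuracy, round(bias, 2))  # -> 5 0.8
```

Note that the two scores come from different parts of the profile, so a rater can receive a high accuracy score and a high bias score at the same time, consistent with the claim that the two dimensions are largely independent.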

In sum, the ACE approach to EDA aims to situate the task as a social task and hence to elicit social processing of the material rather than its treatment as a cognitive puzzle. Doing so was predicted to yield a measure of EDA that preserves its theoretically posited relevance to EI and social functioning. We thus follow more recent thinking about the dynamic nature of accuracy by using a measure that allows the separate assessment of accuracy and bias. We predicted that EDA thus measured would reveal its relevance for social functioning and its groundedness in personality. In what follows, we summarize findings from our research that support this notion.

ACE and Social Functioning

Recent studies have demonstrated that ACE accuracy and bias have unique, measurable, and meaningful effects on social interaction. In three studies, two conducted in Greece and one in Germany (Hess et al., 2016), participants completed the ACE task in the laboratory (ACE cartoons in Study 1 and ACE faces in Studies 2 and 3) and then kept a self-report event sampling diary of all dyadic interactions lasting 10 minutes or more over a period of 10 days. Accuracy and bias in the ACE meaningfully predicted self-reported parameters of interaction quality, whereas a standard emotion perception task, the MSCEIT faces (Mayer, Salovey, and Caruso Emotional Intelligence Test; Mayer et al., 2003), did not. Specifically, ACE accuracy was associated with higher quality indicators in social interactions with in-group members in Greece, whereas bias in ACE cartoons and ACE faces was associated with lower social interaction quality primarily within and outside more intimate relationships. In Germany, higher ACE faces accuracy was associated with all social interaction quality indicators across levels of intimacy (Hess et al., 2016). Importantly, ACE accuracy and bias had unique effects on social interaction quality, suggesting that one can be simultaneously both accurate and inaccurate. This point is further elaborated in the context of the Truth and Bias model of social perception (West & Kenny, 2011).

Moreover, bias as measured by the ACE was associated with alexithymia, the difficulty in identifying and describing emotion, and the two were found to contribute to problems in dyadic interactions and relationships (Kafetsios & Hess, 2019). Participants completed the Toronto Alexithymia Scale (TAS) and the ACE in a laboratory session, followed by a 10-day event sampling study of the quality of their naturally occurring social interactions. The Difficulties in Identifying Feelings (DIF) subscale of the TAS was negatively related to all indices of quality of social interaction, and DIF showed a moderately strong positive correlation with bias in emotion perception as measured by the ACE. Importantly, ACE bias mediated the effects of DIF on social interaction outcomes. It seems that bias as measured in the ACE can tap the lack of attunement in dyadic social interactions observed in people with alexithymia (e.g., Foran & O'Leary, 2012).

The finding that a contextualized test of EDA is associated with social interaction parameters raises the very likely possibility that higher EDA can also contribute to overall well-being. In fact, there is initial correlational evidence for a positive association between well-being (measured with Diener et al.'s, 2010, flourishing scale) and ACE accuracy and a negative association with ACE bias (Kafetsios & Hess, in press). Following this up, we tested whether dyadic interaction quality during the week following the ACE assessment in the laboratory predicted well-being at the end of that week (Kafetsios et al., 2021). Results from multilevel structural equation model analyses suggested direct associations of ACE accuracy and bias with well-being indicators in the expected direction, as well as indirect associations via social interaction quality during the week preceding the well-being assessment. The implications of this finding are important and point to a new way of looking at the lack of attunement in social relations and its effects on rapport and responsiveness in social and personal relationships (Reis et al., 2017).

Elements of attunement were also tested in a study of emotion regulation in 220 Greek dating couples using a shorter version of the ACE faces (Papachiou et al., 2021). Both members of each couple completed the ACE faces short, a 16-item version of the ACE faces, along with measures of intrapersonal and interpersonal emotion regulation (Gross & John, 2003; Little et al., 2012), and participated in a 10-day event sampling study of dyadic emotion regulation. ACE accuracy was negatively, and ACE bias positively, associated with dysfunctional intrapersonal and interpersonal emotion regulation strategies. Importantly, actor–partner interdependence models applied to the event sampling part of the study revealed both actor and partner effects of ACE accuracy and bias on dyadic emotion regulation in theoretically meaningful ways.

Context and Personality in EDA

As noted above, classic EDA research strives to present expressions devoid of context. Yet, when considering real-life emotion communication, next to the expression itself, additional sources of information come into play: the contextual characteristics of the stimulus and the decoder's social schemas (e.g., Hess & Hareli, 2016). The latter are predicated on the decoder's previous social experiences, but also on their personality. For example, individuals with an insecure attachment style overattribute negative affect to people's faces; that is, they show a bias for the attribution of negative secondary emotions (Magai et al., 2000).

In fact, personality has to be understood as a system that mediates how the individual selects, construes, and processes social information and generates social behaviors (Mischel & Shoda, 1995). The Cognitive-Affective Processing System model posits the social situation as a critical component of personality, in that what happens in social situations very much reflects individual differences in cognitive-affective processing (Zayas et al., 2002). Personality characteristics are likely to emerge in particular situations, and it is information from both individual differences in personality and those situations that allows predictions regarding personality–behavior links (Zayas et al., 2002). Such an approach reflects the meaning of Lewin's (1951) B = f(P, E) formula (see Funder, 2009). As such, presenting information in a socially disengaged manner, as is typical of traditional EDA tasks, reduces its pertinence for personality-mediated processes.

A set of recent studies has highlighted this close link between personality and the contextually focused approach to EDA proposed by the ACE model (Kafetsios & Hess, in press). In seven studies conducted with different samples in Greece and in Germany, accuracy and inaccuracy in the ACE faces were consistently associated with personality characteristics that tap the social domain (attachment orientations, emotion regulation strategies, cultural self-construal, self-reported EI, loneliness, alexithymia, and well-being). In many of these associations, there were unique effects of accuracy and inaccuracy, further supporting the notion that the two EDA processes are largely independent of one another. Notably, in none of these studies did the MSCEIT faces (Mayer et al., 2003) predict the personality correlates.

Importantly, a central question tested in all studies was whether a traditional hit-rate approach – associating one (correct) label with a single emotion expression – can provide the same information as the accuracy and bias approach of the ACE. Across all studies, the assessment of accuracy and inaccuracy in a contextualized assessment of emotion was superior to simple hit rates in revealing associations with personality traits.
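For contrast with the profile scoring sketched earlier, here is a minimal sketch of the traditional hit-rate score. The labels and trial structure are our illustrative assumptions; the point is that each trial collapses to right/wrong, so any information about perceived secondary emotions is discarded.

```python
def hit_rate(chosen: list[str], targets: list[str]) -> float:
    """Traditional accuracy score: the proportion of trials in which the
    chosen label matches the criterion label set by the researcher."""
    hits = sum(c == t for c, t in zip(chosen, targets))
    return hits / len(targets)

# Three forced-choice trials; the third response is scored simply as wrong,
# with no record of what else the participant perceived in the expression.
print(hit_rate(["sadness", "anger", "fear"],
               ["sadness", "anger", "anger"]))  # -> 0.666...
```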

On the Path of Uniting Social Cognition and Accuracy Research

Theorizing and research in social perception have highlighted the important role that context plays in emotion perception. Some theorists have gone so far as to argue that emotion expressions are inherently ambiguous and can only be correctly perceived as part of a given context (Hassin et al., 2013), whereas others have emphasized bidirectional influences between context and expression information (Hess & Hareli, 2018). By placing facial expressions into a context, specifically, by using stimuli that involve three people who are interacting and who show congruent or incongruent facial expressions, the ACE places the EDA task in a social frame that invites the use of social schemas in perspective taking. This novel approach to EDA opens up a number of possibilities for further linking social cognition with accuracy processes, as already suggested a decade ago (Zaki & Ochsner, 2011b).

For example, the ACE approach can be used to assess the impact of social attribution processes on accuracy and bias. Appraisal theories of emotion (Ellsworth & Scherer, 2003) spell out the evaluation processes that underlie emotion elicitation. Conversely, observers can reconstruct these appraisals based on the expressions of the emoter (Hareli & Hess, 2010). In a situation where more than one person is shown, the expressions of witnesses to an emoter's emotional reactions to an event can influence the evaluation of both the event and the emoter (Hareli & David, 2017; Hess et al., 2018). These influences depend on the social knowledge and the engagement of the observer. As such, elements of the group context (i.e., the presence and specific emotion expressions of others and the congruence or incongruence between the emotions of the focal and the peripheral facial expressions) influence the perception of emotions in terms of both accuracy and bias, and this influence also meaningfully reflects individual differences. This could lead to advances in linking research on social cognition with accuracy research and ultimately with outcomes (Zaki & Ochsner, 2011a).

For example, given bidirectional relationships between context and social appraisal of emotion (Hess et al., 2019), one could vary the social situations and context cues that can give rise to different social appraisals to assess how levels of accuracy and bias and their combination are affected. Also, more elaborate analytic frames and models can be developed to allow a more exact mapping of how characteristics of the perceiver (e.g., social, individual, personality, socioeconomic) interact with contextual elements of the ACE to shape accuracy, bias, and their interrelationship. In both cases, application of social computational models (e.g., Ong et al., 2019) would greatly enhance the understanding of process–outcome relationships. A social computational approach could also shed light on the social perception processes responsible for the level of covariation between accuracy and bias, especially the conditions under which both target/social situation characteristics and perceiver characteristics (Hehman et al., 2017) contribute to the level of covariation between accuracy and bias. As discussed above, ACE accuracy and bias are theoretically independent, yet, depending on the specific experimental context, nontrivial covariations can be observed.

Conclusion

The present article has argued that infusing context into emotion perception, in the form of a group of other persons, not only adds a source of information but also changes the way that participants approach the task of emotion decoding by rendering it a social perception task rather than a cognitive task. The ACE approach to EDA presented in this paper constitutes a conceptual contribution to models of social and emotional perception. We provide evidence in line with conceptual arguments that accuracy and inaccuracy in emotion perception are theoretically distinct processes (Funder, 1995) and that accuracy and bias in social perception constitute two nonexclusive dimensions (West & Kenny, 2011; Zaki & Ochsner, 2011a). Such an approach also promotes a view of accuracy in terms of its utility for social emotion perception and its adaptive value (Kruglanski, 1989), as is evident in ACE accuracy and bias predicting several social functionality correlates.

Outstanding Questions

How do appraisals of the social situation influence ACE accuracy and bias and their covariation?

How can social perception processes inherent in ACE be computationally modeled to predict decoding accuracy and bias?

What are the behavioral proxies for accuracy and bias in dyadic interaction? (How) Do they mediate social functioning effects?

Can a contextualized view of emotion decoding revise our understanding of cultural differences in EDA?

References

• Bänziger, T., Grandjean, D., & Scherer, K. R. (2009). Emotion recognition from expressions in face, voice, and body: The Multimodal Emotion Recognition Test (MERT). Emotion, 9(5), 691–704. 10.1037/a0017088

• Baron-Cohen, S., Wheelwright, S., Hill, J., Raste, Y., & Plumb, I. (2001). The "reading the mind in the eyes" test revised version: A study with normal adults, and adults with Asperger syndrome or high-functioning autism. Journal of Child Psychology and Psychiatry, 42(2), 241–251. 10.1017/S0021963001006643

• Barrett, L. F., & Kensinger, E. A. (2010). Context is routinely encoded during emotion perception. Psychological Science, 21(4), 595–599. 10.1177/0956797610363547

• Buck, R. (1984). Nonverbal receiving ability. In R. Buck (Ed.), The communication of emotion (pp. 209–242). Guilford Press.

• Carroll, J. M., & Russell, J. A. (1996). Do facial expressions signal specific emotions? Judging emotion from the face in context. Journal of Personality and Social Psychology, 70(2), 205–218. 10.1037/0022-3514.70.2.205

• Chris Fraley, R., Niedenthal, P. M., Marks, M., Brumbaugh, C., & Vicary, A. (2006). Adult attachment and the perception of emotional expressions: Probing the hyperactivating strategies underlying anxious attachment. Journal of Personality, 74(4), 1163–1190. 10.1111/j.1467-6494.2006.00406.x

• Diener, E., Wirtz, D., Tov, W., Kim-Prieto, C., Choi, D.-w., Oishi, S., & Biswas-Diener, R. (2010). New well-being measures: Short scales to assess flourishing and positive and negative feelings. Social Indicators Research, 97(2), 143–156. 10.1007/s11205-009-9493-y

• Ekman, P. (2003). Emotions revealed: Recognizing faces and feelings to improve communication and emotional life. Times Books.

• Ekman, P., & Friesen, W. V. (1976). Pictures of facial affect. Consulting Psychologists Press.

• Ekman, P., & Friesen, W. V. (1978). The facial action coding system: A technique for the measurement of facial movement. Consulting Psychologists Press.

• Elfenbein, H. A., & Ambady, N. (2002). On the universality and cultural specificity of emotion recognition: A meta-analysis. Psychological Bulletin, 128(2), 203–235. 10.1037/0033-2909.128.2.203

• Elfenbein, H. A., Marsh, A., & Ambady, N. (2002). Emotional intelligence and the recognition of emotion from facial expressions. In L. F. Barrett & P. Salovey (Eds.), The wisdom in feeling: Psychological processes in emotional intelligence (pp. 37–59). Guilford Press.

• Ellsworth, P. C., & Scherer, K. R. (2003). Appraisal processes in emotion. In R. J. Davidson, H. Goldsmith, & K. R. Scherer (Eds.), Handbook of the affective sciences (pp. 572–595). Oxford University Press.

• Farrelly, D., & Austin, E. J. (2007). Ability EI as an intelligence? Associations of the MSCEIT with performance on emotion processing and social tasks and with cognitive ability. Cognition and Emotion, 21(5), 1043–1063. 10.1080/02699930601069404

• Feldman, R. S., Philippot, P., & Custrini, R. J. (1991). Social competence and nonverbal behavior. In R. S. Feldman & B. Rime (Eds.), Fundamentals of nonverbal behavior (pp. 319–350). Cambridge University Press.

• Foran, H. M., & O'Leary, K. D. (2012). The role of relationships in understanding the alexithymia–depression link. European Journal of Personality, 27(5), 470–480. 10.1002/per.1887

• Funder, D. C. (1989). Accuracy in personality judgment and the dancing bear. In D. M. Buss & N. Cantor (Eds.), Personality psychology: Recent trends and emerging directions (pp. 210–223). Springer-Verlag. 10.1007/978-1-4684-0634-4_16

• Funder, D. C. (1995). On the accuracy of personality judgment: A realistic approach. Psychological Review, 102(4), 652–670. 10.1037/0033-295X.102.4.652

• Funder, D. C. (2009). Persons, behaviors and situations: An agenda for personality psychology in the postwar era. Journal of Research in Personality, 43(2), 120–126. 10.1016/j.jrp.2008.12.041

• Gendron, M., Roberson, D., van der Vyver, J. M., & Barrett, L. F. (2014). Perceptions of emotion from facial expressions are not culturally universal: Evidence from a remote culture. Emotion, 14(2), 251–262. 10.1037/a0036052

• Gray, K. L. H., Barber, L., Murphy, J., & Cook, R. (2017). Social interaction contexts bias the perceived expressions of interactants. Emotion, 17(4), 567–571. 10.1037/emo0000257

• Gross, J. J., & John, O. P. (2003). Individual differences in two emotion regulation processes: Implications for affect, relationships, and well-being. Journal of Personality and Social Psychology, 85(2), 348–362. 10.1037/0022-3514.85.2.348

• Hall, J. A., Andrzejewski, S. A., & Yopchick, J. E. (2009). Psychosocial correlates of interpersonal sensitivity: A meta-analysis. Journal of Nonverbal Behavior, 33(3), 149–180. 10.1007/s10919-009-0070-5

• Hall, J. A., Carter, J. D., & Horgan, T. G. (2000). Gender differences in nonverbal communication of emotion. In A. H. Fischer (Ed.), Gender and emotion: Social psychological perspectives (pp. 97–117). Cambridge University Press. 10.1017/cbo9780511628191.006

• Hareli, S., & David, S. (2017). The effect of reactive emotions expressed in response to another's anger on inferences of social power. Emotion, 17(4), 717–727. 10.1037/emo0000262

• Hareli, S., & Hess, U. (2010). What emotional reactions can tell us about the nature of others: An appraisal perspective on person perception. Cognition and Emotion, 24(1), 128–140. 10.1080/02699930802613828

• Hassin, R. R., Aviezer, H., & Bentin, S. (2013). Inherently ambiguous: Facial expressions of emotions, in context. Emotion Review, 5(1), 60–65. 10.1177/1754073912451331

• Hehman, E., Sutherland, C. A. M., Flake, J. K., & Slepian, M. L. (2017). The unique contributions of perceiver and target characteristics in person perception. Journal of Personality and Social Psychology, 113(4), 513–529. 10.1037/pspa0000090

• Hertenstein, M. J., Keltner, D., App, B., Bulleit, B. A., & Jaskolka, A. R. (2006). Touch communicates distinct emotions. Emotion, 6(3), 528–533. 10.1037/1528-3542.6.3.528

• Hess, U. (2017). Emotion categorization. In C. Lefebvre & H. Cohen (Eds.), Handbook of categorization in cognitive science (2nd ed., pp. 107–126). Elsevier. 10.1016/b978-0-08-101107-2.00005-1

• Hess, U., Blaison, C., & Kafetsios, K. (2016). Judging facial emotion expressions in context: The influence of culture and self-construal orientation. Journal of Nonverbal Behavior, 40(1), 55–64. 10.1007/s10919-015-0223-7

• Hess, U., Dietrich, J., Kafetsios, K., Elkabetz, S., & Hareli, S. (2019). The bidirectional influence of emotion expressions and context: Emotion expressions, situational information and real-world knowledge combine to inform observers' judgments of both the emotion expressions and the situation. Cognition and Emotion, 34(3), 539–552. 10.1080/02699931.2019.1651252

• Hess, U., & Hareli, S. (2016). The impact of context on the perception of emotions. In C. Abell & J. Smith (Eds.), The expression of emotion: Philosophical, psychological, and legal perspectives (pp. 199–218). Cambridge University Press.

• Hess, U., & Hareli, S. (2018). On the malleability of the meaning of contexts: The influence of another person's emotion expressions on situation perception. Cognition and Emotion, 32(1), 185–191. 10.1080/02699931.2016.1269725

• Hess, U., Kafetsios, K., Mauersberger, H., Blaison, C., & Kessler, C.-L. (2016). Signal and noise in the perception of facial emotion expressions: From labs to life. Personality and Social Psychology Bulletin, 42(8), 1092–1110. 10.1177/0146167216651851

• Hess, U., Landmann, H., David, S., & Hareli, S. (2018). The bidirectional relation of emotion perception and social judgments: The effect of witness' emotion expression on perceptions of moral behaviour and vice versa. Cognition and Emotion, 32(6), 1152–1165. 10.1080/02699931.2017.1388769

• Hugenberg, K., & Bodenhausen, G. V. (2003). Facing prejudice: Implicit prejudice and the perception of facial threat. Psychological Science, 14(6), 640–643. 10.1046/j.0956-7976.2003.psci_1478.x

• Ickes, W. (1997). Empathic accuracy. Guilford Press.

• Ihme, K., Sacher, J., Lichev, V., Rosenberg, N., Kugel, H., Rufer, M., Grabe, H.-J., Pampel, A., Lepsien, J., Kersting, A., Villringer, A., Lane, R. D., & Suslow, T. (2014). Alexithymic features and the labeling of brief emotional facial expressions – An fMRI study. Neuropsychologia, 64, 289–299. 10.1016/j.neuropsychologia.2014.09.044

• Jongen, S., Axmacher, N., Kremers, N. A. W., Hoffmann, H., Limbrecht-Ecklundt, K., Traue, H. C., & Kessler, H. (2014). An investigation of facial emotion recognition impairments in alexithymia and its neural correlates. Behavioural Brain Research, 271, 129–139. 10.1016/j.bbr.2014.05.069

• Kafetsios, K., Dostal, D., Seitl, M., & Hess, U. (2021). A contextualized emotion perception test relates to well-being: Social interaction as a mediator. Unpublished manuscript.

• Kafetsios, K., & Hess, U. (2013). Effects of activated and dispositional self-construal on emotion decoding accuracy. Journal of Nonverbal Behavior, 37(3), 191–205. 10.1007/s10919-013-0149-x

• Kafetsios, K., & Hess, U. (2015). Are you looking at me? The influence of facial orientation and cultural focus salience on the perception of emotion expressions. Cogent Psychology, 2(1), 1005493. 10.1080/23311908.2015.1005493

• Kafetsios, K., & Hess, U. (2019). Seeing mixed emotions: Alexithymia, emotion perception bias, and quality in dyadic interactions. Personality and Individual Differences, 137, 80–85. 10.1016/j.paid.2018.08.014

• Kafetsios, K., & Hess, U. (in press). Personality and the accurate perception of facial emotion expressions: What is accuracy and how does it matter? Emotion. 10.1037/emo0001034

• Kawachi, I., & Berkman, L. F. (2001). Social ties and mental health. Journal of Urban Health: Bulletin of the New York Academy of Medicine, 78(3), 458–467. 10.1093/jurban/78.3.458

• Kirouac, G., & Hess, U. (1999). Group membership and the decoding of nonverbal behavior. In P. Philippot, R. Feldman, & E. Coats (Eds.), The social context of nonverbal behavior (pp. 182–210). Cambridge University Press.

• Kruglanski, A. W. (1989). The psychology of being "right": The problem of accuracy in social perception and cognition. Psychological Bulletin, 106(3), 395–409. 10.1037/0033-2909.106.3.395

• Levenson, R. W., Carstensen, L. L., Friesen, W. V., & Ekman, P. (1991). Emotion, physiology, and expression in old age. Psychology and Aging, 6(1), 28–35. 10.1037/0882-7974.6.1.28

• Lewin, K. (1951). Field theory in social science. Harper.

• Little, L. M., Kluemper, D., Nelson, D. L., & Gooty, J. (2012). Development and validation of the interpersonal emotion management scale. Journal of Occupational and Organizational Psychology, 85(2), 407–420. 10.1111/j.2044-8325.2011.02042.x

• Lyusin, D., & Ovsyannikova, V. (2016). Measuring two aspects of emotion recognition ability: Accuracy vs. sensitivity. Learning and Individual Differences, 52, 129–136. 10.1016/j.lindif.2015.04.010

• Magai, C., Hunziker, J., Mesias, W., & Culver, L. C. (2000). Adult attachment styles and emotional biases. International Journal of Behavioral Development, 24(3), 301–309. 10.1080/01650250050118286

• Manstead, A. S. R., Fischer, A. H., & Jakobs, E. (1999). The social and emotional functions of facial displays. In P. Philippot, R. S. Feldman, & E. J. Coats (Eds.), The social context of nonverbal behavior. Studies in emotion and social interaction (pp. 287–313). Cambridge University Press.

• Martins, A. T., Ros, A., Valério, L., & Faísca, L. (2019). Basic emotion recognition according to clinical personality traits. Current Psychology, 38(3), 879–889. 10.1007/s12144-017-9661-1

• Masuda, T., Ellsworth, P. C., Mesquita, B., Leu, J., Tanida, S., & Van de Veerdonk, E. (2008). Placing the face in context: Cultural differences in the perception of facial emotion. Journal of Personality and Social Psychology, 94(3), 365–381. 10.1037/0022-3514.94.3.365

• Matsumoto, D. (2005). Scalar ratings of contempt expressions. Journal of Nonverbal Behavior, 29, 91–104. 10.1007/s10919-005-2742-0

• Matsumoto, D., LeRoux, J., Wilson-Cohn, C., Raroque, J., Kooken, K., Ekman, P., Yrizarry, N., Loewinger, S., Uchida, H., Yee, A., Amo, L., & Goh, A. (2000). A new test to measure emotion recognition ability: Matsumoto and Ekman's Japanese and Caucasian Brief Affect Recognition Test (JACBART). Journal of Nonverbal Behavior, 24, 179–209. 10.1023/A:1006668120583

• Matthews, G., Pérez-González, J.-C., Fellner, A. N., Funke, G. J., Emo, A. K., Zeidner, M., & Roberts, R. D. (2015). Individual differences in facial emotion processing: Trait emotional intelligence, cognitive ability, or transient stress? Journal of Psychoeducational Assessment, 33(1), 68–82. 10.1177/0734282914550386

• Mayer, J. D., Salovey, P., Caruso, D. R., & Sitarenios, G. (2003). Measuring emotional intelligence with the MSCEIT V2.0. Emotion, 3(1), 97–105. 10.1037/1528-3542.3.1.97

• Mischel, W., & Shoda, Y. (1995). A cognitive-affective system theory of personality: Reconceptualizing situations, dispositions, dynamics, and invariance in personality structure. Psychological Review, 102(2), 246–268. 10.1037/0033-295x.102.2.246

• Motley, M. T., & Camden, C. T. (1988). Facial expression of emotion: A comparison of posed expressions versus spontaneous expressions in an interpersonal communication setting. Western Journal of Speech Communication, 52(1), 1–22. 10.1080/10570318809389622

• Nelson, N. L., & Russell, J. A. (2016). A facial expression of pax: Assessing children's "recognition" of emotion from faces. Journal of Experimental Child Psychology, 141, 49–64. 10.1016/j.jecp.2015.07.016

• Niedenthal, P. M., & Brauer, M. (2012). Social functionality of human emotion. Annual Review of Psychology, 63(1), 259–285. 10.1146/annurev.psych.121208.131605

• North, M. S., Todorov, A., & Osherson, D. N. (2010). Inferring the preferences of others from spontaneous, low-emotional facial expressions. Journal of Experimental Social Psychology, 46(6), 1109–1113. 10.1016/j.jesp.2010.05.021

• Nowicki, S., Jr., & Duke, M. P. (2001). Nonverbal receptivity: The Diagnostic Analysis of Nonverbal Accuracy (DANVA). In J. A. Hall & F. J. Bernieri (Eds.), Interpersonal sensitivity: Theory and measurement (pp. 183–198). Erlbaum.

• Ong, D. C., Zaki, J., & Goodman, N. D. (2019). Computational models of emotion inference in theory of mind: A review and roadmap. Topics in Cognitive Science, 11(2), 338–357. 10.1111/tops.12371

• Overall, N. C., Fletcher, G. J. O., Simpson, J. A., & Fillo, J. (2015). Attachment insecurity, biased perceptions of romantic partners' negative emotions, and hostile relationship behavior. Journal of Personality and Social Psychology, 108(5), 730–749. 10.1037/a0038987

• Papachiou, A., Hess, U., & Kafetsios, K. (2021). Emotion decoding accuracy and emotion regulation in couples. Unpublished manuscript.

• Penton-Voak, I. S., Munafò, M. R., & Looi, C. Y. (2017). Biased facial-emotion perception in mental health disorders: A possible target for psychological intervention? Current Directions in Psychological Science, 26(3), 294–301. 10.1177/0963721417704405

• Ratcliff, N. J., Franklin, R. G., Nelson, A. J., Jr., & Vescio, T. K. (2012). The scorn of status: A bias toward perceiving anger on high-status faces. Social Cognition, 30(5), 631–642. 10.1521/soco.2012.30.5.631

• Reis, H. T., Lemay, E. P., Jr., & Finkenauer, C. (2017). Toward understanding understanding: The importance of feeling understood in relationships. Social and Personality Psychology Compass, 11(3), Article e12308. 10.1111/spc3.12308

• Righart, R., & de Gelder, B. (2008). Rapid influence of emotional scenes on encoding of facial expressions: An ERP study. Social Cognitive and Affective Neuroscience, 3(3), 270–278. 10.1093/scan/nsn021

• Russell, J. A., & Fehr, B. (1987). Relativity in the perception of emotion in facial expressions. Journal of Experimental Psychology: General, 116(3), 223–237. 10.1037/0096-3445.116.3.223

• Russell, J. A., Suzuki, N., & Ishida, N. (1993). Canadian, Greek, and Japanese freely produced emotion labels for facial expressions. Motivation and Emotion, 17(4), 337–351. 10.1007/BF00992324

• Salovey, P., & Mayer, J. D. (1989–1990). Emotional intelligence. Imagination, Cognition and Personality, 9(3), 185–211. 10.2190/DUGG-P24E-52WK-6CDG

• Shiota, M. N., Campos, B., & Keltner, D. (2003). The faces of positive emotion: Prototype displays of awe, amusement, and pride. Annals of the New York Academy of Sciences, 1000(1), 296–299. 10.1196/annals.1280.029

• Toner, H. L., & Gates, G. R. (1985). Emotional traits and recognition of facial expression of emotion. Journal of Nonverbal Behavior, 9(1), 48–66. 10.1007/BF00987558

• Watson, D., & Stanton, K. (2017). Emotion blends and mixed emotions in the hierarchical structure of affect. Emotion Review, 9(2), 99–104. 10.1177/1754073916639659

• West, T. V., & Kenny, D. A. (2011). The truth and bias model of judgment. Psychological Review, 118(2), 357–378. 10.1037/a0022936

• Yrizarry, N., Matsumoto, D., & Wilson-Cohn, C. (1998). American-Japanese differences in multiscalar intensity ratings of universal facial expressions of emotion. Motivation and Emotion, 22(4), 315–327. 10.1023/A:1021304407227

• Zaki, J. (2013). Cue integration: A common framework for social cognition and physical perception. Perspectives on Psychological Science, 8(3), 296–312. 10.1177/1745691613475454

• Zaki, J., & Ochsner, K. (2011a). Reintegrating the study of accuracy into social cognition research. Psychological Inquiry, 22(3), 159–182. 10.1080/1047840X.2011.551743

• Zaki, J., & Ochsner, K. (2011b). You, me, and my brain: Self and other representations in social cognitive neuroscience. In A. Todorov, S. T. Fiske, & D. A. Prentice (Eds.), Social neuroscience: Toward understanding the underpinnings of the social mind (pp. 14–39). Oxford University Press. 10.1093/acprof:oso/9780195316872.003.0002

• Zayas, V., Shoda, Y., & Ayduk, O. N. (2002). Personality in context: An interpersonal systems perspective. Journal of Personality, 70(6), 851–900. 10.1111/1467-6494.05026

1For EDA, we consider conceptualizations and operationalizations that specifically assess the decoding accuracy of emotion in facial emotion expressions and exclude related approaches that do not (e.g., the Ickes empathic accuracy paradigm [Ickes, 1997] or other measures of interpersonal sensitivity, which are typically self-report methods; see Hall et al., 2009, for a comprehensive review).

2Recently, scholars have come to appreciate the significance of social context in emotion perception (Gendron et al., 2014). The perception of emotion expressions is influenced by concurrent social interactants (Gray et al., 2017), other interacting (Hess & Hareli, 2018) and noninteracting facial expressions (Masuda et al., 2008), situational stories (Carroll & Russell, 1996), visual scenes (Righart & de Gelder, 2008), and body postures (Hassin et al., 2013).