Abstract
The multimodal nature of human communication is well established, yet few empirical studies have systematically examined the widely held belief that multimodal perception is facilitated in comparison to unimodal or bimodal perception. In the current experiment we first explored the processing of unimodally presented facial expressions. Auditory (prosodic and/or lexical-semantic) information was then presented together with the visual information to investigate the processing of bimodal (facial and prosodic cues) and multimodal (facial, lexical, and prosodic cues) human communication. Participants engaged in an identity identification task while event-related potentials (ERPs) were recorded to examine early processing mechanisms as reflected in the P200 and N300 components. The former component has repeatedly been linked to the processing of physical stimulus properties, whereas the latter has been linked to more evaluative, “meaning-related” processing. P200 and N300 amplitudes varied systematically with the number of information channels present: the multimodal condition elicited the smallest amplitudes in both components, the bimodal condition elicited larger amplitudes, and the unimodal condition elicited the largest. These data suggest that multimodal information induces clear facilitation in comparison to unimodal or bimodal information. The advantage of multimodal perception, as reflected in the P200 and N300 components, may thus reflect one of the mechanisms allowing for fast and accurate information processing in human communication.
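The component-amplitude measure underlying these results — the mean voltage of the trial-averaged ERP within a component-specific post-stimulus window — can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the sampling rate, baseline length, window boundaries (150–250 ms for the P200), and the synthetic epochs are all assumptions made for the example.

```python
import numpy as np

FS = 500   # sampling rate in Hz (assumed for illustration)
T0 = 100   # pre-stimulus baseline samples (200 ms, assumed)

def mean_amplitude(epochs, t_start_ms, t_end_ms):
    """Mean voltage in a time window of the trial-averaged ERP.

    epochs: (n_trials, n_samples) baseline-corrected array with
    stimulus onset at sample T0.
    """
    erp = epochs.mean(axis=0)                  # average across trials
    i0 = T0 + int(t_start_ms * FS / 1000)      # window start in samples
    i1 = T0 + int(t_end_ms * FS / 1000)        # window end in samples
    return erp[i0:i1].mean()

rng = np.random.default_rng(0)
n_trials, n_samples = 40, 400
t = np.arange(n_samples - T0) / FS             # post-stimulus time (s)

def simulate(p200_gain):
    """Synthetic epochs: a P200-like positivity (peak ~200 ms) plus noise."""
    component = p200_gain * np.exp(-((t - 0.2) ** 2) / (2 * 0.03 ** 2))
    signal = np.concatenate([np.zeros(T0), component])
    return signal + rng.normal(0, 0.5, size=(n_trials, n_samples))

# Fewer channels -> larger simulated P200, mirroring the reported ordering
# (gains are arbitrary illustration values, not measured effect sizes).
conditions = {"unimodal": 3.0, "bimodal": 2.0, "multimodal": 1.0}
p200 = {name: mean_amplitude(simulate(gain), 150, 250)
        for name, gain in conditions.items()}

assert p200["multimodal"] < p200["bimodal"] < p200["unimodal"]
```

The same function applied with a later window (e.g., 250–350 ms) would give an N300-style measure; only the window limits change, which is why mean-amplitude measures generalize easily across components.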