Abstract
Summary: Electrical stimulation of the auditory nerve via electrodes implanted in the cochlea (cochlear implant: CI) can restore the ability to perceive acoustic speech sounds in deafened patients. Because of the reduced acoustic quality of the signals, CI users draw on additional visual information. Acoustic speech stimuli (disyllabic nouns) were presented simultaneously with a video segment of the speaker's face, which articulated either information congruent with the acoustic word (e.g., audio: Hotel, video: Hotel) or incongruent information (e.g., audio: Hotel, video: Insel). Analysis of the behavioral data showed that CI patients benefit substantially from the additional presentation of the speaker's face when understanding speech sounds. Normal-hearing listeners also use visual information, especially when the acoustic signals are noisy and difficult to understand. Audiovisual speech processing elicits different amplitude time courses in the event-related potential in CI users and normal-hearing listeners. Differences emerge primarily over occipital regions, which can be interpreted as reorganisation following auditory deprivation in CI patients.
Abstract: Through electrical stimulation by means of electrodes implanted within the cochlea (cochlear implant: CI), deafened patients can regain the ability to perceive speech. Because the auditory information available to CI patients is impoverished, they rely on additional visual information from the lip movements of the talker's face. In this study we presented words acoustically together with a simultaneous video of the talker's face. The congruency of the auditory and visual information was manipulated (e.g., congruent pair: audio: Hotel, video: Hotel; incongruent pair: audio: Hotel, video: Island). CI patients benefited from additional matching visual information, as indicated by an increased comprehension rate in congruent trials. Normal-hearing control subjects also used visual information when noise was added to the audio signals. Event-related brain potentials revealed the neural correlates of crossmodal integration of auditory and visual information. Moreover, the different scalp distribution of event-related brain potentials in patients and controls suggests cortical reorganisation after auditory deprivation in CI users.