Enhanced EEG gamma-band activity reflects multisensory semantic matching in visual-to-auditory object priming.

Standard

Enhanced EEG gamma-band activity reflects multisensory semantic matching in visual-to-auditory object priming. / Schneider, Till; Debener, Stefan; Oostenveld, Robert; Engel, Andreas K.

In: NEUROIMAGE, Vol. 42, No. 3, 2008, p. 1244-1254.

Research output: Contribution to journal › Journal article › Research › peer-review

Bibtex

@article{bb4673bd533b48bdbe58068a9a2844b4,
title = "Enhanced EEG gamma-band activity reflects multisensory semantic matching in visual-to-auditory object priming.",
abstract = "An important step in perceptual processing is the integration of information from different sensory modalities into a coherent percept. It has been suggested that such crossmodal binding might be achieved by transient synchronization of neurons from different modalities in the gamma-frequency range (>30 Hz). Here we employed a crossmodal priming paradigm, modulating the semantic congruency between visual-auditory natural object stimulus pairs, during the recording of the high density electroencephalogram (EEG). Subjects performed a semantic categorization task. Analysis of the behavioral data showed a crossmodal priming effect (facilitated auditory object recognition) in response to semantically congruent stimuli. Differences in event-related potentials (ERP) were found between 250 and 350 ms, which were localized to left middle temporal gyrus (BA 21) using a distributed linear source model. Early gamma-band activity (40-50 Hz) was increased between 120 ms and 180 ms following auditory stimulus onset for semantically congruent stimulus pairs. Source reconstruction for this gamma-band response revealed a maximal increase in left middle temporal gyrus (BA 21), an area known to be related to the processing of both complex auditory stimuli and multisensory processing. The data support the hypothesis that oscillatory activity in the gamma-band reflects crossmodal semantic-matching processes in multisensory convergence sites.",
author = "Schneider, Till and Debener, Stefan and Oostenveld, Robert and Engel, {Andreas K.}",
year = "2008",
language = "English",
volume = "42",
pages = "1244--1254",
journal = "NEUROIMAGE",
issn = "1053-8119",
publisher = "Academic Press",
number = "3",

}

RIS

TY - JOUR

T1 - Enhanced EEG gamma-band activity reflects multisensory semantic matching in visual-to-auditory object priming.

AU - Schneider, Till

AU - Debener, Stefan

AU - Oostenveld, Robert

AU - Engel, Andreas K.

PY - 2008

Y1 - 2008

N2 - An important step in perceptual processing is the integration of information from different sensory modalities into a coherent percept. It has been suggested that such crossmodal binding might be achieved by transient synchronization of neurons from different modalities in the gamma-frequency range (>30 Hz). Here we employed a crossmodal priming paradigm, modulating the semantic congruency between visual-auditory natural object stimulus pairs, during the recording of the high density electroencephalogram (EEG). Subjects performed a semantic categorization task. Analysis of the behavioral data showed a crossmodal priming effect (facilitated auditory object recognition) in response to semantically congruent stimuli. Differences in event-related potentials (ERP) were found between 250 and 350 ms, which were localized to left middle temporal gyrus (BA 21) using a distributed linear source model. Early gamma-band activity (40-50 Hz) was increased between 120 ms and 180 ms following auditory stimulus onset for semantically congruent stimulus pairs. Source reconstruction for this gamma-band response revealed a maximal increase in left middle temporal gyrus (BA 21), an area known to be related to the processing of both complex auditory stimuli and multisensory processing. The data support the hypothesis that oscillatory activity in the gamma-band reflects crossmodal semantic-matching processes in multisensory convergence sites.

AB - An important step in perceptual processing is the integration of information from different sensory modalities into a coherent percept. It has been suggested that such crossmodal binding might be achieved by transient synchronization of neurons from different modalities in the gamma-frequency range (>30 Hz). Here we employed a crossmodal priming paradigm, modulating the semantic congruency between visual-auditory natural object stimulus pairs, during the recording of the high density electroencephalogram (EEG). Subjects performed a semantic categorization task. Analysis of the behavioral data showed a crossmodal priming effect (facilitated auditory object recognition) in response to semantically congruent stimuli. Differences in event-related potentials (ERP) were found between 250 and 350 ms, which were localized to left middle temporal gyrus (BA 21) using a distributed linear source model. Early gamma-band activity (40-50 Hz) was increased between 120 ms and 180 ms following auditory stimulus onset for semantically congruent stimulus pairs. Source reconstruction for this gamma-band response revealed a maximal increase in left middle temporal gyrus (BA 21), an area known to be related to the processing of both complex auditory stimuli and multisensory processing. The data support the hypothesis that oscillatory activity in the gamma-band reflects crossmodal semantic-matching processes in multisensory convergence sites.

M3 - Journal article

VL - 42

SP - 1244

EP - 1254

JO - NEUROIMAGE

JF - NEUROIMAGE

SN - 1053-8119

IS - 3

M1 - 3

ER -
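
The abstract reports increased gamma-band (40-50 Hz) power 120-180 ms after auditory stimulus onset for semantically congruent pairs. As a rough illustration only (not the authors' pipeline, which involved high-density EEG, time-frequency analysis, and distributed source reconstruction), the sketch below shows one common way to estimate band-limited power in such a window from single-trial epochs, using a Butterworth bandpass filter and the Hilbert envelope. The sampling rate, epoch layout, and synthetic data are all illustrative assumptions.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

# Illustrative assumptions: 500 Hz sampling, 1 s epochs, stimulus onset at t = 0 s
fs = 500.0
n_trials, n_samples = 100, 500
rng = np.random.default_rng(0)
epochs = rng.standard_normal((n_trials, n_samples))   # placeholder single-channel EEG

# 4th-order Butterworth bandpass for the 40-50 Hz gamma band
b, a = butter(4, [40.0, 50.0], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, epochs, axis=-1)

# Instantaneous power via the analytic signal (Hilbert transform)
power = np.abs(hilbert(filtered, axis=-1)) ** 2

# Average power in the 120-180 ms post-stimulus window mentioned in the abstract
times = np.arange(n_samples) / fs
window = (times >= 0.120) & (times <= 0.180)
gamma_power = power[:, window].mean(axis=-1)           # one value per trial

print(gamma_power.mean())                              # mean gamma-band power across trials

In a congruency comparison like the one described, such per-trial values would typically be averaged separately for congruent and incongruent conditions and then contrasted statistically; source-level analysis, as in the paper, requires a full multichannel pipeline beyond this sketch.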