Auditory Stimulus Detection Partially Depends on Visuospatial Attentional Resources

Standard

Auditory Stimulus Detection Partially Depends on Visuospatial Attentional Resources. / Wahn, Basil; Murali, Supriya; Sinnett, Scott; König, Peter.

In: I-PERCEPTION, Vol. 8, No. 1, 17.02.2017, p. 2041669516688026.

Publications: SCORING: Contribution to journal/newspaper › SCORING: Journal article › Research › Peer review

Harvard

Wahn, B, Murali, S, Sinnett, S & König, P 2017, 'Auditory Stimulus Detection Partially Depends on Visuospatial Attentional Resources', I-PERCEPTION, vol. 8, no. 1, p. 2041669516688026. https://doi.org/10.1177/2041669516688026

APA

Wahn, B., Murali, S., Sinnett, S., & König, P. (2017). Auditory Stimulus Detection Partially Depends on Visuospatial Attentional Resources. I-PERCEPTION, 8(1), 2041669516688026. https://doi.org/10.1177/2041669516688026

Vancouver

Wahn B, Murali S, Sinnett S, König P. Auditory Stimulus Detection Partially Depends on Visuospatial Attentional Resources. I-PERCEPTION. 2017 Feb 17;8(1):2041669516688026. https://doi.org/10.1177/2041669516688026

Bibtex

@article{a54f2e94fbe14e039f0b11ae9c6480f1,
title = "Auditory Stimulus Detection Partially Depends on Visuospatial Attentional Resources",
abstract = "Humans' ability to detect relevant sensory information while being engaged in a demanding task is crucial in daily life. Yet, limited attentional resources restrict information processing. To date, it is still debated whether there are distinct pools of attentional resources for each sensory modality and to what extent the process of multisensory integration is dependent on attentional resources. We addressed these two questions using a dual task paradigm. Specifically, participants performed a multiple object tracking task and a detection task either separately or simultaneously. In the detection task, participants were required to detect visual, auditory, or audiovisual stimuli at varying stimulus intensities that were adjusted using a staircase procedure. We found that tasks significantly interfered. However, the interference was about 50% lower when tasks were performed in separate sensory modalities than in the same sensory modality, suggesting that attentional resources are partly shared. Moreover, we found that perceptual sensitivities were significantly improved for audiovisual stimuli relative to unisensory stimuli regardless of whether attentional resources were diverted to the multiple object tracking task or not. Overall, the present study supports the view that attentional resource allocation in multisensory processing is task-dependent and suggests that multisensory benefits are not dependent on attentional resources.",
keywords = "Journal Article",
author = "Basil Wahn and Supriya Murali and Scott Sinnett and Peter K{\"o}nig",
year = "2017",
month = feb,
day = "17",
doi = "10.1177/2041669516688026",
language = "English",
volume = "8",
pages = "2041669516688026",
journal = "I-PERCEPTION",
issn = "2041-6695",
publisher = "Pion Ltd.",
number = "1",

}

RIS

TY - JOUR

T1 - Auditory Stimulus Detection Partially Depends on Visuospatial Attentional Resources

AU - Wahn, Basil

AU - Murali, Supriya

AU - Sinnett, Scott

AU - König, Peter

PY - 2017/2/17

Y1 - 2017/2/17

N2 - Humans' ability to detect relevant sensory information while being engaged in a demanding task is crucial in daily life. Yet, limited attentional resources restrict information processing. To date, it is still debated whether there are distinct pools of attentional resources for each sensory modality and to what extent the process of multisensory integration is dependent on attentional resources. We addressed these two questions using a dual task paradigm. Specifically, participants performed a multiple object tracking task and a detection task either separately or simultaneously. In the detection task, participants were required to detect visual, auditory, or audiovisual stimuli at varying stimulus intensities that were adjusted using a staircase procedure. We found that tasks significantly interfered. However, the interference was about 50% lower when tasks were performed in separate sensory modalities than in the same sensory modality, suggesting that attentional resources are partly shared. Moreover, we found that perceptual sensitivities were significantly improved for audiovisual stimuli relative to unisensory stimuli regardless of whether attentional resources were diverted to the multiple object tracking task or not. Overall, the present study supports the view that attentional resource allocation in multisensory processing is task-dependent and suggests that multisensory benefits are not dependent on attentional resources.

AB - Humans' ability to detect relevant sensory information while being engaged in a demanding task is crucial in daily life. Yet, limited attentional resources restrict information processing. To date, it is still debated whether there are distinct pools of attentional resources for each sensory modality and to what extent the process of multisensory integration is dependent on attentional resources. We addressed these two questions using a dual task paradigm. Specifically, participants performed a multiple object tracking task and a detection task either separately or simultaneously. In the detection task, participants were required to detect visual, auditory, or audiovisual stimuli at varying stimulus intensities that were adjusted using a staircase procedure. We found that tasks significantly interfered. However, the interference was about 50% lower when tasks were performed in separate sensory modalities than in the same sensory modality, suggesting that attentional resources are partly shared. Moreover, we found that perceptual sensitivities were significantly improved for audiovisual stimuli relative to unisensory stimuli regardless of whether attentional resources were diverted to the multiple object tracking task or not. Overall, the present study supports the view that attentional resource allocation in multisensory processing is task-dependent and suggests that multisensory benefits are not dependent on attentional resources.

KW - Journal Article

U2 - 10.1177/2041669516688026

DO - 10.1177/2041669516688026

M3 - SCORING: Journal article

C2 - 28203353

VL - 8

SP - 2041669516688026

JO - I-PERCEPTION

JF - I-PERCEPTION

SN - 2041-6695

IS - 1

ER -