Multisensory teamwork: using a tactile or an auditory display to exchange gaze information improves performance in joint visual search

Standard

Multisensory teamwork: using a tactile or an auditory display to exchange gaze information improves performance in joint visual search. / Wahn, Basil; Schwandt, Jessika; Krüger, Matti; Crafa, Daina; Nunnendorf, Vanessa; König, Peter.

in: ERGONOMICS, Vol. 59, No. 6, 01.06.2016, pp. 781-795.

Publications: SCORING: Contribution to journal/newspaper · SCORING: Journal article · Research · Peer-reviewed

BibTeX

@article{99cdf1752c48428ea788fe6868daa68f,
title = "Multisensory teamwork: using a tactile or an auditory display to exchange gaze information improves performance in joint visual search",
abstract = "In joint tasks, adjusting to the actions of others is critical for success. For joint visual search tasks, research has shown that when search partners visually receive information about each other's gaze, they use this information to adjust to each other's actions, resulting in faster search performance. The present study used a visual, a tactile and an auditory display, respectively, to provide search partners with information about each other's gaze. Results showed that search partners performed faster when the gaze information was received via a tactile or auditory display in comparison to receiving it via a visual display or receiving no gaze information. Findings demonstrate the effectiveness of tactile and auditory displays for receiving task-relevant information in joint tasks and are applicable to circumstances in which little or no visual information is available or the visual modality is already taxed with a demanding task such as air-traffic control. Practitioner Summary: The present study demonstrates that tactile and auditory displays are effective for receiving information about actions of others in joint tasks. Findings are either applicable to circumstances in which little or no visual information is available or when the visual modality is already taxed with a demanding task.",
author = "Basil Wahn and Jessika Schwandt and Matti Kr{\"u}ger and Daina Crafa and Vanessa Nunnendorf and Peter K{\"o}nig",
year = "2016",
month = jun,
day = "1",
doi = "10.1080/00140139.2015.1099742",
language = "English",
volume = "59",
pages = "781--95",
journal = "ERGONOMICS",
issn = "0014-0139",
publisher = "Taylor and Francis Ltd.",
number = "6",
}
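
The BibTeX entry above is a flat set of key = "value" lines, so its fields can be pulled out with a few lines of code. The Python sketch below is a minimal illustration only (it skips unquoted fields such as month = jun, and a library such as bibtexparser is the more robust choice for real BibTeX input); the field values are taken from the record above.

import re

# Minimal sketch: collect the quoted key = "value" fields of the entry
# above into a dictionary. Not a full BibTeX parser.
bibtex = r'''
@article{99cdf1752c48428ea788fe6868daa68f,
title = "Multisensory teamwork: using a tactile or an auditory display to exchange gaze information improves performance in joint visual search",
author = "Basil Wahn and Jessika Schwandt and Matti Kr{\"u}ger and Daina Crafa and Vanessa Nunnendorf and Peter K{\"o}nig",
year = "2016",
doi = "10.1080/00140139.2015.1099742",
volume = "59",
pages = "781--795",
journal = "ERGONOMICS",
issn = "0014-0139",
number = "6",
}
'''

fields = {}
for line in bibtex.splitlines():
    # Match lines of the form: key = "value",  (greedy up to the last quote)
    match = re.match(r'\s*(\w+)\s*=\s*"(.*)",?\s*$', line)
    if match:
        key, value = match.groups()
        fields[key.lower()] = value

print(fields["journal"], fields["volume"], fields["number"], fields["pages"])
# ERGONOMICS 59 6 781--795
print(fields["author"].split(" and ")[0])
# Basil Wahn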

RIS

TY - JOUR

T1 - Multisensory teamwork: using a tactile or an auditory display to exchange gaze information improves performance in joint visual search

AU - Wahn, Basil

AU - Schwandt, Jessika

AU - Krüger, Matti

AU - Crafa, Daina

AU - Nunnendorf, Vanessa

AU - König, Peter

PY - 2016/6/1

Y1 - 2016/6/1

N2 - In joint tasks, adjusting to the actions of others is critical for success. For joint visual search tasks, research has shown that when search partners visually receive information about each other's gaze, they use this information to adjust to each other's actions, resulting in faster search performance. The present study used a visual, a tactile and an auditory display, respectively, to provide search partners with information about each other's gaze. Results showed that search partners performed faster when the gaze information was received via a tactile or auditory display in comparison to receiving it via a visual display or receiving no gaze information. Findings demonstrate the effectiveness of tactile and auditory displays for receiving task-relevant information in joint tasks and are applicable to circumstances in which little or no visual information is available or the visual modality is already taxed with a demanding task such as air-traffic control. Practitioner Summary: The present study demonstrates that tactile and auditory displays are effective for receiving information about actions of others in joint tasks. Findings are either applicable to circumstances in which little or no visual information is available or when the visual modality is already taxed with a demanding task.

AB - In joint tasks, adjusting to the actions of others is critical for success. For joint visual search tasks, research has shown that when search partners visually receive information about each other's gaze, they use this information to adjust to each other's actions, resulting in faster search performance. The present study used a visual, a tactile and an auditory display, respectively, to provide search partners with information about each other's gaze. Results showed that search partners performed faster when the gaze information was received via a tactile or auditory display in comparison to receiving it via a visual display or receiving no gaze information. Findings demonstrate the effectiveness of tactile and auditory displays for receiving task-relevant information in joint tasks and are applicable to circumstances in which little or no visual information is available or the visual modality is already taxed with a demanding task such as air-traffic control. Practitioner Summary: The present study demonstrates that tactile and auditory displays are effective for receiving information about actions of others in joint tasks. Findings are either applicable to circumstances in which little or no visual information is available or when the visual modality is already taxed with a demanding task.

U2 - 10.1080/00140139.2015.1099742

DO - 10.1080/00140139.2015.1099742

M3 - SCORING: Journal article

C2 - 26587687

VL - 59

SP - 781

EP - 795

JO - ERGONOMICS

JF - ERGONOMICS

SN - 0014-0139

IS - 6

ER -
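
The RIS record follows a similarly simple layout: each line is a two-letter tag, a separator, and a value, with repeatable tags such as AU (one per author). The Python sketch below is a minimal illustration of reading the record into a dictionary of lists, using the values from the record above; a dedicated RIS library (e.g. rispy) would be the more robust choice for real use.

import re

# Minimal sketch: read the RIS record above into a dict of lists
# (tags such as AU repeat, so every tag maps to a list of values).
ris = """TY  - JOUR
T1  - Multisensory teamwork: using a tactile or an auditory display to exchange gaze information improves performance in joint visual search
AU  - Wahn, Basil
AU  - Schwandt, Jessika
AU  - Krüger, Matti
AU  - Crafa, Daina
AU  - Nunnendorf, Vanessa
AU  - König, Peter
PY  - 2016/6/1
DO  - 10.1080/00140139.2015.1099742
VL  - 59
SP  - 781
EP  - 795
JO  - ERGONOMICS
SN  - 0014-0139
IS  - 6
ER  -
"""

record = {}
for line in ris.splitlines():
    # Standard RIS layout is "TAG  - value"; the separator is matched
    # loosely here because exports sometimes vary in spacing.
    match = re.match(r"^([A-Z][A-Z0-9])\s+-\s*(.*)$", line)
    if match:
        tag, value = match.group(1), match.group(2).strip()
        record.setdefault(tag, []).append(value)

print(record["AU"])                      # all six author names
print(record["JO"][0], record["VL"][0],  # ERGONOMICS 59
      record["SP"][0], record["EP"][0])  # 781 795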