Effect- and Performance-Based Auditory Feedback on Interpersonal Coordination
Standard
Effect- and Performance-Based Auditory Feedback on Interpersonal Coordination. / Hwang, Tong-Hun; Schmitz, Gerd; Klemmt, Kevin; Brinkop, Lukas; Ghai, Shashank; Stoica, Mircea; Maye, Alexander; Blume, Holger; Effenberg, Alfred O.
in: FRONT PSYCHOL, Volume 9, 2018, p. 404. Publications: SCORING: Contribution to journal/newspaper › SCORING: Journal article › Research › Peer review
RIS
TY - JOUR
T1 - Effect- and Performance-Based Auditory Feedback on Interpersonal Coordination
AU - Hwang, Tong-Hun
AU - Schmitz, Gerd
AU - Klemmt, Kevin
AU - Brinkop, Lukas
AU - Ghai, Shashank
AU - Stoica, Mircea
AU - Maye, Alexander
AU - Blume, Holger
AU - Effenberg, Alfred O.
PY - 2018
Y1 - 2018
N2 - When two individuals interact in a collaborative task, such as carrying a sofa or a table, usually spatiotemporal coordination of individual motor behavior will emerge. In many cases, interpersonal coordination can arise independently of verbal communication, based on the observation of the partners' movements and/or the object's movements. In this study, we investigate how social coupling between two individuals can emerge in a collaborative task under different modes of perceptual information. A visual reference condition was compared with three different conditions with new types of additional auditory feedback provided in real time: effect-based auditory feedback, performance-based auditory feedback, and combined effect/performance-based auditory feedback. We have developed a new paradigm in which the actions of both participants continuously result in a seamlessly merged effect on an object simulated by a tablet computer application. Here, participants should temporally synchronize their movements with a 90° phase difference and precisely adjust the finger dynamics in order to keep the object (a ball) accurately rotating on a given circular trajectory on the tablet. Results demonstrate that interpersonal coordination in a joint task can be altered by different kinds of additional auditory information in various ways.
KW - Journal Article
U2 - 10.3389/fpsyg.2018.00404
DO - 10.3389/fpsyg.2018.00404
M3 - SCORING: Journal article
C2 - 29651263
VL - 9
SP - 404
JO - FRONT PSYCHOL
JF - FRONT PSYCHOL
SN - 1664-1078
ER -