Coordinating With a Robot Partner Affects Neural Processing Related to Action Monitoring

  • Artur Czeszumski (shared first author)
  • Anna L. Gert (shared first author)
  • Ashima Keshava (shared first author)
  • Ali Ghadirzadeh
  • Tilman Kalthoff
  • Benedikt V. Ehinger
  • Max Tiessen
  • Mårten Björkman
  • Danica Kragic
  • Peter König

Abstract

Robots are starting to play a role in our social landscape, and they are becoming progressively more responsive, both physically and socially. This raises the question of how humans react to and interact with robots in a coordinated manner, and what the neural underpinnings of such behavior are. This exploratory study aims to understand the differences between human-human and human-robot interactions at the behavioral level and from a neurophysiological perspective. For this purpose, we adapted a collaborative dynamical paradigm from the literature. We asked 12 participants to hold two corners of a tablet while collaboratively guiding a ball around a circular track, either with another participant or with a robot. At irregular intervals, the ball was perturbed outward, creating an artificial error that required corrective action to return the ball to the track. Concurrently, we recorded electroencephalography (EEG). In the behavioral data, we found higher ball velocity and greater positional error from the track in the human-human condition than in the human-robot condition. For the EEG data, we computed event-related potentials (ERPs). We found a significant difference between human and robot partners, driven by significant clusters at fronto-central electrodes. Amplitudes were stronger with a robot partner, suggesting different neural processing. All in all, our exploratory study suggests that coordinating with robots affects processing related to action monitoring. In the investigated paradigm, human participants treat errors during human-robot interaction differently from those made during interactions with other humans. These results could improve communication between humans and robots through the use of neural activity in real time.
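A minimal sketch of the two measures named in the abstract: the ball's positional error from the circular track, and ERPs time-locked to the perturbation events. It assumes MNE-Python as the EEG toolbox; the file name, trigger codes, and epoch window are illustrative assumptions, not details taken from the paper.

```python
# Sketch of the abstract's behavioral and neural measures.
# All file names and event codes below are hypothetical placeholders.
import numpy as np
import mne


def positional_error(ball_xy: np.ndarray, track_radius: float) -> np.ndarray:
    """Absolute radial deviation of the ball from a circular track
    centered at the origin, per sample. ball_xy has shape (n_samples, 2)."""
    radial_distance = np.linalg.norm(ball_xy, axis=1)
    return np.abs(radial_distance - track_radius)


# Load continuous EEG (hypothetical file) and find perturbation triggers.
raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)
events = mne.find_events(raw)

# Epoch around perturbation onset, separately per partner condition
# (event codes 1 and 2 are assumed for illustration).
epochs = mne.Epochs(
    raw,
    events,
    event_id={"perturbation/human": 1, "perturbation/robot": 2},
    tmin=-0.2,
    tmax=0.8,
    baseline=(-0.2, 0.0),
    preload=True,
)

# Average epochs per condition to obtain ERPs; fronto-central differences
# between conditions would then be assessed with a cluster-based test.
erp_human = epochs["perturbation/human"].average()
erp_robot = epochs["perturbation/robot"].average()
```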

Bibliographic Data

Original language: English
Article number: 686010
ISSN: 1662-5218
DOIs
Status: Published - 11.08.2021

Notes from the Dean's Office

Funding Information:
We would like to thank all partners in the socSMC consortium. Especially, we thank Alfred O. Effenberg and Tong-Hun Hwang for providing the sonification algorithm, and the help with implementing it. We gratefully acknowledge the support by the European Commission Horizon H2020-FETPROACT-2014 641321-socSMCs, Deutsche Forschungsgemeinschaft (DFG) funded Research Training Group Situated Cognition (GRK 2185/1), Niedersächsisches Innovationsförderprogramm für Forschung und Entwicklung in Unternehmen (NBank) - EyeTrax, the German Federal Ministry of Education and Research funded project ErgoVR-16SV8052, and the DFG Open Access Publishing Fund of Osnabrück University. We acknowledge the support of Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy - EXC 2075 - 390740016 for BE.

Publisher Copyright:
© 2021 Czeszumski, Gert, Keshava, Ghadirzadeh, Kalthoff, Ehinger, Tiessen, Björkman, Kragic and König.