Prediction of attentional focus from respiration with simple feed-forward and time delay neural networks

  • Michael Christopher Melnychuk
  • Peter R. Murphy
  • Ian H. Robertson
  • Joshua H. Balsters
  • Paul M. Dockree

Abstract

Current methods to infer an agent’s state of attentional focus rely on scalp potential recordings and pupil diameter measurements, both of which are impractical in many real-world situations and are prone to movement and electrical artifacts. Being able to predict attentional performance from a simple, noninvasive measure such as respiration would have clear potential benefits for simplifying measurement and improving task performance in many settings, and could also be employed clinically with attentionally compromised populations for training and rehabilitation. It has been suggested that respiration and attention comprise a neuro-physiologically coupled system, and behavioral data have indicated that attentional performance, including reaction time and reaction time variability (RTV), covaries with respiratory dynamics. In the present study, we tested several neural network configurations for the prediction of attentional control state (RTV) from respiratory parameters. We observed significant predictive power derived solely from respiratory input, and conclude that a robust and portable feedback device utilizing soft computation is feasible for this purpose. We suggest specific model and data source improvements that could further reduce prediction errors.
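For illustration only, the sketch below shows the general shape of the approach named in the title: a small feed-forward network trained on a time-delay (lagged) embedding of a respiratory signal to predict a per-trial RTV score. The synthetic data, lag count, feature construction, and network size are assumptions made for this example and are not taken from the paper.

    # Hedged sketch, not the authors' implementation: time-delay
    # feed-forward regression from respiration to RTV.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Synthetic stand-ins: one respiratory measure per trial (e.g., a
    # belt-derived amplitude) and an RTV score per trial.
    n_trials = 500
    respiration = rng.normal(size=n_trials)
    rtv = 0.4 * respiration + rng.normal(scale=0.5, size=n_trials)

    # Time-delay embedding: each input vector contains the current
    # respiratory sample plus the preceding n_lags samples.
    n_lags = 5
    X = np.column_stack(
        [respiration[n_lags - k: n_trials - k] for k in range(n_lags + 1)]
    )
    y = rtv[n_lags:]

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0
    )

    # Small feed-forward network operating on the lagged inputs.
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    model.fit(X_train, y_train)
    print("held-out R^2:", model.score(X_test, y_test))

A purely feed-forward variant of the same idea would simply drop the lagged columns and use only the current-trial respiratory features as input.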

Bibliographic data

Original language: English
ISSN: 0941-0643
DOIs
Status: Published - 01.09.2020

Notes from the Dean's Office

Funding Information:
MM and PD were supported by Irish Research Council Laureate Grant: 201911.

Publisher Copyright:
© 2020, Springer-Verlag London Ltd., part of Springer Nature.
