Please use this identifier to cite or link to this item: https://doi.org/10.21256/zhaw-22244
Publication type: Article in scientific journal
Type of review: Peer review (publication)
Title: Closed-loop EEG study on visual recognition during driving
Authors: Aydarkhanov, Ruslan
Ušćumlić, Marija
Chavarriaga, Ricardo
Gheorghe, Lucian
Millán, José del R
et al.: No
DOI: 10.1088/1741-2552/abdfb2; 10.21256/zhaw-22244
Published in: Journal of Neural Engineering
Volume: 18
Issue: 2
Pages: 026010
Issue Date: 2021
Publisher / Ed. Institution: IOP Publishing
ISSN: 1741-2552; 1741-2560
Language: English
Subjects: Brain-computer interface; Driving; Electroencephalography; Eye tracking; Visual recognition
Subject (DDC): 150: Psychology
Abstract: Objective. In contrast to the classical visual brain–computer interface (BCI) paradigms, which adhere to a rigid trial structure and restricted user behavior, electroencephalogram (EEG)-based decoding of visual recognition during our daily activities remains challenging. The objective of this study is to explore the feasibility of decoding the EEG signature of visual recognition in experimental conditions promoting our natural ocular behavior when interacting with our dynamic environment. Approach. In our experiment, subjects visually search for a target object among suddenly appearing objects in the environment while driving a car simulator. Given that subjects exhibit unconstrained overt visual behavior, we based our study on eye fixation-related potentials (EFRPs). We report on gaze behavior and single-trial EFRP decoding performance (fixations on visually similar target vs. non-target objects). In addition, we demonstrate the application of our approach in a closed-loop BCI setup. Main results. To identify the target out of four symbol types along a road segment, the BCI system integrated decoding probabilities across multiple EFRPs and achieved an average online accuracy of 0.37 ± 0.06 (12 subjects), statistically significantly above chance level. Using the acquired data, we performed a comparative study of classification algorithms (discriminating target vs. non-target) and feature spaces in a simulated online scenario. The EEG approaches yielded similarly moderate performances of at most 0.6 AUC, yet statistically significantly above chance level. In addition, gaze duration (dwell time) appears to be an additional informative feature in this context. Significance. These results show that visual recognition of sudden events can be decoded during active driving. Therefore, this study lays a foundation for assistive and recommender systems based on the driver's brain signals.
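The abstract describes the closed-loop step as integrating single-fixation decoding probabilities across multiple EFRPs to identify the target among four symbol types. The paper does not specify the combination rule here; the following is a minimal sketch of one plausible evidence-accumulation scheme (summing per-fixation log-odds per symbol type), with illustrative probabilities that are not from the study.

```python
import numpy as np

def pick_target(fixation_symbols, target_probs):
    """Accumulate target evidence per symbol type across fixations.

    fixation_symbols: list of symbol indices (0..3), one per fixation
    target_probs: list of P(target | EFRP) for each fixation, from a
                  hypothetical single-trial classifier
    Returns the index of the symbol with the strongest accumulated evidence.
    """
    evidence = np.zeros(4)
    for sym, p in zip(fixation_symbols, target_probs):
        # Clip to avoid infinite log-odds at p = 0 or 1
        p = np.clip(p, 1e-6, 1 - 1e-6)
        # Sum log-odds: independent-evidence (naive Bayes style) combination
        evidence[sym] += np.log(p / (1 - p))
    return int(np.argmax(evidence))

# Illustrative run: symbol 2 is fixated repeatedly with high target probability
symbols = [0, 1, 2, 3, 2, 2, 1]
probs = [0.4, 0.3, 0.8, 0.45, 0.7, 0.75, 0.35]
print(pick_target(symbols, probs))  # → 2
```

With moderate single-fixation performance (≈0.6 AUC, as reported), accumulating evidence over several fixations per symbol is what lifts the four-way choice above the 0.25 chance level.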
URI: https://digitalcollection.zhaw.ch/handle/11475/22244
Fulltext version: Accepted version
License (according to publishing contract): Licence according to publishing contract
Restricted until: 2022-02-26
Departement: School of Engineering
Organisational Unit: Centre for Artificial Intelligence (CAI)
Institute of Applied Information Technology (InIT)
Appears in collections: Publikationen School of Engineering

Files in This Item:
File: 2021_Aydarkhanov-etal_Closed-loop-EEG-study-visual-recognition-driving.pdf (restricted until 2022-02-26)
Description: Accepted Version
Size: 3.29 MB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.