Using Variable Natural Environment Brain-Computer Interface Stimuli for Real-time Humanoid Robot Navigation
Aznan, N.K.N.; Connolly, J.D.; Al Moubayed, N.; Breckon, T.P.
Authors
N.K.N. Aznan
J.D. Connolly
Dr Noura Al Moubayed (Associate Professor) noura.al-moubayed@durham.ac.uk
Professor Toby Breckon toby.breckon@durham.ac.uk
Abstract
This paper addresses the challenge of humanoid robot teleoperation in a natural indoor environment via a Brain-Computer Interface (BCI). We leverage deep Convolutional Neural Network (CNN) based image and signal understanding to facilitate both real-time object detection and dry-Electroencephalography (EEG) based decoding of human cortical brain bio-signals. We employ recent advances in dry-EEG technology to stream and collect cortical waveforms from subjects while they fixate on variable Steady State Visual Evoked Potential (SSVEP) stimuli generated directly from the environment the robot is navigating. To this end, we propose novel variable BCI stimuli that use the real-time video streamed from the on-board robot camera as the visual input for SSVEP, where the CNN-detected natural scene objects are altered and flickered at differing frequencies (10 Hz, 12 Hz and 15 Hz). These stimuli are unlike traditional SSVEP stimuli, as both the dimensions of the flicker regions and their on-screen positions change with the scene objects detected. On-screen object selection via this dry-EEG-enabled SSVEP methodology facilitates the online decoding of human cortical brain signals, via a specialised secondary CNN, directly into robot teleoperation commands (approach object, or move in a specific direction: right, left or back). The SSVEP decoding model is trained on a priori offline experimental data in which very similar visual input is present for all subjects. The resulting classification demonstrates high performance, with a mean accuracy of 85% for the real-time robot navigation experiment across multiple test subjects.
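As an illustration of the pipeline the abstract describes (flicker each CNN-detected object region at one of the 10/12/15 Hz stimulus frequencies, then map the decoded SSVEP class onto a navigation command), the minimal Python sketch below is a hypothetical reconstruction: the paper publishes no code, and every name here, along with the square-wave flicker model, is an assumption.

```python
import time

# Illustrative sketch only: all names here (render_stimuli,
# decode_to_command, etc.) are hypothetical, not the authors' code.

FLICKER_HZ = [10.0, 12.0, 15.0]  # SSVEP stimulus frequencies from the abstract
COMMANDS = ["approach object", "move right", "move left", "move back"]

def flicker_visible(freq_hz: float, t: float) -> bool:
    """Square-wave flicker: the region is drawn for the first half of each cycle."""
    return (t * freq_hz) % 1.0 < 0.5

def render_stimuli(detections, t):
    """Pair each CNN-detected object with a frequency and its current on/off state.

    `detections` holds (label, bounding_box) pairs from the object detector;
    because the boxes move and resize with the live camera feed, the flicker
    regions vary per frame, unlike fixed-layout SSVEP targets.
    """
    return [(label, box, freq, flicker_visible(freq, t))
            for (label, box), freq in zip(detections, FLICKER_HZ)]

def decode_to_command(class_index: int) -> str:
    """Map the decoder CNN's predicted class onto a teleoperation command."""
    return COMMANDS[class_index]

# Example frame: two objects detected in the robot's on-board camera view.
detections = [("door", (40, 60, 200, 400)), ("chair", (300, 220, 140, 160))]
now = time.monotonic() % 1.0  # time within the current one-second window
for label, box, freq, visible in render_stimuli(detections, now):
    print(f"{label} @ {box}: {freq} Hz flicker, visible={visible}")
print(decode_to_command(0))  # -> "approach object"
```

A square wave is the simplest possible flicker model; a real implementation would lock the on/off toggling to the display's refresh rate so that the 10, 12 and 15 Hz cycles stay phase-accurate over time.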
Citation
Aznan, N., Connolly, J., Al Moubayed, N., & Breckon, T. (2019, May). Using Variable Natural Environment Brain-Computer Interface Stimuli for Real-time Humanoid Robot Navigation. Presented at 2019 IEEE International Conference on Robotics and Automation (ICRA), Montreal, Canada.
| Presentation Conference Type | Conference Paper (published) |
| --- | --- |
| Conference Name | 2019 IEEE International Conference on Robotics and Automation (ICRA) |
| Start Date | May 20, 2019 |
| End Date | May 24, 2019 |
| Acceptance Date | Jan 26, 2019 |
| Online Publication Date | Aug 12, 2019 |
| Publication Date | Aug 12, 2019 |
| Deposit Date | Mar 4, 2019 |
| Publicly Available Date | Nov 12, 2019 |
| Pages | 4889-4895 |
| Series ISSN | 2577-087X |
| Book Title | 2019 International Conference on Robotics and Automation (ICRA): proceedings |
| DOI | https://doi.org/10.1109/icra.2019.8794060 |
| Public URL | https://durham-repository.worktribe.com/output/1143241 |
Files
Accepted Conference Proceeding (PDF, 4.9 MB)
Copyright Statement
© 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.