Simulating Brain Signals: Creating Synthetic EEG Data via Neural-Based Generative Models for Improved SSVEP Classification
Aznan, N.K.N.; Atapour-Abarghouei, A.; Bonner, S.; Connolly, J.D.; Al Moubayed, N.; Breckon, T.P.
Authors
N.K.N. Aznan
Dr Amir Atapour-Abarghouei, Assistant Professor (amir.atapour-abarghouei@durham.ac.uk)
S. Bonner
J.D. Connolly
N. Al Moubayed
Professor Toby Breckon (toby.breckon@durham.ac.uk)
Abstract
Despite significant recent progress in the area of Brain-Computer Interface (BCI), there are numerous shortcomings associated with collecting Electroencephalography (EEG) signals in real-world environments. These include, but are not limited to, subject and session data variance, long and arduous calibration processes and predictive generalisation issues across different subjects or sessions. This implies that many downstream applications, including Steady State Visual Evoked Potential (SSVEP) based classification systems, can suffer from a shortage of reliable data. Generating meaningful and realistic synthetic data can therefore be of significant value in circumventing this problem. We explore the use of modern neural-based generative models trained on a limited quantity of EEG data collected from different subjects to generate supplementary synthetic EEG signal vectors, subsequently utilised to train an SSVEP classifier. Extensive experimental analysis demonstrates the efficacy of our generated data, leading to improvements across a variety of evaluations, with the crucial task of cross-subject generalisation improving by over 35% with the use of such synthetic data.
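The workflow the abstract outlines (train a neural generative model on a limited pool of real EEG recordings, sample supplementary synthetic EEG signal vectors, then train an SSVEP classifier on the combined data) can be sketched as follows. This is a minimal, illustrative PyTorch sketch only: the network architectures, tensor shapes, class count, labelling scheme and training loop are assumptions made for demonstration and are not the models or hyperparameters used in the paper, which should be consulted for the actual generative approaches evaluated.

```python
# Illustrative sketch only: a GAN-style generator for synthetic EEG vectors,
# plus an SSVEP classifier trained on real + synthetic data.
# All architectures, shapes and hyperparameters are assumptions, not taken
# from the paper.
import torch
import torch.nn as nn

N_CHANNELS, N_SAMPLES = 8, 256       # assumed EEG trial shape (channels x time points)
N_CLASSES, LATENT_DIM = 4, 100       # assumed number of SSVEP target frequencies
FEAT_DIM = N_CHANNELS * N_SAMPLES    # flattened EEG vector length


class Generator(nn.Module):
    """Maps a latent noise vector to a flattened synthetic EEG vector."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM, 512), nn.ReLU(),
            nn.Linear(512, FEAT_DIM), nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)


class Discriminator(nn.Module):
    """Scores whether a flattened EEG vector looks real or synthetic."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(FEAT_DIM, 512), nn.LeakyReLU(0.2),
            nn.Linear(512, 1),
        )

    def forward(self, x):
        return self.net(x)


class SSVEPClassifier(nn.Module):
    """Placeholder classifier over flattened EEG vectors."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(FEAT_DIM, 256), nn.ReLU(),
            nn.Linear(256, N_CLASSES),
        )

    def forward(self, x):
        return self.net(x)


# Placeholder "real" data; in practice these would be recorded SSVEP trials.
real_x = torch.randn(128, FEAT_DIM)
real_y = torch.randint(0, N_CLASSES, (128,))

G, D, C = Generator(), Discriminator(), SSVEPClassifier()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

# 1) Train the generative model on the limited pool of real EEG vectors.
for _ in range(100):
    z = torch.randn(real_x.size(0), LATENT_DIM)
    fake_x = G(z)

    # Discriminator step: real vectors -> 1, synthetic vectors -> 0.
    opt_d.zero_grad()
    d_loss = bce(D(real_x), torch.ones(real_x.size(0), 1)) + \
             bce(D(fake_x.detach()), torch.zeros(real_x.size(0), 1))
    d_loss.backward()
    opt_d.step()

    # Generator step: fool the discriminator into scoring synthetic data as real.
    opt_g.zero_grad()
    g_loss = bce(D(fake_x), torch.ones(real_x.size(0), 1))
    g_loss.backward()
    opt_g.step()

# 2) Sample synthetic EEG vectors and combine them with the real data.
with torch.no_grad():
    synth_x = G(torch.randn(256, LATENT_DIM))
# Random labels here only as a placeholder; in practice labels would come from
# class-conditional generation or per-class generators.
synth_y = torch.randint(0, N_CLASSES, (256,))

train_x = torch.cat([real_x, synth_x])
train_y = torch.cat([real_y, synth_y])

# 3) Train the SSVEP classifier on the augmented data set.
opt_c = torch.optim.Adam(C.parameters(), lr=1e-3)
ce = nn.CrossEntropyLoss()
for _ in range(100):
    opt_c.zero_grad()
    loss = ce(C(train_x), train_y)
    loss.backward()
    opt_c.step()
```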
Citation
Aznan, N., Atapour-Abarghouei, A., Bonner, S., Connolly, J., Al Moubayed, N., & Breckon, T. (2019, July). Simulating Brain Signals: Creating Synthetic EEG Data via Neural-Based Generative Models for Improved SSVEP Classification. Presented at the International Joint Conference on Neural Networks (IJCNN), Budapest, Hungary.
| Presentation Conference Type | Conference Paper (published) |
| --- | --- |
| Conference Name | International Joint Conference on Neural Networks (IJCNN) |
| Acceptance Date | Mar 7, 2019 |
| Publication Date | Jul 14, 2019 |
| Deposit Date | Mar 25, 2019 |
| Publicly Available Date | Nov 13, 2019 |
| Pages | 1-8 |
| Series ISSN | 2161-4407 |
| Book Title | 2019 International Joint Conference on Neural Networks (IJCNN); proceedings |
| DOI | https://doi.org/10.1109/ijcnn.2019.8852227 |
| Public URL | https://durham-repository.worktribe.com/output/1144574 |
| Related Public URLs | arXiv:1901.07429 |
Files
Accepted Conference Proceeding (PDF, 1.6 MB)
Copyright Statement
© 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.