Temporal Coupling of Brain Signals and Fine Motor Output Using Affordable EEG
· 2025 · Open Access
· DOI: https://doi.org/10.1109/access.2025.3587262
· OA: W4412164034
This study introduces a novel, publicly accessible EEG dataset acquired through an evoked motor execution paradigm, with the primary objective of examining the relationship between frequency activity of the sensorimotor cortex and index finger movements. A primary and distinctive contribution of this work is our precise temporal integration methodology, which synchronizes EEG signals, task-related visual and auditory cues, and exact muscle activation events (finger trigger presses) through the Lab Streaming Layer (LSL) protocol. This multimodal approach enhances dataset quality by enabling millisecond-precise alignment between stimulus presentation, cortical activation patterns, and the resulting motor behavior, a critical feature lacking in most publicly available motor-related EEG datasets. A group of 26 participants completed a total of 127 experimental sessions, during which EEG signals were recorded simultaneously with the corresponding trigger presses. Further underscoring the dataset's broad accessibility, all recordings were performed using an affordable, widely available EEG headset. We demonstrate the dataset's practical applicability for EEG-based motor decoding by developing and evaluating deep learning classification models, highlighting its suitability despite the use of a consumer-grade recording device. A real-time demonstration validated our approach in a virtual environment, where the highest-performing model successfully translated EEG signals into predicted finger movements. Our dataset, the detailed experimental methods for synchronized multimodal recording, and the codebase for data preprocessing and deep learning model training are all publicly available and open-source. The dataset is publicly available online, and the codebase is published at github.com/nemesgyadam/NeuroJoystickVault.
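
The central technical idea is time-stamping stimulus and muscle-activation events on the same clock as the EEG recording via Lab Streaming Layer. The sketch below is illustrative only, assuming a Python setup with pylsl; the stream name, marker labels, and the placeholder reaction delay are assumptions for the example, not values taken from the published dataset or codebase.

```python
# Illustrative sketch (not the authors' code): pushing cue and trigger-press
# markers over Lab Streaming Layer so they share a clock with the EEG stream.
# Stream names and marker labels are assumed for the example.
from pylsl import StreamInfo, StreamOutlet, local_clock
import time

# Marker stream: irregular rate (0 Hz), one string channel per event.
marker_info = StreamInfo(name="TaskMarkers", type="Markers",
                         channel_count=1, nominal_srate=0,
                         channel_format="string", source_id="task_markers_01")
marker_outlet = StreamOutlet(marker_info)

def send_marker(label: str) -> None:
    """Time-stamp an event (cue onset, trigger press) on the shared LSL clock."""
    marker_outlet.push_sample([label], local_clock())

# Example trial: visual/auditory cue, then the participant's trigger press.
send_marker("cue_onset/index_finger")
time.sleep(0.8)                              # placeholder reaction delay
send_marker("trigger_press/index_finger")
```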
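On the decoding side, a real-time consumer could pull EEG chunks from the headset's LSL outlet, buffer fixed-length windows, and pass them to a trained classifier, roughly mirroring the real-time demonstration described above. The following is a minimal sketch under assumed parameters (8 channels, 250 Hz, 1 s windows) with a hypothetical `predict_finger` stand-in for a trained model; none of these reflect the actual headset configuration or the deep learning models from the paper.

```python
# Illustrative sketch (not the authors' code): real-time decoding loop that
# buffers EEG from an LSL stream and classifies fixed-length windows.
# Channel count, sampling rate, and the model are placeholder assumptions.
import numpy as np
from pylsl import StreamInlet, resolve_byprop

SRATE = 250            # assumed sampling rate (Hz)
N_CH = 8               # assumed channel count
WIN_SEC = 1.0          # decoding window length (s)
WIN_SAMPLES = int(SRATE * WIN_SEC)

def predict_finger(window: np.ndarray) -> str:
    """Placeholder for a trained deep learning decoder over (channels, samples)."""
    return "rest" if np.abs(window).mean() < 1.0 else "index_press"

streams = resolve_byprop("type", "EEG", timeout=10.0)
if not streams:
    raise RuntimeError("No EEG stream found on the network")
inlet = StreamInlet(streams[0])

buffer = np.empty((0, N_CH), dtype=np.float32)
while True:
    chunk, timestamps = inlet.pull_chunk(timeout=0.1)
    if chunk:
        buffer = np.vstack([buffer, np.asarray(chunk, dtype=np.float32)])
    if len(buffer) >= WIN_SAMPLES:
        window = buffer[:WIN_SAMPLES].T        # (channels, samples) for the model
        print(predict_finger(window))
        buffer = buffer[WIN_SAMPLES:]          # slide to the next window
```

In practice, the inlet's clock offset relative to the recording machine can be queried with `StreamInlet.time_correction()` and applied to the received timestamps; whether and how the published pipeline does this is not stated in the abstract.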