Project Overview
A real-time brain-computer interface (BCI) pipeline that classifies "grab" vs. "throw" motor actions from EEG signals, with offline training and live prediction.
The system decodes motor intent from EEG data, trains a CSP + LDA classifier, and streams live predictions with confidence-weighted feedback.
Training covers EEG preprocessing, epoching around grab and throw markers, CSP feature extraction, calibrated LDA fitting, and model artifact export.
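The training steps above can be sketched as follows. This is a minimal illustration, not the project's actual code: the CSP implementation, the synthetic `grab`/`throw` epochs, and all function names here are assumptions for demonstration, using only numpy, scipy, and scikit-learn (a real pipeline would typically epoch real recordings around event markers first).

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.calibration import CalibratedClassifierCV

def csp_filters(epochs_a, epochs_b, n_components=4):
    """Common Spatial Patterns: spatial filters maximizing the
    variance ratio between two classes. Epochs are shaped
    (n_epochs, n_channels, n_samples)."""
    cov_a = np.mean([e @ e.T / np.trace(e @ e.T) for e in epochs_a], axis=0)
    cov_b = np.mean([e @ e.T / np.trace(e @ e.T) for e in epochs_b], axis=0)
    # Generalized eigendecomposition of cov_a against the composite covariance
    vals, vecs = eigh(cov_a, cov_a + cov_b)
    order = np.argsort(vals)
    # Keep filters from both ends of the eigenvalue spectrum
    pick = np.concatenate([order[:n_components // 2], order[-n_components // 2:]])
    return vecs[:, pick].T          # (n_components, n_channels)

def csp_features(epochs, W):
    """Log-variance of spatially filtered epochs (standard CSP features)."""
    filtered = np.einsum('ck,nkt->nct', W, epochs)
    var = filtered.var(axis=2)
    return np.log(var / var.sum(axis=1, keepdims=True))

# Synthetic stand-in for preprocessed, epoched data (hypothetical labels)
rng = np.random.default_rng(0)
def simulate(scale, n=40, ch=8, t=128):
    x = rng.standard_normal((n, ch, t))
    x[:, 0] *= scale                # class-dependent variance on channel 0
    return x

grab, throw = simulate(3.0), simulate(1.0)
W = csp_filters(grab, throw)
X = np.vstack([csp_features(grab, W), csp_features(throw, W)])
y = np.array([0] * 40 + [1] * 40)

# Calibrated LDA: cross-validated probability calibration on top of LDA
clf = CalibratedClassifierCV(LinearDiscriminantAnalysis(), cv=5).fit(X, y)
proba = clf.predict_proba(X)
```

After fitting, `W` and `clf` are the artifacts one would serialize (e.g. with `joblib`) for the online script; the calibrated probabilities are what drive the confidence-weighted feedback.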
The online script connects to LSL streams, maintains a rolling EEG buffer, and emits predictions during the analysis window for live feedback.
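The rolling-buffer idea can be sketched with plain numpy. This is an assumed design, not the project's implementation: the `EEGRingBuffer` class and its methods are hypothetical, and the LSL connection (typically `pylsl.StreamInlet.pull_chunk` in Python) is represented only by the incoming `chunk` arrays.

```python
import numpy as np

class EEGRingBuffer:
    """Fixed-size rolling buffer holding the most recent EEG samples,
    shaped (n_channels, n_samples)."""

    def __init__(self, n_channels, n_samples):
        self.buf = np.zeros((n_channels, n_samples))
        self.filled = 0

    def push(self, chunk):
        # chunk: (n_channels, k) newest samples; shift old data left
        k = chunk.shape[1]
        self.buf = np.roll(self.buf, -k, axis=1)
        self.buf[:, -k:] = chunk
        self.filled = min(self.filled + k, self.buf.shape[1])

    def ready(self):
        # True once a full analysis window has accumulated
        return self.filled >= self.buf.shape[1]

    def window(self):
        # Latest full window, oldest sample first
        return self.buf.copy()

# Simulated stream: 2-second window at 125 Hz, pushed in 25-sample chunks
ring = EEGRingBuffer(n_channels=8, n_samples=250)
stream = np.arange(8 * 300, dtype=float).reshape(8, 300)
for start in range(0, 300, 25):
    ring.push(stream[:, start:start + 25])
win = ring.window()
```

In the online loop, each time `ready()` is true the window would be passed through the saved CSP filters and calibrated classifier, and the resulting probability emitted as feedback.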