The goal of this project is to create a simple Python-based HCI (specifically, BCI) musical instrument by integrating input from an external EEG sensor with a MIDI-based music-making algorithm. The EEG sensor, a single electrode placed on the forehead, records fluctuations in the electric field through the skull caused by large-scale changes in brain activity. These data are transmitted wirelessly to the computer as a time series, and the signal is then decomposed into frequency bands. An algorithm will be employed to determine which activities, if any, produce consistent and distinctive signals (for example, blinking, thinking, or focusing). Once calibrated, the user plays the instrument by reproducing these activities; the resulting signals trigger MIDI notes played through a virtual synthesizer.
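The pipeline above can be sketched in a few lines of Python: decompose one window of the EEG time series into the standard frequency bands with a naive DFT, then emit a raw MIDI note-on message when a band's power dominates. This is a minimal illustration, not the project's implementation; the sample rate, band edges, and trigger condition are assumptions chosen for the example.

```python
import cmath
import math

SAMPLE_RATE = 256  # Hz; an assumed rate typical of consumer EEG headsets

BANDS = {          # illustrative EEG band edges in Hz
    "delta": (1, 4),
    "theta": (4, 8),
    "alpha": (8, 13),
    "beta": (13, 30),
}

def band_powers(samples):
    """Average spectral power per band for one window, via a naive DFT."""
    n = len(samples)
    spectrum = [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))) ** 2
                for k in range(n // 2)]
    powers = {}
    for name, (lo, hi) in BANDS.items():
        k_lo = int(lo * n / SAMPLE_RATE)   # map band edges in Hz to DFT bins
        k_hi = int(hi * n / SAMPLE_RATE)
        bins = spectrum[k_lo:k_hi] or [0.0]
        powers[name] = sum(bins) / len(bins)
    return powers

def note_on_bytes(note, velocity=100, channel=0):
    """Raw MIDI note-on message: status byte 0x90 | channel, note, velocity."""
    return bytes([0x90 | channel, note & 0x7F, velocity & 0x7F])

# Usage: a window dominated by a 10 Hz (alpha-band) oscillation
window = [math.sin(2 * math.pi * 10 * t / SAMPLE_RATE) for t in range(256)]
powers = band_powers(window)
if powers["alpha"] > powers["beta"]:       # stand-in for a calibrated threshold
    msg = note_on_bytes(60)                # trigger middle C
```

In practice the per-band thresholds would come from the calibration phase described above, and the message bytes would be handed to a MIDI backend connected to the virtual synthesizer rather than constructed by hand.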
The source code for this project can be found here.
The a.py script runs an instance of the project.