I built an EEG (brain-wave) drawing demo and exhibited it at DIYBio/Hacklab.to yesterday. It uses Muse, a consumer-grade EEG headset, which cannot read your mind but can at least tell how focused you are. During a one-minute training session, participants are asked to do several activities to collect samples at different focus levels; the FFT-ed samples are then mapped onto a 2D plane as dots, inspired by @kylemcdonald's [nice t-SNE demo](https://vimeo.com/135511186). After the training session, real-time data is mapped onto the same plane, but this time it forms lines - in short, participants can shift their focus level to move where the lines are drawn. Surprisingly, most participants understood the interaction between their focus level and the visuals, which is an encouraging result.
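The training step can be sketched roughly like this. This is a minimal, hypothetical reconstruction, not the demo's actual code: it uses NumPy's FFT for the spectral features and a PCA projection (via SVD) as a stand-in for t-SNE, since a plain t-SNE embedding cannot project new real-time windows onto the same plane the way a linear map can. The window size and sample counts are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fake EEG: 60 one-second training windows of 256 samples each.
# (Stand-in for the samples collected during the training session.)
fs = 256
train = rng.standard_normal((60, fs))

def features(windows):
    # Magnitude spectrum of each window via the real FFT.
    return np.abs(np.fft.rfft(windows, axis=-1))

X = features(train)
mean = X.mean(axis=0)

# PCA via SVD: top-2 principal directions give a linear 2D embedding
# that can also be applied to new real-time windows.
_, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
W = Vt[:2].T

def embed(windows):
    # Project FFT features onto the 2D plane.
    return (features(windows) - mean) @ W

pts = embed(train)                           # training dots on the plane
live = embed(rng.standard_normal((1, fs)))   # one real-time point
print(pts.shape, live.shape)                 # (60, 2) (1, 2)
```

In the demo the training dots stay fixed while the real-time points are connected into lines, so shifting focus moves the pen across the plane.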
I will bring it to the Axon Art collective, so come see us if you’re in Montreal!
(Data acquisition, signal processing, and machine learning are done in Python; I used ofxNumpy and OSC to read the data in openFrameworks.)
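Bridging the two halves over OSC amounts to sending each 2D point as a small UDP packet. As a stdlib-only sketch (the address `/muse/point` and port are hypothetical; in practice a library like python-osc on the sender and ofxOsc on the openFrameworks side handle this):

```python
import socket
import struct

def osc_padded(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    # Minimal OSC 1.0 message: address, type-tag string, big-endian float32 args.
    tags = "," + "f" * len(floats)
    return (osc_padded(address.encode())
            + osc_padded(tags.encode())
            + b"".join(struct.pack(">f", f) for f in floats))

# Hypothetical address and port; the openFrameworks sketch would listen
# on the same port with an ofxOscReceiver and draw the incoming points.
packet = osc_message("/muse/point", 0.25, -0.5)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 12345))
```

OSC over UDP keeps the latency low enough that the line follows focus shifts in real time, which matters more here than guaranteed delivery.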