i see beats/cell phone sequencer

An idea from a few weeks ago, almost realized with Taeyoon Choi at OF Lab for the 5-word project “I Want To Play !”

http://vimeo.com/1781052

Cell phone light controls cells in a beat sequencer. Demo/intro near the beginning, hot beats around :50.
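If you're wondering how the camera side works at a high level, it's basically: threshold for bright spots, then bucket each blob into a grid cell. Here's a rough ofxOpenCv sketch of that idea (not the actual iSeeBeats code; the grid size, threshold value, and `cells` array are all just for illustration):

```cpp
#include "ofMain.h"
#include "ofxOpenCv.h"

const int GRID_COLS = 16;                       // one column per sequencer step (made up)
const int GRID_ROWS = 8;                        // one row per sound (made up)
bool cells[GRID_COLS][GRID_ROWS] = {{false}};

ofxCvContourFinder contourFinder;

// gray is the current camera frame, already converted to grayscale
void updateGrid(ofxCvGrayscaleImage& gray, int camWidth, int camHeight){
    gray.threshold(230);                        // only very bright things (phone screens) survive
    contourFinder.findContours(gray, 20, camWidth * camHeight / 4, 10, false);

    for(int i = 0; i < contourFinder.nBlobs; i++){
        ofPoint c = contourFinder.blobs[i].centroid;
        // bucket the blob centroid into a grid cell and switch that cell on
        int col = ofClamp(c.x / camWidth  * GRID_COLS, 0, GRID_COLS - 1);
        int row = ofClamp(c.y / camHeight * GRID_ROWS, 0, GRID_ROWS - 1);
        cells[col][row] = true;
    }
}
```

A sequencer clock then steps through the columns and plays whatever rows are active; clearing cells again is left out here.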

It’d be neat to get two cameras pointed in the same direction, one with a polarizer and one without, and then you can subtract them to tell where the (properly oriented) LCD screens are. That way the lights can stay on…
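In OF terms that might look something like the sketch below (totally untested, and none of it is from the project): two grabbers, an absolute difference, then a threshold. Device IDs, resolution, and the threshold value are guesses, the grabber calls follow the older OF API, and the two views would also need to be physically aligned, which this skips.

```cpp
#include "ofMain.h"
#include "ofxOpenCv.h"

class PolarizerDiffApp : public ofBaseApp {
public:
    static const int W = 320, H = 240;
    ofVideoGrabber camPlain, camPolarized;      // camPolarized shoots through the filter
    ofxCvColorImage rgbPlain, rgbPolarized;
    ofxCvGrayscaleImage grayPlain, grayPolarized, screens;

    void setup(){
        camPlain.setDeviceID(0);
        camPolarized.setDeviceID(1);
        camPlain.initGrabber(W, H);
        camPolarized.initGrabber(W, H);
        rgbPlain.allocate(W, H);
        rgbPolarized.allocate(W, H);
        grayPlain.allocate(W, H);
        grayPolarized.allocate(W, H);
        screens.allocate(W, H);
    }

    void update(){
        camPlain.update();
        camPolarized.update();
        if(camPlain.isFrameNew() && camPolarized.isFrameNew()){
            rgbPlain.setFromPixels(camPlain.getPixels(), W, H);
            rgbPolarized.setFromPixels(camPolarized.getPixels(), W, H);
            grayPlain = rgbPlain;               // color -> grayscale
            grayPolarized = rgbPolarized;
            // a properly oriented LCD goes dark through the polarizer, so the
            // difference image is bright exactly where the screens are
            screens.absDiff(grayPlain, grayPolarized);
            screens.threshold(60);              // guessed threshold
        }
    }

    void draw(){
        screens.draw(0, 0);                     // white blobs = likely screens
    }
};

int main(){
    ofSetupOpenGL(320, 240, OF_WINDOW);
    ofRunApp(new PolarizerDiffApp());
}
```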

Also, Taeyoon’s idea was to drive the sounds with camera flashes instead, which would be more of a concert-augmentation thing.

Someone emailed me asking for source, so I rewrote the MIDI output into Andy Best’s ofxMidi and very slightly cleaned things up. See it here if you’re curious.
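If you just want to see what the ofxMidi side looks like, the note output boils down to calls like these. This is a minimal sketch rather than code from iSeeBeats, and the exact method names vary a bit between ofxMidi versions:

```cpp
#include "ofMain.h"
#include "ofxMidi.h"

ofxMidiOut midiOut;

// call once from ofApp::setup()
void setupMidi(){
    midiOut.openPort(0);                  // first available MIDI output port
}

// call when the sequencer clock lands on an active cell;
// the row -> pitch mapping here is arbitrary
void stepOn(int row){
    midiOut.sendNoteOn(1, 36 + row, 100); // channel 1, drum-ish pitches starting at C1
}

// call on the following step so the note has some length
void stepOff(int row){
    midiOut.sendNoteOff(1, 36 + row, 0);
}

// call from ofApp::exit()
void closeMidi(){
    midiOut.closePort();
}
```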

I’m very curious, I’ve seen it a bunch of times on Vimeo :)
The link above doesn’t work… is there any chance you could post it again?

Wow, super delayed response – I’m sorry. The source should still be here: http://rpi.edu/~mcdonk/random/iSeeBeats.zip

Also, a student named Jung-Im (???) just sent me a modified version that uses face tracking for pitch: http://kr.blog.yahoo.com/fuke22/168.html?p=1&pm=l&tc=3&tt=1260600401

I’ve been wanting to do this with face tracking for beats instead of thresholding… but haven’t had a chance yet.