The Stranger Installation (+ ofxOpenNIOSC)

I want to share the last experimental installation I just finished, in partnership with Robert Diel. I’m a French designer and digital artist and I spent a few months in Chicago working at Digital Kitchen. I was working closely with Robert (Creative Technologist at DK) and we initiated an experimental project in-house.
Here it is: https://vimeo.com/57136062
and here is an article on The Creators Project: http://thecreatorsproject.com/blog/online-anonymity-is-explored-in-this-creepy-interactive-installation

Here is the system we developed: http://www.flickr.com/photos/92090073@N06/8367028687/

We started by developing a physical-computing system that gives us full-body input inside a defined Kinect depth box to control the sound and the projected visuals. We also wanted the system connected to social networks (Facebook, Twitter) or to our own webapp (PHP/HTML/JavaScript) for live input.
Beforehand, I had already been playing with Synapse by Ryan Challinor, sending Kinect joint information to Ableton Live (or any OSC receiver) to control music tracks or live visuals via Quartz, for example.
The main issue was the level of immersion this installation needed. A giant head that pulls live social information about you (or the keywords you set in the webapp) when you approach, and that constantly stares at you wherever you are in the space, needed rock-solid Kinect tracking. Synapse wasn't good enough.
The head runs live in Unity3D, moving and rotating in space according to the x, y, z position of the head joint of the Kinect skeleton. Since Synapse was too slow and required the user to hold a calibration pose in front of the Kinect before tracking could start, we switched to the experimental branch of OpenNI.
OpenNI has better tracking and, more importantly, does not require a calibration pose: the user enters the detection zone and is immediately recognized, which was a huge plus in terms of immersion. It took a lot of calibration in Unity to have the 3D model actually stare at you wherever you move, but we got it working perfectly; the sketch below shows the kind of mapping involved.
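
For illustration, here is a minimal sketch of that calibration idea, written in openFrameworks terms even though the real head ran in Unity3D. The function names and the offset/scale values are assumptions for this sketch, not the installation's actual code: the point is just that the Kinect's millimetre, sensor-relative head position gets mapped into scene units, and the model is then oriented toward the viewer.

```cpp
#include "ofMain.h"

// Map the Kinect's head position (millimetres, sensor-relative, +Z away
// from the sensor) into scene units. sensorOffset and scale stand in for
// the per-setup calibration values that have to be tuned by hand.
ofVec3f kinectToScene(const ofVec3f& headMM,
                      const ofVec3f& sensorOffset, float scale) {
    return (headMM - sensorOffset) * scale;
}

// Orient the 3D head model toward the viewer's mapped position.
void aimHeadAt(ofNode& headModel, const ofVec3f& viewerPos) {
    headModel.lookAt(viewerPos);
}
```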

openFrameworks pulls the social network statuses and names for the text-to-speech part (ofxOpenNI, ofxJSON, ofxOsc, ofxSpeech), but for a powerful audiovisual experience we wanted Ableton Live for the sound and Unity3D for the visuals.
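
As a rough idea of the fetching side, here is a hedged sketch using ofxJSON; the endpoint URL and the JSON field names ("statuses", "text") are placeholders rather than what the installation actually queried, and the text-to-speech step via ofxSpeech is left out.

```cpp
#include "ofMain.h"
#include "ofxJSON.h"

// Fetch a JSON feed and pull out the status texts. open() is a blocking
// HTTP GET + parse, so in a real app you would run this off the draw loop.
std::vector<std::string> fetchStatuses(const std::string& url) {
    std::vector<std::string> statuses;
    ofxJSONElement json;
    if (!json.open(url)) {
        ofLogError("fetchStatuses") << "could not load " << url;
        return statuses;
    }
    for (Json::ArrayIndex i = 0; i < json["statuses"].size(); ++i) {
        statuses.push_back(json["statuses"][i]["text"].asString());
    }
    return statuses;
}
```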
Robert developed ofxOpenNIOSC (https://github.com/robertdiel/ofxOpenNIOSC), which takes the selected Kinect skeleton info from ofxOpenNI and sends it out via OSC. By setting up an OSC receiver in Ableton (Max for Live) and in Unity3D, we got full Kinect control, all running on the same Mac Pro with no noticeable latency.
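
To give a feel for what the addon does, here is a minimal, simplified sketch of the same idea: read a tracked user's head joint from ofxOpenNI and forward it over OSC with ofxOsc. The port and the OSC address pattern are made up for this sketch, and the real addon handles more joints than just the head; see the repo for the actual code.

```cpp
#include "ofMain.h"
#include "ofxOpenNI.h"
#include "ofxOsc.h"

class ofApp : public ofBaseApp {
public:
    ofxOpenNI kinect;
    ofxOscSender sender;

    void setup() {
        kinect.setup();
        kinect.addDepthGenerator();
        kinect.addUserGenerator();        // skeleton tracking, no pose needed
        kinect.start();
        sender.setup("127.0.0.1", 7110);  // wherever Unity / Max for Live listens
    }

    void update() {
        kinect.update();
        if (kinect.getNumTrackedUsers() == 0) return;

        // Forward the first tracked user's head joint as /skeleton/head x y z.
        ofxOpenNIUser& user = kinect.getTrackedUser(0);
        ofPoint head = user.getJoint(JOINT_HEAD).getWorldPosition();

        ofxOscMessage m;
        m.setAddress("/skeleton/head");
        m.addFloatArg(head.x);
        m.addFloatArg(head.y);
        m.addFloatArg(head.z);
        sender.sendMessage(m);
    }
};
```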
Robert kindly did some extra work, cleaned up his code, and released it as an addon for the oF community, so please try it and give us feedback.

  • Robert is a full-time Creative Technologist at Digital Kitchen Chicago.
  • I was a Creative Apprentice at DK for 9 months, specializing in innovation and physical computing. I'm now back in Europe. You can have a look at my (kind of old now) reel here: https://vimeo.com/35668760
    or get in touch via email: maxenceparache@gmail.com

Thanks everyone, hope you liked the project.


This is very cool, thanks for sharing. Sounds like a very complex system to get all working at the same time, and it's great to have people making cool projects and sharing the code from them as well.