Ref. U+262F Interactive augmented sculpture

We took part in the installation “Ref. U+262F” with the French video-mapping collective Paradigme. We built the interactive part, for which we created an openFrameworks app with the help of a few addons.
Here is a video report:

The virtual water was made of particles that evolve slowly and collide with the environment and the monolith. When people scan their words at the terminals, the same words pop into the virtual water and then drift slowly around the monolith. Using a shader program we gave them a movement close to that of a jellyfish. Once every word had been scanned, different precalculated animations were played on the four sides of the monolith and mixed with the generative water on the ground.
We built an “edit mode” where the artists could visually modify the paths of the words and tweak many parameters.
We used TUIO to know which words were chosen and their positions in space. The app makes intensive use of OSC messages to communicate real-time information to the other computers and to drive the generative music, and the water particles were tweaked in real time depending on people’s choices. The results and the words chosen by the public were stored in an SQLite database for later statistical studies.

Thanks to OF and the addon developers, it was a great pleasure to work with such a good framework!

Addons used: ofxCsv, ofxOsc, ofxSQLiteCpp, ofxSuperLog, ofxSyphon, ofxTimeMeasurements, ofxTuio, ofxUI.

Lozange Lab - http://www.lozange-lab.com

Nice!

Did you use OCR to scan the names? Or did you have a QR code or something on the back of the paper?

(Is Metz big on interactive art now? Wish I knew of your collective a few years ago :slight_smile: )

Thanks :wink:
You lived in Metz a few years ago? It’s not big on interactive art yet, but a few interesting things are starting to happen.
We did not use OCR but QR codes on the back of the paper, yep.