Stop Motion

Stop-Motion

photo installation by Ole Kristensen and Jonas Jongejan.

this installation was initially made in Processing for a large screen on Copenhagen's central square (the screen itself was frighteningly bright and amazing), then went on to New Media Meeting 03, and further to a festival in Copenhagen.

however, we decided to learn openFrameworks, as Java was too slow on memory and disk operations and only offered us 2 GB of memory on our 16 GB Mac Pro.

with openFrameworks we made it run on a little white plastic MacBook.

http://3xw.ole.kristensen.name/works/stop-motion
has videos and source code

DSLR tethering blues
as we wanted to capture the images from a DSLR, there were a lot of technical difficulties with the different tethering peculiarities of Canon and Nikon. none of the big brands seemed to fully support the industry-standard tethering protocol (PTP); they seem to focus their firmware on downloading batches of images. remote capture was quite a maze.

we ended up hacking something together by scripting the capture in AppleScript, triggering their proprietary software to capture and download a file into a folder. that folder had a folder action (Apple stuff) attached to scale the images and copy the scaled-down version (1:1 pixels with the screen resolution) into the oF data folder.

tricky little thing. but it worked.
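For reference, here is a minimal sketch of how the oF end of such a pipeline could pick up whatever the folder action drops into the data folder. The folder name “captures”, the one-second polling interval and the “newest file wins” logic are assumptions for illustration, not what the installation actually did:

```cpp
#include "ofMain.h"

// Polls bin/data/captures for JPEGs dropped there by the folder action
// and always shows the most recently added one.
class ofApp : public ofBaseApp {
public:
    ofImage latest;
    std::string lastLoaded;
    float lastCheck = 0;

    void update() override {
        // only hit the filesystem once a second
        if (ofGetElapsedTimef() - lastCheck < 1.0f) return;
        lastCheck = ofGetElapsedTimef();

        ofDirectory dir("captures");   // relative to bin/data
        dir.allowExt("jpg");
        dir.listDir();
        dir.sort();                    // capture filenames sort chronologically
        if (dir.size() == 0) return;

        std::string newest = dir.getPath(dir.size() - 1);
        if (newest != lastLoaded && latest.load(newest)) {
            lastLoaded = newest;       // only reload when a new file shows up
        }
    }

    void draw() override {
        if (latest.isAllocated()) {
            latest.draw(0, 0, ofGetWidth(), ofGetHeight());
        }
    }
};

int main() {
    ofSetupOpenGL(1024, 768, OF_WINDOW);
    ofRunApp(new ofApp());
}
```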

This is wonderful! For two years now I’ve been wanting to do a project that would be like a mashup between this and ocean_v1. Instead of a light, it would track your face position + expression. Also, I want to separate the participants not just in time but in space: two locations would be networked and you would see someone from the other location. Position tracking is really easy, but expression tracking has lots of proprietary techniques wrapped up in it… so this project has been on hold :frowning:
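As a rough illustration of the “position tracking is really easy” part: face position and size can be pulled from a webcam with ofxOpenCv’s haar finder. The cascade file name is the one shipped with the stock opencvHaarFinderExample; everything else here is just a sketch, not any particular project's code:

```cpp
#include "ofMain.h"
#include "ofxOpenCv.h"

// Detects frontal faces in a webcam feed and draws their bounding boxes,
// which already gives an x/y position and a rough size per face.
class ofApp : public ofBaseApp {
public:
    ofVideoGrabber grabber;
    ofxCvColorImage color;
    ofxCvGrayscaleImage gray;
    ofxCvHaarFinder finder;

    void setup() override {
        grabber.setup(640, 480);
        color.allocate(640, 480);
        gray.allocate(640, 480);
        finder.setup("haarcascade_frontalface_default.xml");  // in bin/data
    }

    void update() override {
        grabber.update();
        if (grabber.isFrameNew()) {
            color.setFromPixels(grabber.getPixels());
            gray = color;                  // grayscale conversion
            finder.findHaarObjects(gray);  // face detection
        }
    }

    void draw() override {
        grabber.draw(0, 0);
        ofNoFill();
        for (auto& blob : finder.blobs) {
            ofDrawRectangle(blob.boundingRect);  // face position + size
        }
    }
};

int main() {
    ofSetupOpenGL(640, 480, OF_WINDOW);
    ofRunApp(new ofApp());
}
```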

You mention you use 16 GB of memory. How many images were you working with? Do you have a smart search algorithm (e.g., a quadtree) or do you just do a linear scan?
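On the search question: for a few thousand tagged frames a plain linear scan is usually fast enough at interactive rates, so a quadtree only pays off well beyond that. A hypothetical record and nearest-neighbour lookup (the fields are illustrative; the thread doesn't describe the actual matching criterion) could look like this:

```cpp
#include <cstddef>
#include <limits>
#include <string>
#include <vector>

// One stored capture, tagged with the tracked position at capture time.
struct TaggedFrame {
    float x, y;              // e.g. light (or face) position, normalized
    std::string imagePath;   // frame on disk
};

// Linear scan: O(n) per lookup. Squared distance avoids the sqrt.
// Assumes frames is non-empty.
std::size_t findNearest(const std::vector<TaggedFrame>& frames, float qx, float qy) {
    std::size_t best = 0;
    float bestDist = std::numeric_limits<float>::max();
    for (std::size_t i = 0; i < frames.size(); ++i) {
        float dx = frames[i].x - qx;
        float dy = frames[i].y - qy;
        float d = dx * dx + dy * dy;
        if (d < bestDist) { bestDist = d; best = i; }
    }
    return best;
}
```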

Hi Kyle

Thanks for the response.

We have actually also been talking about the ideas you mention… Tracking a face, or a full body… But it was too complicated for the original setup.

With regard to the RAM: the problem was that Processing was very slow at loading a JPG directly from the hard drive, and 2 GB of uncompressed images is not very much. But this is no problem in oF, since the image loader is much, much faster, so it now runs directly from the hard drive.

Hey, that’s great to hear that other people are thinking of this kind of setup! If we can work out the logistics of a co-located version, it would be great to do something between Denmark and the US :slight_smile:

Hey nice project!

I am very curious about the tethering issues - did you try using the Canon SDK at all?

I was thinking of getting a DSLR for doing remote triggering and assumed Canon would be the way to go with the C++ API.

Also the unofficial CHDK could be useful as it adds a lot of functionality to Canon cameras. http://chdk.wikia.com/wiki/CHDK-in-Brief

Theo

Hi Theo

I haven’t found any official Canon Mac SDK?! Only a stupid little remote control app that we could remote-control with a keystroke and an AppleScript… But my Canon G9 camera does receive standard PTP commands… But when I send a capture signal to the camera, it has to extend the lens, take the picture, and retract the lens again… A very long process… And the app from Canon can leave the lens out… So I think it’s some undocumented PTP commands Canon uses themselves, but I can’t find them…
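For what it’s worth, an open route around the proprietary tethering apps is libgphoto2, which speaks PTP directly to many cameras (whether it drives a particular Canon’s lens sanely is another question). A bare-bones capture-and-download sketch, not what was used here, might look like this:

```cpp
#include <gphoto2/gphoto2.h>
#include <cstdio>

// Trigger the shutter over USB/PTP and download the resulting file.
// Error handling is trimmed to the bare minimum.
int main() {
    GPContext* ctx = gp_context_new();
    Camera* cam = nullptr;
    gp_camera_new(&cam);

    if (gp_camera_init(cam, ctx) < GP_OK) {
        std::fprintf(stderr, "no PTP camera found\n");
        return 1;
    }

    // Take the picture; the camera reports where it stored the file.
    CameraFilePath path;
    gp_camera_capture(cam, GP_CAPTURE_IMAGE, &path, ctx);

    // Pull the file off the camera and save it locally.
    CameraFile* file = nullptr;
    gp_file_new(&file);
    gp_camera_file_get(cam, path.folder, path.name, GP_FILE_TYPE_NORMAL, file, ctx);
    gp_file_save(file, "capture.jpg");

    gp_file_free(file);
    gp_camera_exit(cam, ctx);
    gp_camera_free(cam);
    return 0;
}
```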

The alternative firmware seems very interesting! Thanks for the link.

Here they list Mac SDKs for both EOS and PowerShot cameras - http://www.usa.canon.com/consumer/contr-…-3464#SDKQ1

Not sure if they are the same as the Windows ones - but it looks like they are cross-platform. I am pretty sure that you get much better control with the SDKs.

Theo

Ahh yes, I found that page some time ago too… But as a European I need to register, and they have never replied to my request…

Also check out PS Remote; it has a DLL you can interface with using C++ that lets you control certain cameras. I have used it with Processing but not oF yet.

http://www.breezesys.com/PSRemote/index.htm

hey kyle

marvellous, i’ve had the same idea for some years - the stop motion was a take on simplifying it. and it turned out quite cute with the light bulb (going to be outlawed soon, though :D)

expression-based emotion tracking is not impossible, but so far it cannot be done using commodity hardware, as you need a good hardware eyetracker. as a matter of fact, one of my friends has developed such a system commercially - http://www.imotions.dk/ (of course he had to finance the development by selling it to advertisers :confused: - but his starting point is far more interesting: he wants to use it for art and storytelling, interactive movies etc.)

however, i guess that the eyetracker is mostly for pinpointing gaze direction, and might not be needed for facial expressions.

and - as always - proprietary tech makes me tired at the mere thought of it. that’s how i’m wired.

i’m quite busy these days, but i guess a good starting point would be to at least get these properties: face x,y position, size, z rotation, perhaps eye locations.

how to sync images over the network and whether to set up a database server would be for later - i’d love it to be something that could work in a peer-to-peer-like fashion, with newly captured images (tagged with their properties) getting exchanged without the need for a server for anything other than a list of client IPs and the occasional proxying. it would be an interesting installation to set up, especially if it could run in more than two locations.
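To make the peer-to-peer idea a bit more concrete: the tag on each capture could be as small as the record below, announced to peers over something like OSC (ofxOsc). Every field name, address and port here is invented for illustration, and the image files themselves would still need their own transfer channel:

```cpp
#include "ofMain.h"
#include "ofxOsc.h"

// The per-capture properties listed above, plus the file name a peer
// would use to fetch the actual image.
struct CaptureTag {
    float faceX, faceY;     // face centre, normalized coordinates
    float faceSize;         // rough scale of the face
    float zRotation;        // in-plane head rotation, degrees
    float leftEyeX, leftEyeY, rightEyeX, rightEyeY;
    std::string fileName;   // frame on disk
};

// Announce a new capture to one peer.
void announceCapture(ofxOscSender& peer, const CaptureTag& tag) {
    ofxOscMessage m;
    m.setAddress("/stopmotion/capture");
    m.addStringArg(tag.fileName);
    m.addFloatArg(tag.faceX);
    m.addFloatArg(tag.faceY);
    m.addFloatArg(tag.faceSize);
    m.addFloatArg(tag.zRotation);
    m.addFloatArg(tag.leftEyeX);
    m.addFloatArg(tag.leftEyeY);
    m.addFloatArg(tag.rightEyeX);
    m.addFloatArg(tag.rightEyeY);
    peer.sendMessage(m);
}

// usage:
//   ofxOscSender peer;
//   peer.setup("192.168.1.20", 9000);   // peer IP from the shared client list
//   announceCapture(peer, tag);
```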

best / ole

Hey Ole, this is exciting to see even more people thinking about this :slight_smile:

There is some great research at the school I attend involving stereo-camera eyetracking. I’ve tried the demo, and it’s incredibly accurate - like +/- 5 pixels of jitter.

I don’t think eye tracking is as important, though. I think the two things that are most important are 1) tracking general orientation (position, rotation) and expression (which is also happening at my school - I’m working with these people right now), and 2) eye-to-eye capture + display (i.e., the person interacting needs to be able to look directly into the eyes of the person displayed). There are some easy ways to get eye-to-eye that keep the screen at a distance from the person interacting… but finger-to-finger would be even better.

Ha - I just checked out the ocean_v1 project - it’s nice!

The project I am working on is actually quite similar: combining DSLR and emotion/expression tracking - no networking involved though, just a single installation. I am waiting to hear back about the status of funding, but if it goes through maybe it would be nice to collaborate on it, Kyle/Ole?

Theo

Theo, that would be great. I’d be curious to read your proposal if you could email it to me or post it.

The expression tracking stuff happening at my school isn’t very open, unfortunately. I tried to just get some libraries from them, and they were super worried I was going to go make a mass-distributed commercial product. Right now they want me to send them some data for pre-processing, and prove that I can do something interesting non-real-time, before they even let me use the library… but their papers are all publicly available, so theoretically it can be recreated.

sure will do!

Actually, the people I am working with know someone in Amsterdam who has developed an emotion recognition library - they’re going to get in touch and see if they would be interested in working on the project / sharing the code.

the URL is:
staff.science.uva.nl/~gevers/

Could be interesting - more emotions than expressions though.

Theo

This sounds like a really great installation!

I know a guy who has just moved from Copenhagen to Rotterdam in the Netherlands, who is also an “arty guy” and wrote his master’s thesis about tracking a face in 3D (finding out how the head is rotated, etc.)…

@ole, @jonas, looks great !!! congratulations!

@theo, have you seen this –

http://www.christian-moeller.com/displa-…-ject-id=36

it’s a pretty funny use of emotion recognition (smile detection, really) + performance.

take care!
zach

ahh cool - yeah, I forgot about that piece - it’s really awesome & cheesy!

[quote author=“halfdanj”]This sounds like a really great installation!

I know a guy who has just moved from Copenhagen to Rotterdam in the Netherlands, who is also an “arty guy” and wrote his master’s thesis about tracking a face in 3D (finding out how the head is rotated, etc.)…[/quote]

Ohh cool, that sounds really interesting - it might be nice to meet him, as I am in Amsterdam but teach in Rotterdam once a week.

Thanks!
Theo

This is some crazy software:

http://www.seeingmachines.com/faceAPI.html

Hey Vanderlin

I had a chat with the guys from Seeing Machines, and the API could be integrated with oF. We are working on a project that requires face tracking, so we may give it a go. Just PC though :frowning:
They are planning a Mac version, but it sounded like that was not a short-term plan.

About controlling a DSLR from a computer: we tried a different solution with pretty good results. We used the Eye-Fi card and transmitted the photos wirelessly to a folder on our computer, then picked them up and manipulated them live. It is not quite immediate, but the photos were arriving at the computer within seconds of being taken.
Also, if you have a good network, you are free to move around with the camera.