Interactive facial projection mapping

Hi, I showed a facial projection mapping demo at IVRC, a virtual reality competition in Tokyo, over four days, and it was tried by more than 100 attendees. I used openFrameworks for marker tracking and projection: IR LEDs on the goggles are tracked by a Kinect, and a calibrated projector maps a texture onto the face that is drawn on an iPad.
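For anyone curious about the tracking side, the core idea is to threshold the IR frame (where the LEDs show up as bright spots) and take blob centroids. The demo does this through openFrameworks; the sketch below is a dependency-free version of the same idea, and the threshold and minimum-area values are illustrative assumptions, not the demo's actual settings.

```cpp
#include <cstdint>
#include <utility>
#include <vector>

// Bright-blob tracking sketch: threshold a grayscale IR frame,
// flood-fill each connected component of bright pixels, and return
// the blob centroids (the LED marker positions).
struct Blob { float cx, cy; int area; };

std::vector<Blob> findBrightBlobs(const std::vector<uint8_t>& img,
                                  int w, int h,
                                  uint8_t threshold = 200,
                                  int minArea = 4) {
    std::vector<Blob> blobs;
    std::vector<bool> visited(img.size(), false);
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            int i = y * w + x;
            if (visited[i] || img[i] < threshold) continue;
            // flood-fill this component, accumulating the centroid
            long sx = 0, sy = 0;
            int area = 0;
            std::vector<std::pair<int,int>> stack{{x, y}};
            visited[i] = true;
            while (!stack.empty()) {
                auto [px, py] = stack.back();
                stack.pop_back();
                sx += px; sy += py; ++area;
                const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
                for (int k = 0; k < 4; ++k) {
                    int nx = px + dx[k], ny = py + dy[k];
                    if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
                    int j = ny * w + nx;
                    if (!visited[j] && img[j] >= threshold) {
                        visited[j] = true;
                        stack.emplace_back(nx, ny);
                    }
                }
            }
            if (area >= minArea)
                blobs.push_back({float(sx) / area, float(sy) / area, area});
        }
    }
    return blobs;
}
```

With several LEDs mounted on the goggle frame, you would then match the returned centroids against the known marker constellation to recover the head pose.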

It’s open source but not really usable as-is, so I’ll try to clean it up and turn it into an addon some day.



this looks great. have you thought about tracking with ofxFaceTracker on the IR image instead? that way you wouldn’t need the goggles. the problem is that the kinect’s IR image is not calibrated/registered to the depth image the way the color one is, so calibration could be difficult.

Thanks! I’m aware of Kyle’s addon, and the next step should be to use it or the Microsoft SDK’s face tracker. I’ve already solved the IR registration problem, but it hasn’t been merged yet:

that’s a great feature. but does it support aligning the IR to the depth, or just each of them to color? if you could get the real world coordinate of a point in the IR image, that would make everything much simpler.

Specifically, I made a function that converts an IR or depth coordinate into a color coordinate; for performance reasons, registration of the whole IR image is disabled. For this to work, the depth image must be left unregistered, i.e., aligned to the IR image. Note that the function needs not only the (x, y) position in the IR image but also the corresponding depth value.

Btw, FaceTracker probably doesn’t work when something is projected on the face.

you can use the IR image to avoid feedback from the projection. the IR image is a bit speckled and lacks contrast, so it’s not as good as face tracking on the RGB image, but it works to some extent.


I tried face detection on an IR image, which actually works pretty well:

We really need to merge IR->color image coordinate functions to exploit this.
