Kinect vs IR camera in theatrical performances

Hi everybody,

I used IR cameras back in 2010, then switched to the Kinect, which I’ve used until today. I have a proposal for a theatrical performance where I need to track dancers from above, that is, only blob tracking, so the stage becomes an interactive floor. I was thinking of using the Kinect because I’ve made many interactive installations with it, but reading this post

I’ve found that it may not be the best solution. An IR camera can probably do better in this particular situation.



Kinect pros:
  1. it doesn’t require any lighting,
  2. no problems with other lights on stage,
  3. just set near/far and blob detection works like a charm.
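
The "set near/far" point above can be sketched in a few lines: keep only depth values inside a [near, far] band and everything else becomes background. This is a minimal illustration with a synthetic depth frame in millimetres (the values and frame size are made up); with a real Kinect you would read the depth frame from the driver instead.

```python
def near_far_mask(depth, near_mm, far_mm):
    """Return a binary mask: 1 where depth is inside the band, else 0."""
    return [[1 if near_mm <= d <= far_mm else 0 for d in row]
            for row in depth]

# Synthetic overhead depth frame: floor at ~4 m, a dancer at ~2.5 m.
depth_frame = [
    [4000, 4000, 4000, 4000],
    [4000, 2500, 2600, 4000],
    [4000, 2550, 4000, 4000],
]

mask = near_far_mask(depth_frame, near_mm=2000, far_mm=3000)
for row in mask:
    print(row)
# the dancer pixels come out as 1, the floor as 0
```

The resulting binary mask is exactly what a blob-detection step consumes, which is why this setup "works like a charm" with a depth camera.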


Kinect cons:
  1. only 30 fps,
  2. limited size of the interactive area on stage because FOV is not very wide.
  3. video signal is hard/expensive to extend from the stage to the theatre control room.



IR camera pros:
  1. can reach 60 fps and more -> it can follow the dancers’ moves
  2. can take a very wide-angle lens -> it can cover the entire stage without requiring a multiple-camera system
  3. can send the signal over Ethernet -> the video signal can cover any distance


IR camera cons:
  1. very sensitive to lighting conditions
  2. the stage must be IR-illuminated
  3. the stage must appear almost “black” and the bodies almost “white”, otherwise no blob tracking.

So, which is the best solution?



I did two interactive dance shows/choreographies.
I used the Kinect, but I had the downsides in mind from the start.

extra cons:
a) the Kinect must be close to the dancers (but it has a short cable)
b) (under Windows) it can get stuck at boot if the app did not quit properly the last time; then you need to unplug and replug the USB on reboot

I never tried an IR camera, mainly due to time limitations.

I am a big fan of GigE cameras with IR. Yes, lighting is important, but the resolution available (look at the 10GigE versions) and the reliability of the system are great. You can change lenses, get precise control of the camera, feed power over the Cat cable, and use filters. I use an IR-pass filter so that I am not only getting IR frequencies but also blocking all non-IR frequencies. This makes the lighting easier, but yes, it is still work. If you get lights that can be controlled via DMX, it is great to integrate the tracking lighting with the normal lights. It is easy to calibrate with a calibration scene: cameras on, then save a scene on the lighting desk. You can then control the desk from your application.
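
The calibration-scene idea above can be sketched in data terms, assuming a DMX512-style setup (one universe of 512 channels, each 0-255). The channel numbers below are invented for illustration; real output would go through a USB-DMX or Art-Net interface, or stay on the lighting desk as described.

```python
TRACKING_IR_LIGHT = 10   # hypothetical DMX channel of an IR flood light
HOUSE_LIGHT = 21         # hypothetical channel of a normal stage light

def blank_universe():
    """One DMX universe: 512 channels, all at zero."""
    return [0] * 512

# Calibration scene: only the tracking IR light on, house lights off,
# so the camera sees a clean, repeatable image.
calibration_scene = blank_universe()
calibration_scene[TRACKING_IR_LIGHT] = 255

# Show scene: IR stays at full for tracking, house lights come up
# for the audience (LED house lights won't disturb the IR camera).
show_scene = blank_universe()
show_scene[TRACKING_IR_LIGHT] = 255
show_scene[HOUSE_LIGHT] = 180

print(calibration_scene[TRACKING_IR_LIGHT], show_scene[HOUSE_LIGHT])
```

The point of storing scenes this way is that recalling the calibration scene from your application gives you the exact lighting state the tracking was tuned under, independent of whatever the show lighting is doing.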

As for the lighting: since you only need contrast between the dancers and the floor, you have a lot of options, and the costumes and costume materials will also be important to test. A white floor with very low-angle controlled IR can be a nice way to go.

Thanks fresla,

it’s almost all clear. I have a black floor and almost naked dancers, so they should create pretty good contrast with the floor. In that case, what about placing IR lamps on the ceiling and playing with the black background? Does “low-angle controlled IR” mean that I have to place lights around the border of the stage and try to light up the dancers without lighting the floor?



Take into account that what looks black to the eye could be totally white to an IR-filtered camera, but lighting the way you mention, trying to keep the floor in the dark, should be enough.

As many mentioned, I also agree: my first attempt would be to light up the dancers with IR lamps from floor level, keeping the floor in the dark.
To set up this lighting you can use theatre lights: you can “cut” the light beam as desired, and stack blue and red filters to make an IR-pass filter.
If you want to light up the stage for the audience to see the dancers, you could use LED lighting, so it does not interfere with the IR.

You cannot believe it, but I’m just finishing a piece in mostly the same conditions (probably even worse, as the floor was reflecting IR, jamming my camera). I built a device that is integrated into an addon I’m developing, ofxClayblocks:

I tried with the Kinect, but the stage to track was too large (8x8 m), so I ended up using a Raspberry Pi with a Pimoroni fisheye camera (160 degrees).
I had to make the IR filter by hand by cutting two congo blue gels and one primary red and putting them inside the camera, in front of the sensor. (The gel filters were something like this —> )

The show was on a platform covered with white plastic flooring that was reflecting IR, so we had to place the lights to illuminate the dancers from the side; we did it with six of these:

The fact that your dancers are almost naked could be a problem, as the chemicals used to dye clothes are mostly IR-reflective. If you have to buy an IR illuminator, always buy the ones with big diffused LEDs; the ones with many little LEDs just make spots instead of flooding with light (we had to send ours back).

All the background subtraction and blob detection is done on the Raspberry Pi and sent as OSC messages to my computer. There is a client class in the repo that manages all the code for receiving data and exposes a public blobs variable containing the received blobs. The documentation and pictures for the setup still have to be added to the repo, and the defisheye calibration could be done better, but I’m very busy finishing this thing at the moment, so check again in a while.
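
The core of the pipeline described above (binary mask in, blob list out) can be sketched generically; this is not the ofxClayblocks code, just a minimal 4-connected component labeling that returns blob centroids, the kind of values you would then pack into OSC messages for the client.

```python
from collections import deque

def find_blobs(mask):
    """4-connected component labeling on a binary mask.
    Returns a list of blob centroids as (x, y) tuples."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                # Flood-fill this blob with a BFS.
                queue = deque([(x, y)])
                seen[y][x] = True
                pixels = []
                while queue:
                    cx, cy = queue.popleft()
                    pixels.append((cx, cy))
                    for nx, ny in ((cx+1, cy), (cx-1, cy),
                                   (cx, cy+1), (cx, cy-1)):
                        if (0 <= nx < w and 0 <= ny < h
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((nx, ny))
                n = len(pixels)
                blobs.append((sum(p[0] for p in pixels) / n,
                              sum(p[1] for p in pixels) / n))
    return blobs

mask = [
    [0, 1, 1, 0, 0],
    [0, 1, 1, 0, 0],
    [0, 0, 0, 0, 1],
]
print(find_blobs(mask))  # → [(1.5, 0.5), (4.0, 2.0)]
```

In a real setup you would also filter out blobs below a minimum pixel count to reject noise before sending anything over the network.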


Thanks to npisanti and all the others. It looks like the main problems with the Kinect are its limited interactive area and the need to hide the camera in the ceiling of the stage, which can be several meters high.

At the same time, you need a system that can be adapted to different kinds of stages and lights; otherwise the show cannot go on tour.

Playing with the infrared light and its reaction to the dancers’ costumes and the floor surface could probably be a good starting point for a robust and portable interaction system.

For materials/stuff you can also check this thread on the vvvv forum

We also tried reflective safety tape, as advised in that thread, but it works well only when the illumination comes from the same position and direction as the camera; we needed the IR lights from the side, so we couldn’t use it.

I’ve never given it a try, but SparkFun has this kind of Kinect-like camera with quite good specs

in the specs:

Depth Working Range : 20cm - 350cm

It seems a really good camera with high fps for near interactions, but the depth range is even shorter than a Kinect v1’s, so it won’t be good for a large stage.

(Anyway, thanks for sharing, it could be useful for other things!)

Thanks for sharing this new device. It looks very interesting. I found this video about it

I do lots of blob tracking, and IMHO what also makes the difference is the quality of the depth frames (noise, black areas, …). There are many new devices on the market, and it would be very useful if people just shared some depth images or depth movies. That way, anybody could test a camera with their own software and see which one is better… This already happens with reflex cameras and similar…
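
One practical way to share depth frames, sketched here as an assumption rather than an established convention on this forum, is a binary 16-bit PGM: it is trivial to write, and most image tools and libraries can load it, so anyone can run their own blob tracking on your data. Depth values are assumed to be in millimetres.

```python
import struct

def save_depth_pgm(path, depth, maxval=65535):
    """Save a depth frame (list of rows of ints, e.g. millimetres)
    as a binary 16-bit PGM (P5). Per the PGM format, samples with
    maxval > 255 are two bytes each, most significant byte first."""
    h, w = len(depth), len(depth[0])
    with open(path, "wb") as f:
        f.write(b"P5\n%d %d\n%d\n" % (w, h, maxval))
        for row in depth:
            f.write(struct.pack(">%dH" % w, *row))  # big-endian 16-bit

# Tiny synthetic example: floor at 4000 mm, one closer pixel.
depth_frame = [[4000, 4000],
               [4000, 2500]]
save_depth_pgm("depth_sample.pgm", depth_frame)
```

A short clip is just many of these frames; zipping a folder of numbered PGMs keeps the raw 16-bit values intact, unlike a compressed video, which would destroy exactly the noise and black areas you want people to evaluate.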