I want to do a little experiment where I display objects (say, a simple 3D box) overlaid on the Kinect RGB camera image (augmented-reality style), and then check for collisions with the tracked hand (OpenNI), for example to check whether my hand is inside the displayed 3D box.
I guess the first thing to do is to create an ofCamera with the same FOV as the Kinect RGB camera. Then I would have to find a way to convert the projected coordinates of the 3D box to the real-world coordinates of the tracked hand (you can get them in millimeters from OpenNI), or vice versa.
Maybe someone has already done something like that and can help me find the easiest way to do it (especially the coordinate conversion part); I'm quite new to 3D and oF.
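My rough idea for the conversion would be a simple pinhole projection from OpenNI's real-world millimeters to image pixels. This is just a sketch, and the intrinsics below (fx, fy, cx, cy) are assumed nominal Kinect RGB values for a 640x480 image, not calibrated ones; in practice I guess one would use OpenNI's ConvertRealWorldToProjective() or a proper calibration instead:

```cpp
#include <cmath>

// Illustrative pinhole projection: maps an OpenNI real-world point
// (millimeters, camera-centered, x right / y up / z forward) to pixel
// coordinates. Intrinsics are assumed nominal values, NOT a calibration.
struct Pixel { float u, v; };

Pixel worldToPixel(float x, float y, float z) {
    const float fx = 525.0f, fy = 525.0f;  // assumed focal lengths (pixels)
    const float cx = 319.5f, cy = 239.5f;  // assumed principal point (640x480)
    // OpenNI's y axis points up while the image v axis points down,
    // hence the minus sign on the y term.
    return { cx + fx * x / z, cy - fy * y / z };
}
```

With that, a point 1 m straight ahead of the camera should land on the image center, and points to the right of the optical axis should land right of center.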
Collision detection for a box seems quite simple (check against the minimum and maximum values for x, y, z), but for more complex models, would I have to code some kind of bounding-box check? Does anyone have openFrameworks instructions or tutorials on that topic? Or is there already a library capable of doing those checks?
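For the simple box case, something like this min/max containment test is what I have in mind (assuming the hand position and the box corners are already in the same coordinate space, e.g. both in millimeters; the struct and names are just illustrative):

```cpp
// Minimal axis-aligned bounding-box (AABB) containment sketch.
// Assumes the point and the box live in the same coordinate space.
struct AABB {
    float minX, minY, minZ;
    float maxX, maxY, maxZ;

    // True if (x, y, z) lies inside or on the surface of the box.
    bool contains(float x, float y, float z) const {
        return x >= minX && x <= maxX &&
               y >= minY && y <= maxY &&
               z >= minZ && z <= maxZ;
    }
};
```

My understanding is that for complex meshes one would test against a bounding box like this first, and only fall back to a finer per-triangle test if needed, but I'd love confirmation on that.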
Thanks in advance!