ofxKinectv2: detect handstate (open/closed) sideways

Hi,

I’m making a game for a children’s event where the user can control a bird puppet, like this: [video]

Calculating the joint angle between the shoulder, elbow and wrist is no problem, but detecting whether the hand is closed or open (to control the bird’s mouth) fails.

I use player.joints[WristRight].getPosition(), player.joints[HandTipRight].getPosition() and player.joints[ThumbRight].getPosition() to calculate the angle, but the ThumbRight position seems to be quite inaccurate; sometimes the Kinect even thinks ThumbRight is above the HandTip, so the mouth opens and closes constantly when it shouldn’t. I also tried measuring the distance between WristRight and ThumbRight, with no luck. I’ve tried several Kinect V2 units; the thumb position is just nowhere near as accurate as in the video above.
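For reference, this is roughly what the angle test looks like (just a sketch; `player` and the joint names follow my own code, and the threshold is a placeholder):

```cpp
#include "ofMain.h"

// Angle at the wrist between the vectors to the hand tip and to the thumb.
// The idea: a closed fist gives a small angle, an open hand a larger one.
float handAngleDeg(const ofVec3f& wrist, const ofVec3f& tip, const ofVec3f& thumb) {
    ofVec3f toTip   = (tip   - wrist).getNormalized();
    ofVec3f toThumb = (thumb - wrist).getNormalized();
    float d = ofClamp(toTip.dot(toThumb), -1.0f, 1.0f); // guard acos against rounding
    return ofRadToDeg(acosf(d));
}

// Called every frame, e.g. in ofApp::update():
// float a = handAngleDeg(player.joints[WristRight].getPosition(),
//                        player.joints[HandTipRight].getPosition(),
//                        player.joints[ThumbRight].getPosition());
// bool mouthOpen = a > 25.0f;  // flips constantly because ThumbRight is so noisy
```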

Could anyone help me out on this one? I’m currently using the HandState (OPEN/CLOSED) to detect whether the hand is closed or open, but it doesn’t really work when the Kinect is placed sideways, as seen in the video.

Thanks in advance! I hope I can sort this out so I can offer the children a nice game to play :smile:

Bump!

My understanding is that Kinect V2 skeletal tracking only finds the thumb and one other fingertip on a hand. It is part of body tracking, not full hand tracking. To achieve what you want, you would need to roll your own hand detection with OpenCV (something like this: https://www.youtube.com/watch?v=NeHX5jzHFM4) or use a Leap Motion.
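For the OpenCV route, the usual trick is contour solidity: segment the hand from the depth image (e.g. threshold around the hand joint’s depth) and compare the contour area against its convex hull. This is only a sketch of that idea, not the method from the video, and the 0.85 threshold is a guess you would need to tune:

```cpp
#include <algorithm>
#include <vector>
#include <opencv2/opencv.hpp>

// Classify open vs. closed from a binary mask of the hand region.
// Spread fingers leave gaps between the contour and its convex hull,
// so an open hand has noticeably lower solidity than a fist.
bool isHandOpen(const cv::Mat& handMask) {
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(handMask.clone(), contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    if (contours.empty()) return false;

    // Take the largest blob as the hand.
    const auto& hand = *std::max_element(contours.begin(), contours.end(),
        [](const std::vector<cv::Point>& a, const std::vector<cv::Point>& b) {
            return cv::contourArea(a) < cv::contourArea(b);
        });

    std::vector<cv::Point> hull;
    cv::convexHull(hand, hull);

    double solidity = cv::contourArea(hand) / std::max(cv::contourArea(hull), 1.0);
    return solidity < 0.85; // lower solidity -> fingers spread -> open hand
}
```

Because this works on the depth image directly, it doesn’t care whether the sensor is mounted sideways, as long as the hand blob can be segmented cleanly.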