Thanks rsodre & ascorbin!!
Ok so I’ve made a bunch more changes, including merging rsodre’s and ascorbin’s gesture/hand tracking work…
To begin with:
* I’ve successfully recompiled the OpenNI, SensorKinect and NITE dylibs so that their external dependencies on libJPEG and tinyXML are now compiled in -> this means **no more conflicts with ofxXMLSettings!!!** I compiled against the latest unstable branches of the OpenNI, NITE and avin’s SensorKinect git repositories…I also made libusb.dylib portable, so just about any Mac should be able to run our apps with no driver installs…it would be great if those trying to get this working on Linux could try these new drivers (or did you guys have to compile others?)
@hv_francesco: you shouldn’t need to comment out any code — the toggleRegisterViewpoint function is called from inside testApp -> I’ve added an ‘r’ key for toggling registration, so perhaps try that a few times and report back -> it seems strange that it should interfere with user detection…hmmm
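For anyone wanting to see what that ‘r’ toggle amounts to, here’s a minimal sketch — the `DepthContext` stand-in and its `registered` flag are hypothetical; in the addon the real handler lives in `testApp::keyPressed()` and calls `toggleRegisterViewpoint()` on the context:

```cpp
#include <cassert>

// Stand-in for the OpenNI context object so the toggle logic can be shown in
// isolation (the real call goes through the addon's context wrapper).
struct DepthContext {
    bool registered = false;                        // depth<->RGB viewpoint registration on/off
    void toggleRegisterViewpoint() { registered = !registered; }
};

// keyPressed-style handler: pressing 'r' flips registration on and off.
void keyPressed(int key, DepthContext& ctx) {
    if (key == 'r') ctx.toggleRegisterViewpoint();
}
```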
And then:
* It seems Roger and I were working in parallel on incorporating ascorbin’s hand tracking example…I realised halfway through that an ofxGestureGenerator class should really only be about recognising gestures and NOT about hand tracking (which requires a focus gesture to trigger the beginning of tracking), so I started on the abstraction/separation…I’m not sure which way is better…so
…for the purposes of testing I now have three different (working) versions of ofxopenNI on my git repository:
master : no gesture or hand tracking (just the newly compiled libs)
develop : merge of our methods for ofxHandGenerator + multiple hands working (no events for gestures)
experimental : merge and then separation into ofxHandGenerator and ofxGestureGenerator (using ofEvents) + multiple hands working + more smoothing options (both OpenNI’s and custom)
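To give a feel for what a “custom” smoothing option can look like: below is a sketch assuming a simple exponential moving average on the tracked hand position — the `SmoothedPoint` name and the exact filter are my illustration, not necessarily what ofxHandGenerator ships with:

```cpp
#include <cassert>
#include <cmath>

// Exponential-moving-average smoothing of a tracked 3D hand position.
// smoothFactor in (0,1]: 1.0 = no smoothing, smaller values = heavier lag.
struct SmoothedPoint {
    float x = 0, y = 0, z = 0;
    bool  initialised = false;

    void update(float nx, float ny, float nz, float smoothFactor) {
        if (!initialised) {            // snap straight to the first reported position
            x = nx; y = ny; z = nz;
            initialised = true;
            return;
        }
        x += smoothFactor * (nx - x);  // lerp toward the new raw position
        y += smoothFactor * (ny - y);
        z += smoothFactor * (nz - z);
    }
};
```

OpenNI also exposes its own hand-point smoothing, so the addon can offer both and let you pick per-app.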
EDIT: I’ve merged across branches now -> using the experimental approach above, with separation of gestures and hand tracking, and added motor, LED and accelerometer control as per below…
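The gesture/hand split above boils down to an event wiring like this — a minimal sketch where `std::function` stands in for the real ofEvent<> plumbing, and the class/member names are stand-ins, not the addon’s actual API:

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <vector>

// Gesture side: only recognises gestures and notifies listeners.
struct GestureGenerator {
    std::function<void(const std::string&)> gestureRecognized; // ofEvent stand-in

    void recognise(const std::string& gesture) {
        if (gestureRecognized) gestureRecognized(gesture);     // fire the event
    }
};

// Hand side: listens for the focus gesture and starts a tracking session.
struct HandGenerator {
    std::vector<int> trackedHands;                             // one id per tracked hand

    void onGesture(const std::string& gesture) {
        if (gesture == "RaiseArm")                             // focus gesture begins tracking
            trackedHands.push_back((int)trackedHands.size());
    }
};
```

Wiring is then one line, e.g. `gestures.gestureRecognized = [&](const std::string& g){ hands.onGesture(g); };` — the gesture class never needs to know anything about hand tracking.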
* Multiple hands are now working in both the develop and experimental versions -> you’ll notice that the RaiseArm focus gesture triggers a lot, and the best way to get fast recognition of a second (or nth) hand is to lower the hand that has just been recognised and then raise the other hand…
* Also merged a bunch of Roger’s changes to depth texture drawing, getting pixel colours and a few other things I’ve forgotten…
So I guess that’s a bit confusing?? I’m betting on experimental but thought I’d get feedback before merging across branches…
Remember if you want to try hand tracking you need to checkout the develop or experimental branches at the moment…
M