OpenNI Skeleton Tracking

This is how open source works! Yes!

I know GitHub will let you add multiple admins to a repository (this is how ofxKinect and libfreenect work), but I'm not sure if you can remove the original user to change the URL. Good luck :slight_smile:


So, how shall we continue organizing before we begin coding?
Coding style, protocols, design, workflow, etc.

I think it would be good to move to a different thread.

Not sure if this warrants its own thread, but has anyone tried using the Asus Xtion with the ofxOpenNI addon?

I'm writing an application which runs fine with the Kinect but crashes when using the getMetaData() method of the DepthGenerator. It's crashing at xnUpdateMetaDataBeforeFirstRead(XnInternalNodeData*) + 6.

This is on OS X 10.6 with OF 6.2.

Does this need OF 6.2? I'm on Linux with 007 and I'm just digging a hole trying to get this to compile.

Thanks =)

I haven't tried the Asus; I don't know anyone who owns one that I could borrow for testing.
But I'd guess it should work, as it's compliant with OpenNI. Some minor changes might be needed in ofxOpenNI to make it work.

Are you calling the getMetaData method directly from your code, or is it ofxOpenNI itself that crashes when it reaches that line of code?

I'd recommend updating to the latest versions of OpenNI and NITE.
getMetaData should be called after all the ofxOpenNI updates, especially the context update. The error thrown sounds like you're calling this method before ofxOpenNI updates, i.e. before it gets the data from the Kinect.
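As a rough pseudocode sketch of that ordering (method names are approximated from the ofxOpenNI examples, so treat nothing here as exact API):

```cpp
// pseudocode sketch only -- not a compilable example
void testApp::update() {
    context.update();   // pull fresh data from the device first
    depth.update();     // then update the individual generators
}

void testApp::draw() {
    // only after the context has updated this frame is it
    // safe to ask the underlying generator for its metadata
    xn::DepthMetaData dmd;
    depth.getXnDepthGenerator().GetMetaData(dmd);
}
```

The point is just that any metadata query belongs after the context update, never before it.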

Not at all.
Getting it to compile isn't that hard. Just make sure that you add the correct paths to the OpenNI and NITE includes and libs.
These can be either the ones installed on your computer or the ones that come inside the addon.
If you use the addon's copies, you might have trouble with relative-path linking.
If you have OpenNI installed on your computer, the headers should be in /usr/include/ni and /usr/include/nite.
The same applies for the libs: the installed ones should be in /usr/lib.
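For example, a minimal sketch of the build flags for a system-wide install (the variable names depend on your project's makefile setup, and the NITE library name varies between versions, so check /usr/lib for the exact name):

```make
# hypothetical makefile additions -- adjust to your own build setup;
# paths assume OpenNI/NITE installed system-wide on Linux
USER_CFLAGS  += -I/usr/include/ni -I/usr/include/nite
USER_LDFLAGS += -lOpenNI -lXnVNite
```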

good luck!

Hey awesome peoples!

Thanks so much for all the feedback over the last couple of days…especially Roxlu for getting back to me so quickly, and having such a great approach to developing this project!

My apologies for having gone offline for a couple of days - I’ve been working on a show using ofxOpenNI that opened on Friday. I’m super excited because a) it’s the first time I’ve got to use the addon in a live show (we’re using it to create registered projections for set elements and as a virtual lighting system for all performers and puppets) and b) because the show is about Indigenous aboriginal creatures from around Australia, and is one of the first times some of these creatures have been allowed to be talked about and performed for general public audiences…

…but I digress…

I agree that we should move discussion of further development of ofxOpenNI to a new thread: this one is monstrously long, and by the time you read 14 pages of conflicting implementations, instructions, forks, pleas for help, etc., anyone trying to start out is bound to be confused.

So on that note I’m opening ofxOpenNI-Development

Once we've sorted out how to merge/move the repos, I'll start another thread for support, installation and implementation help, and update a link here.

Where are you from, gameover? I'm a Melbourne dev doing similar things. Is the show still on? I'm in Sydney on Saturday.

I am indeed based in sunny Melbourne :wink:

The show is still on: it’s made by a company called Erth and is called ‘I, Bunyip’. It’s part of the Sydney Children’s Festival - so it’s obviously aimed at kids - and is on at Carriageworks.

Let me know if you manage to catch a show (or indeed PM me if you have trouble getting in - I think a lot of sessions are sold out). And get in touch when you’re back in Melbourne!



I have managed to recompile the library and everything works fine. However, I'm wondering how I can make the system recognise specific gestures and train/teach it to recognise new ones.

Many thanks

OpenNI can recognize hand gestures like raise hand, wave and push with the current implementation of ofxOpenNI.
Use the ofxGestureGenerator class for this task.
I guess you can recognize more gestures, but I haven't gone that far into this.
Read the OpenNI and NITE docs and check the possibilities they offer for gesture recognition.
If there's no option, you might need to use an external gesture learning and recognition library.
good luck!


Thanks for the feedback. Yes, I think I have to read the OpenNI and NITE docs further.

What is the purpose of the recorder class? It outputs a *.oni file.
At first I thought it had to do with recording a gesture and then somehow loading it and using it to trigger something. But it's not, is it?


The recorder class records the camera input. It's quite useful for debugging.

good luck with the reading.


I am trying to get ofxOpenNI working on OS X 10.6.8 with Xcode 3.2.4 and OF 007.

I followed gameoverhack's instructions from here:

git clone
cd ofxOpenNI
git checkout experimental (also tried git checkout master)

The example compiles fine, but when I try to run it I get this error:

dyld: Library not loaded: /opt/local/lib/libusb-1.0.0.dylib
Referenced from: /Applications/of_preRelease_v007_osx/apps/addonsExamples/opeNI-SimpleExamples/bin/../../../data/openni/lib/libOpenNI.dylib
Reason: image not found
sharedlibrary apply-load-rules all
Data Formatters temporarily unavailable, will re-try after a ‘continue’. (Cannot call into the loader at present, it is locked.)

It's strange that it does not see this library: libusb-1.0.0.dylib,

because I copied the lib folder into the bin/data/openni/ location,
and it does show up in Xcode (it's not highlighted red).

What am I doing wrong?


OK, I got it to work.

Using the terminal, I cd'd to my lib folder inside the data folder,
then dragged the script onto the terminal and hit enter.

I guess this somehow changes some paths inside the libraries.

Yes, you must run that script.

It changes the relative paths inside the libs so they get linked correctly.
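For the curious, that kind of fix-up is typically done with Apple's install_name_tool; a sketch of what such a rewrite looks like (the exact paths here are illustrative, not taken from the actual script):

```
# illustrative only -- rewrite the hard-coded /opt/local libusb path
# inside libOpenNI.dylib to point at the copy bundled with the app
install_name_tool -change /opt/local/lib/libusb-1.0.0.dylib \
  @executable_path/../data/openni/lib/libusb-1.0.0.dylib \
  libOpenNI.dylib

# verify which paths a dylib will try to load
otool -L libOpenNI.dylib
```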

Actually you shouldn't need to run that script, as it sets relative paths that should already be correct…

However a few people have reported problems with libusb after I did the latest update to openNI drivers…

If running that script makes it work for you, please let me know (that would mean I did not run it on libusb)…

However, if that does not work, you may need to install libusb on your computer (it's both a dependency of OpenNI and is used by ofxHardwareDriver, so I need to figure out which one is trying to load the dylib from the wrong place).

To install libusb you can use this command in Terminal:

sudo port install libusb-devel +universal  

If you could let me know which solution solves the problem, that would help, as it's quite hard to check the install on every computer/platform configuration…


All I did was download ofxOpenNI from your GitHub, follow the instructions in the readme, rename one of the src-ImageAndDepth-Simple folders to src, and run the script.

I tried running some of the other src folders but ran into some problems, which I don't remember in detail right now.

thanks for this addon

This, however, does require MacPorts, which in my experience downloads a bunch of other libs. PCL has a nice version here that installs libusb into /opt/local.


Followed the instructions and, voilà, it works like a charm! One question: does that codebase allow for skeleton access? I can see my hand being tracked and all that, but not my skeleton (skeleton tracking is on).

Does this implementation require the user to "strike the pose"?

The OpenNI sample (Sample-NiUserSelection) works fine.

thanks a lot!


This library does indeed track skeletons. I added this to the sample code, in the draw() function just below recordUser.draw(), to see the skeletons:

			int nTrackedUsers = recordUser.getNumberOfTrackedUsers();  
			ofxTrackedUser *tracked = NULL;  
			for (int i = 0; i < nTrackedUsers; i++) {  
				// get the user  
				tracked = recordUser.getTrackedUser(i);  
				if (tracked != NULL) {  
					// draw the tracked user's skeleton  
					tracked->debugDraw();  
				}  
			}  
But it seems it does require the "special pose" in order to be detected. Does anyone know if non-pose detection is possible in oF on Mac (without using Boot Camp + Windows)?

Try the experimental branch.

It has a different implementation and it's supposed to become v2.0.
There you get skeleton tracking without the pose.

Don't be afraid of the word "experimental": it works really well. I've been using it for months now.