ofxOpenNI Development

@iago,
I really doubt that the Kinect will run on low-cost CPUs. I can't recall the minimum specs; maybe @gameover might know.
which OS are you using?
Is ofxCvHaarFinder the only example that doesn't work?

I think that it isn't a GPU issue, because OpenNI returns a pixel array to OF, which OF then uses to update a texture on the GPU.
So first try all the GPU-related examples. Let me know how it goes.

best regards

I got libfreenect (and I think ofxKinect) to work on a Chumby 8 a while back - as expected it was really slow.

With most ARM devices the issue seems to be USB bandwidth (my experience with the RPi and Beaglebone)

The only ARM device I have seen that works decently with the Kinect is the gumstix which is what the Drill of Depth was using
http://www.rowland.harvard.edu/cox/projects/subprojects/kinect/

I am curious about how the Intel Atom boards perform.

Good evening.
I'm having some frame rate problems running my app. The frame rate sometimes drops down to 10-20 fps. Most of the time when I run the app (in Visual Studio) or launch the .exe the frame rate is quite good, varying from 40-100 fps (it's quite unstable).

When the frame rate falls, I can solve the problem by rerunning the app 1-5 times. I have the same problem running the app on my laptop (ATI Radeon HD 5650) and on my desktop (GTX 560 Ti), and both CPUs are i5s. Sometimes, after keeping the app running, the frame rate recovers by itself (but sometimes the only way to solve the problem is restarting the .exe).

I'm using Primesense xx0.41, OpenNI xx4.0 and NITE 1.5.2.21. When the app starts, the common error "Could not open file mapping object (2)" is printed twice. When I was running the OpenNI 1.5.2.23 & Sensor 0.25 drivers this error was printed about once per second. I mention this because maybe both problems are related.

Can you give me some clues? Tyvm

After trying my app on different laptops I've come to the conclusion that the problem occurs ONLY when drawing the depth/RGB images on screen. The only thing the laptops that don't work have in common is that they only support OpenGL 1.4, so I think it must be some OpenGL-related problem. However, I was able to run my app on a Dual Core 1.33 GHz, 2 GB RAM machine with a Mobile Intel 945GM Express GPU. The app works, and so does the point cloud representation, but the depth/RGB representation doesn't.

@roymacdonald, I'm using Windows XP/7 with Visual Studio right now, although I'm planning to cross-compile for Linux (and maybe give ARM a try).

@jvcleave I find the ARM advice you gave me very interesting. Can you tell me the OS, libraries and fps you got with the RPi and Beaglebone? I'll do some research on the gumstix, thanks. Also, I'll let you know when I try the Intel Atom.

@iago,
it is weird, because ofxOpenNI actually uses an ofImage to draw the depth and RGB images.
have you tried calling setUseTexture(false); on your ofxOpenNI object? that's the only OpenGL-related thing I can think of. ofxOpenNI is set to use textures by default.

Hi there-

I'm just getting up and running with ofxOpenNI + NITE, and I am only getting gesture recognition for "RaiseHand".

what am I missing?

code: (from example Hand-Medium)
void testApp::setup() {

ofSetLogLevel(OF_LOG_VERBOSE);

openNIDevice.setup();
openNIDevice.addImageGenerator();
openNIDevice.addDepthGenerator();
openNIDevice.setRegister(true);
openNIDevice.setMirror(true);

// setup the hand generator
openNIDevice.addHandsGenerator();
openNIDevice.addGestureGenerator();
// add all focus gestures (ie., wave, click, raise arm)
//openNIDevice.addAllHandFocusGestures();

// or you can add them one at a time
//vector<string> gestureNames = openNIDevice.getAvailableGestures(); // you can use this to get a list of gestures
// prints to console and/or you can use the returned vector
openNIDevice.addHandFocusGesture("RaiseHand");
openNIDevice.addHandFocusGesture("Wave");

openNIDevice.addAllGestures();
for(int i = 0; i < openNIDevice.getMaxNumHands(); i++){
ofxOpenNIDepthThreshold depthThreshold = ofxOpenNIDepthThreshold(0, 0, false, true, true, true, true);
// ofxOpenNIDepthThreshold is overloaded, has defaults and can take a lot of different parameters, eg:
// (ofxOpenNIROI OR) int _nearThreshold, int _farThreshold, bool _bUsePointCloud = false, bool _bUseMaskPixels = true,
// bool _bUseMaskTexture = true, bool _bUseDepthPixels = false, bool _bUseDepthTexture = false,
// int _pointCloudDrawSize = 2, int _pointCloudResolution = 2
openNIDevice.addDepthThreshold(depthThreshold);
}

ofAddListener(openNIDevice.gestureEvent, this, &testApp::gestureEvent);
ofAddListener(openNIDevice.handEvent, this, &testApp::handEvent);

openNIDevice.start();

verdana.loadFont(ofToDataPath("verdana.ttf"), 24);
}

output:

[ofxOpenNIDevice[0]:verbose] (CB) Gesture Recognized: ID position (x y z) [-110 -295 1259] end position (x y z) [-110 -295 1259]
[notice] RaiseHand GESTURE_RECOGNIZED from device 0
[notice] HAND_TRACKING_UPDATED for hand 2 from device 0
[notice] hand gesture is 2
[ofxOpenNIDevice[0]:verbose] (CB) Gesture Recognized: ID position (x y z) [-117 -195 1257] end position (x y z) [-117 -195 1257]
[notice] RaiseHand GESTURE_RECOGNIZED from device 0
[notice] HAND_TRACKING_UPDATED for hand 2 from device 0
[notice] hand gesture is 2
[ofxOpenNIDevice[0]:verbose] (CB) Gesture Recognized: ID position (x y z) [-132 -43 1166] end position (x y z) [-132 -43 1166]
[notice] RaiseHand GESTURE_RECOGNIZED from device 0
[notice] HAND_TRACKING_UPDATED for hand 2 from device 0
[notice] hand gesture is 2
[ofxOpenNIDevice[0]:verbose] (CB) Gesture Recognized: ID position (x y z) [-156 -73 1120] end position (x y z) [-156 -73 1120]
[notice] RaiseHand GESTURE_RECOGNIZED from device 0
[notice] HAND_TRACKING_UPDATED for hand 2 from device 0
[notice] hand gesture is 2
[ofxOpenNIDevice[0]:verbose] (CB) Gesture Recognized: ID position (x y z) [-177 -279 1061] end position (x y z) [-177 -279 1061]
[notice] RaiseHand GESTURE_RECOGNIZED from device 0
[notice] HAND_TRACKING_UPDATED for hand 2 from device 0
[notice] hand gesture is 2
[notice] HAND_TRACKING_UPDATED for hand 2 from device 0
[notice] hand gesture is 2
[ofxOpenNIDevice[0]:verbose] (CB) Gesture Recognized: ID position (x y z) [-159 -62 1108] end position (x y z) [-159 -62 1108]
[notice] RaiseHand GESTURE_RECOGNIZED from device 0
[ofxOpenNIDevice[0]:verbose] (CB) Gesture Recognized: ID position (x y z) [527 -272 1237] end position (x y z) [527 -272 1237]
[notice] RaiseHand GESTURE_RECOGNIZED from device 0
[notice] HAND_TRACKING_UPDATED for hand 2 from device 0
[notice] hand gesture is 2
[ofxOpenNIDevice[0]:verbose] (CB) Gesture Recognized: ID position (x y z) [-263 -321 1048] end position (x y z) [-263 -321 1048]
[notice] RaiseHand GESTURE_RECOGNIZED from device 0
[ofxOpenNIDevice[0]:verbose] (CB) Gesture Recognized: ID position (x y z) [-151 -372 1152] end position (x y z) [-151 -372 1152]
[notice] RaiseHand GESTURE_RECOGNIZED from device 0
[ofxOpenNIDevice[0]:verbose] (CB) Gesture Recognized: ID position (x y z) [-72 -396 1203] end position (x y z) [-72 -396 1203]
[notice] RaiseHand GESTURE_RECOGNIZED from device 0
[notice] HAND_TRACKING_STOPPED for hand 2 from device 0
[notice] hand gesture is 2

I would like to add a depth threshold (addDepthThreshold) to the Kinect view,
but it makes the process really slow :frowning:
Is that normal?

#include "testApp.h"

//using namespace cv;
//using namespace ofxCv;

void testApp::setup(){
//grabber.initGrabber(640,480);

ofSetLogLevel(OF_LOG_NOTICE);

numDevices = openNIDevices[0].getNumDevices();

myDepthThreshold = ofxOpenNIDepthThreshold(550,1500, false, false,true, true, true);

for (int deviceID = 0; deviceID < numDevices; deviceID++){
//openNIDevices[deviceID].setLogLevel(OF_LOG_VERBOSE);
openNIDevices[deviceID].setup();
openNIDevices[deviceID].addDepthGenerator();
openNIDevices[deviceID].addImageGenerator();
openNIDevices[deviceID].setRegister(true); // this registers all the image pixels to the depth pixels
openNIDevices[deviceID].setMirror(true); // flips the image and depth sensors
openNIDevices[deviceID].addDepthThreshold(myDepthThreshold);
openNIDevices[deviceID].start();
myDepths[deviceID].allocate(640, 480, OF_IMAGE_COLOR_ALPHA);

}

verdana.loadFont(ofToDataPath("verdana.ttf"), 24);

gui.addTitle("A group");
gui.addSlider("Near", near, 0, 2000);
gui.addSlider("Far", far, 1000, 10000);
gui.addToggle("setDephtValue", setDephtValue);
gui.loadFromXML();
gui.show();
}

void testApp::update(){

for (int deviceID = 0; deviceID < numDevices; deviceID++){
openNIDevices[deviceID].update();
ofxOpenNIDepthThreshold &myDepthThreshold = openNIDevices[deviceID].getDepthThreshold(0);

if(setDephtValue){
myDepthThreshold.setNearThreshold(near);
myDepthThreshold.setFarThreshold(far);
setDephtValue = false;
}
myDepths[deviceID].setFromPixels(myDepthThreshold.getMaskPixels());

}

}

void testApp::draw(){
ofSetColor(255, 255, 255);

ofPushMatrix();
cout << "numDevices " << numDevices << "\n";
for (int deviceID = 0; deviceID < numDevices; deviceID++){
ofTranslate(0, deviceID * 480);
//openNIDevices[deviceID].drawDebug(); // draws all generators
//openNIDevices[deviceID].drawDepth(0, 0);
//openNIDevices[deviceID].drawImage(640, 0);
//myDepthThreshold.drawROI();

myDepths[deviceID].draw(0,0);

}

ofPopMatrix();

ofSetColor(0, 255, 0);
string msg = " MILLIS: " + ofToString(ofGetElapsedTimeMillis()) + " FPS: " + ofToString(ofGetFrameRate());
verdana.drawString(msg, 20, numDevices * 480 - 20);

gui.draw();

}

Hi guys just wondering if anyone has encountered this problem. I can’t seem to find anything on it in forum posts.

  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::GestureCB_handleGestureProgress(xn::GestureGenerator&, char const*, XnVector3D const*, float, void*)@20'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::HandsCB_handleGestureProgress(xn::GestureGenerator&, char const*, XnVector3D const*, float, void*)@20'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::UserCB_handleCalibrationStart(xn::SkeletonCapability&, unsigned int, void*)@12'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::stopCommon()'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::setLogLevel(XnLogSeverity)'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::logErrors(xn::EnumerationErrors&)'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::isRecording()'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::setLooped(bool)'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::getLooped()'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::setSpeed(float)'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::getSpeed()'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::getCurrentFrame()'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::getTotalNumFrames()'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::getPosition()'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::getIsONIDone()'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::setPaused(bool)'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::isPaused()'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::isPlaying()'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::addAudioGenerator()'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::removeAudioGenerator()'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::allocateDepthRawBuffers()'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::allocateImageBuffers()'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::allocateIRBuffers()'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::allocateGestures()'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
obj\release\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| multiple definition of `ofxOpenNI::updateImagePixels()'|  
obj\release\addons\ofxOpenNI\src\ofxOpenNI.o:ofxOpenNI.cpp|| first defined here|  
||More errors follow but not being shown.|  
||Edit the max errors limit in compiler options...|  
||=== Build finished: 50 errors, 0 warnings (0 minutes, 10 seconds) ===|  
  

Hi, I've been developing with ofxOpenNI for a few months now. I've found it to work great for my project, however I've recently been having some issues with tracking. I get this error with user tracking.

  
  
[ofxOpenNIDevice[0]:notice] Skeleton found for user 1
[notice] USER_SKELETON_FOUND for user 1 from device 0
[ofxOpenNIDevice[0]:verbose] Force stopping user tracking 1
[ofxOpenNIDevice[0]:notice] Skeleton tracking stopped for user 1
[ofxOpenNIDevice[0]:notice] Stop tracking user 1
[notice] USER_TRACKING_STOPPED for user 1 from device 0
[ofxOpenNIDevice[0]:notice] Start tracking user 1
[notice] USER_TRACKING_STARTED for user 1 from device 0
[ofxOpenNIDevice[0]:verbose] (CB) Calibration end for user... 1
[ofxOpenNIDevice[0]:verbose] ...fail 1
[ofxOpenNIDevice[0]:verbose] Calibration requested cancelled for user 1 since maxNumUsers is 1
  

OpenNI sometimes loses tracking even when the user stands still after being detected. Then they can't be tracked again, since they've been lost but not deleted. What is the best solution to this problem?

Hi cyport,
your problem is right there in your log:
Calibration requested cancelled for user 1 since maxNumUsers is 1

user numbers start from 0, so user number 1 is actually the second one found.
I think that increasing the max number of users should make your problem go away.

all the best!

Guys, just wondering if anyone knows how to isolate the user's image so that I don't have the background surroundings, as I want to import the user's image into a scene that I have created.

@oceanmachine89 ofxOpenNIInstance.getTrackedUser(theTrackedUserID).getMaskPixels(); will give you the mask pixels to mask the user out of the RGB image. Check out the OF examples for alpha masking; there are several.
best

Thanks for replying, however that doesn't seem to be the problem, as raising the max number of users didn't stop the error.

What I think is happening is that the calibration is a bit wonky and doesn't work all the time; sometimes it will "hang" on calibrating and not delete the user, and the calibration then detects you as a new user, which adds to the total number of users. I guess this is the cause of the number-of-users problem. The only workaround I have is that the calibration works without any errors when you walk into the sensor area, rather than already being in the area to begin with.

Thanks for the reply. Sorry for being a nuisance.

Hi, all:

I would like to use the ASUS Xtion on a Mac, and the fork developed by JVCleave seems promising.
https://github.com/jvcleave/ofxOpenNI
However, I continuously receive the error message

0x8285f4: movl 56(%eax), %eax — Thread 1: EXC_BAD_ACCESS (code=2, address=0x38)

no matter which version of OF and which SDK I build with.
(I tried OF 0.7.4 with the 10.8 SDK in Xcode; ofxOpenNI says it was developed under OF 0.7 with the 10.6 SDK, so I tried that combination as well.)

Every version of ofxOpenNI developed by JVCleave has the same problem, except the upcoming ofxOpenNI2.
However, there is no skeleton tracking (especially without calibration gestures) in ofxOpenNI2.

Has anyone had a similar situation?

Thanks in advance to JVCleave and anybody who can answer me!!

adenovirux

@adenovirux,
that message means that your program is trying to access a memory location that it isn't allowed to.
Xcode will show you the line that's throwing this error. If you post which line it is, we might be able to help.
best!

@ Roymacdonald:

Sorry for the late reply. The error seems to lie in:

" XnEventInterfaceT<void (*)(XnContext*, void*)>::Register(void (*)(XnContext*, void*), void*, void*&) + 156 "

The attachment is the screen capture of related codes.

Thanks a lot!! Sincerely~ I really don't have many ideas about memory locations…

![](http://forum.openframeworks.cc/uploads/default/2913/OF xtion error .png)

I had code to draw a puppet using an old version of ofxOpenNI. I did some basic math to calculate the angle of each limb, and then I apply rotations using those angles and the positions of the joints:

  
	// left leg angle
	leftLegAngle = atan(abs(leftLegI.begin.y - leftLegI.end.y) / abs(leftLegI.begin.x - leftLegI.end.x));
	leftLegAngle = (leftLegAngle * 180) / PI;
	if (leftLegI.begin.x < leftLegI.end.x) {
		if (leftLegI.begin.y < leftLegI.end.y) {
			leftLegAngle = leftLegAngle + 90;
		} else {
			leftLegAngle = -leftLegAngle + 90;
		}
	} else {
		if (leftLegI.begin.y < leftLegI.end.y) {
			leftLegAngle = leftLegAngle + 180;
		} else {
			leftLegAngle = -leftLegAngle - 180;
		}
	}
	imgLeftLegI.setAnchorPoint(0.5f, 0.5f);
	ofPushMatrix();
		ofTranslate(-torX * 0.3 - imgLeftLegI.width, torY * 0.85);
		ofRotateZ(leftLegAngle);
		imgLeftLegI.draw(0, 0);
	ofPopMatrix();

My code works, though not as smoothly as I'd like, but it uses an old version of ofxOpenNI which has some problems deleting users.

Now I've migrated to the new version of ofxOpenNI. Limbs are being deprecated, but they can still be used. I wonder if there's a better and more elegant way to attach the PNGs of each limb to the tracked user.
I have two questions:
Can I extract the angles more easily and attach the images without them being drawn outside the scope of the tracked user?
How can I scale the images properly in order to keep the aspect of the original puppet?
Thanks
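As an aside, the quadrant bookkeeping in the angle code above can usually be collapsed into a single atan2 call, which already returns a signed angle in the correct quadrant. A minimal sketch (limbAngleDeg is a hypothetical helper of mine; bx,by and ex,ey are assumed to be the limb's begin and end joint positions in screen coordinates):

```cpp
#include <cmath>

// atan2 takes the y and x components of the limb vector separately and
// returns the signed angle in (-180, 180] degrees, so no nested
// quadrant branches are needed.
float limbAngleDeg(float bx, float by, float ex, float ey) {
    return std::atan2(ey - by, ex - bx) * 180.0f / 3.14159265358979f;
}
```

The result can be fed straight to ofRotateZ; note that because the screen y axis points down, positive angles rotate clockwise on screen, so an offset may still be needed depending on how the limb image is authored.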

**Update:**

I found that the new code should look like this:

  
brazoI.getEndJoint().getProjectivePosition().y

instead of the old code, which looked like this:

  
brazo.end.y  

However, limbs will be deprecated, and I would like to know how to use the orientation methods already implemented in the new library instead. Any help will be appreciated.

I have the same issue now. I have setMaxNumUsers(1) and the calibration hangs far longer than it should. The skeletal tracking takes a bit as well - usually only when the user is already in the viewport.

Hi, I am trying to rotate images and text based on the rotation of the joints from ofxOpenNI. I was calculating a single axis of rotation manually from the points, but now I need two axes of rotation and I would like to use the rotation from the joints themselves. Here is my code.

    ofMatrix4x4 matriciesLeft[12];
    ofTrueTypeFont leftWords[12];

    matriciesLeft[i].preMultRotate(user.getJoint(JOINT_RIGHT_HAND).getDerivedOrientation());
    ofPushMatrix();
    glMultMatrixf(matriciesLeft[i].getPtr());
    ofSetColor(255, 255, 255);
    leftWords[i].drawStringAsShapes(leftStrings[i], 0, 0);
    ofPopMatrix();

When I try this code in another context, using a node that I rotate over time and then getting the matrix from that node, it rotates as I expect. But once a user is detected (which triggers the words to draw), the text flies away off my screen.

I don’t really understand what derived orientation is, maybe that is the problem.

This works a little better for me

 ofVec3f vec;
 float angle;
 ofTranslate(0, 200);
 user.getJoint(JOINT_NECK).getDerivedOrientation().getRotate(angle,vec.x,vec.y,vec.z);
                
 ofRotate(angle,vec.x,vec.y,vec.z);
                
 rightWords[i].drawStringAsShapes(rightStrings[i], 0, 0);

But I would like to be able to just apply the rotation matrix, as it seems smoother when I test it away from OpenNI than the second method here.

Fred