ofxKinect sample code for depth measurement and opencv


Playing with ofxKinect sample in the add-on and I have a couple of questions.

Can someone please explain why a cvAnd operation is needed for contour identification?
I couldn't quite follow the near/far plane logic mentioned in the code either.

		// we do two thresholds - one for the far plane and one for the near plane
		// we then cvAnd the two results to keep only the pixels that pass both thresholds
		if(bThreshWithOpenCV) {
			grayThreshNear = grayImage;
			grayThreshFar = grayImage;
			grayThreshNear.threshold(nearThreshold, true);
			grayThreshFar.threshold(farThreshold);
			cvAnd(grayThreshNear.getCvImage(), grayThreshFar.getCvImage(), grayImage.getCvImage(), NULL);
		}

My second question is related to depth measurement, where I am using **kinect.getDistanceAt(x, y)**.

I am trying to highlight the nearest point from the depth image.
Most of the time my marker identifies the correct closest position. However, sometimes the marker goes way off and points to arbitrary positions on the image. I am iterating over the entire depth image to identify the closest point in every update() call.

My question is: is this the correct way to identify the closest point from the depth image?

			closestDistance = 100000; // some large number to begin with
			for(int y = 0; y < grayImage.getHeight(); y++) {
				for(int x = 0; x < grayImage.getWidth(); x++) {
					float dist = kinect.getDistanceAt(x, y); // getDistanceAt() returns a float
					if(dist > 0) { // 0 means no reading at that pixel
						if(dist < closestDistance) {
							closestDistance = dist;
							closeX = x;
							closeY = y;
						}
					}
				}
			}

Thanks a ton


the cvAnd operation helps process the image into something simpler (like a mask) that the Contour finder can outline

the near/far thresholds are values that let you do stuff like “give me only objects between 2 and 3 feet away”

#2 looks ok to me -

Thanks. That makes sense.
So in that case, shouldn’t the near image (closer to the depth camera) threshold value be smaller than the far threshold?

Currently, in the code, the near and far threshold values are set the opposite way: the nearThreshold value is greater than the far one:

nearThreshold = 230;  
farThreshold = 70;  

Also, based on your explanation, the near and far thresholds should correspond to some approximate physical distance. Is that right?

Again, thanks a bunch

Hey Dece, did you ever figure out what the deal is with the near and far threshold values being seemingly reversed? The purpose of nearThreshold and farThreshold makes zero sense to me, even after days of trial and error. I have the following code in “void testApp::drawPointCloud()”:

	for(int y = 0; y < h; y += step) {
		for(int x = 0; x < w; x += step) {
			if(kinect.getDistanceAt(x, y) < farThreshold) {
				glPointSize(scaledVol * 190.0f); // "scaledVol" is just a variable that changes point size based on audio input, "190.0f" just to scale the effect
			} else {
				glPointSize(3); // initially 3; size of each point
			}

Now notice how I have “farThreshold” in the if statement that judges whether or not the audio input affects the point at the given x,y coordinate.

When I have:

nearThreshold = 230; //default value
farThreshold = 0; //or anything less than 1, incl decimals like 0.9

None of the points are affected. When I have:

nearThreshold = 230; //default value
farThreshold = 1; //or anything greater than 1

All of the points are affected. It’s like these thresholds are labeled wrong, or are playing evil tricks on me. Anybody care to explain this thresholding to a noob? Thanks in advance!

PS If I don’t end up getting a response here, I guess I’ll replicate this exact post in another thread I have going that is at this point completely relevant, seeing as this thread is a year old.

nearThreshold and farThreshold are just 2 variables with names that suggest what you plan to store into them; what really matters is what you do with them.

if, for example, you want to create a “sensing area” that begins 1 m from the Kinect and ends 3 m away, you would set:

nearThreshold = 100;  
farThreshold = 300;  

or, if you want to get everything nearer than 3m, you would set:

nearThreshold = 0;  
farThreshold = 300;  

Once you’ve defined your depth range, at a certain point in your code you’ll have to filter your Kinect points with the for loops you mentioned:

	for(int y = 0; y < h; y += step) {
		for(int x = 0; x < w; x += step) {
			float dist = kinect.getDistanceAt(x, y); // cache it, we test it twice
			if(dist > nearThreshold && dist < farThreshold) {
				glPointSize(scaledVol * 190.0f); // or whatever you want to do when your condition is met
			}
		}
	}

if it does not behave as expected, there is probably something weird in the rest of your code: filtering points by their distance is really this easy :)

Man, I have no idea why this isn’t working for me. So you’re saying that nearThreshold and farThreshold directly translate as such (100 = 1m), and that they should have such a relationship to the point cloud? I figured out that nearThreshold and farThreshold variables are initially oriented for masking the “grayImage”, not the point cloud, and that they translate as nearest = 0, furthest = 255…

it depends on the context: they are just 2 variables, you decide how to use them.

it could be that you’re looking at the ofxKinect example and you’re mixing stuff? in the example, nearThreshold and farThreshold are two ints that are in fact used to threshold the raw depth pixels using “the OpenCV way”.

in the code you posted you’re using the getDistanceAt() function, which returns the distance from the Kinect at point x,y as a float (not the value of that pixel in the depth image).

How would you implement this with ofxCv instead of ofxOpenCv?
There is a setThreshold function on ofxCv::ContourFinder, but how do you use it for far and near thresholds?
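one possible approach (a sketch only, assuming ofxCv’s toCv() bridge and OpenCV’s cv::inRange; `nearValue` and `farValue` are placeholder gray values, not part of any API): since setThreshold applies a single cutoff, build a binary mask for the depth band yourself and hand that mask to the contour finder.

```cpp
// Sketch: band-pass the 8-bit depth image (brighter = nearer) with
// cv::inRange, then find contours on the resulting mask.
// nearValue/farValue are placeholder gray levels in 0..255.
cv::Mat depthMat = ofxCv::toCv(kinect.getDepthPixels());
cv::Mat mask;
cv::inRange(depthMat, cv::Scalar(farValue), cv::Scalar(nearValue), mask);
contourFinder.findContours(mask); // contours of everything inside the band
```

this keeps the same “two planes” semantics as the ofxOpenCv example, just expressed as one range test instead of two thresholds plus cvAnd.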