[SOLVED] Best way to set ROI with openFrameworks?

Hello !

I have a problem retrieving the distance from the kinect when using a ROI.
I am detecting blobs in a kinect image and applying a ROI to limit the detection area.

When using kinect.getDistanceAt(centroid) I can’t get the distance properly.

I noticed that the problem disappears when I deactivate the ROI.
The same happens when the ROI is active but its size equals the kinect image size.

I think I am doing something wrong with the ROI. Something seems wrong with the coordinates: is there an offset applied to the blob centroid when a ROI is set?

Any help would be great !

See the code below:

void kinectTracker::update() {
    // load grayscale depth image from the kinect source
    depthImage.setFromPixels(kinect.getDepthPixels(), kinect.width, kinect.height);

    // ---------- ROI -----------
    // restrict blob detection to the region of interest
    depthImage.setROI(roi.x, roi.y, roi.width, roi.height);

    // find at most one blob inside the ROI
    contourFinder.findContours(depthImage, minBlobSize, roi.width * roi.height, 1, false);

    if (contourFinder.nBlobs > 0 && contourFinder.blobs[0].area > minBlobSize) {
        // the centroid is reported in ROI coordinates, so add the ROI
        // offset before asking the kinect for the distance at that pixel
        pos = contourFinder.blobs[0].centroid;
        pos.x += roi.x;
        pos.y += roi.y;
        pos.z = kinect.getDistanceAt(pos.x, pos.y);
    } else {
        pos = ofVec3f(0);
    }

    depthImage.resetROI();
}

Thanks a lot.

Solved: blobs are shifted by the ROI offset. If the ROI starts at x = 100, a blob that really sits at x = 150 in the full image is reported at x = 50, so you need to add roi.x (and likewise roi.y) to the blob position to map it back into kinect image coordinates.