Threading and OpenCV

Hi, guys!

I have tried to make ofxThread and ofxCvHaarFinder work together, with no success.
The idea is to show the image from the web cam overlaid with a face rect, with the face detection running in a separate thread so it doesn't block the camera video.

So far I've modified the thread example and stumbled upon an error. Whether it's my poor knowledge of threading concepts or of OpenCV itself, I don't know.
My threaded class:

#include "ofMain.h"  
#include "ofxThread.h"  
#include "ofxOpenCv.h"  
  
class threadedFinder : public ofxThread{  
public:  
	ofxCvGrayscaleImage img;  
	ofxCvHaarFinder finder;  
  
	void start(){  
		startThread(true, false);   // blocking, non-verbose  
	}  
	void stop(){  
		stopThread();  
	}  
	void threadedFunction(){  
		while( isThreadRunning() != 0 ){  
			if( lock() ){  
				finder.findHaarObjects(img);  
				unlock();  
			}  
		}  
	}  
};  

The code in main app where the finder gets called:

void testApp::update(){  
	bool bNewFrame = vidGrabber.isFrameNew();  
	if (bNewFrame){  
		if (TF.lock()) {  
			TF.img.setFromPixels(img.getPixels(), 640, 480);  
			TF.unlock();  
		}  
	}  
}  

The problem happens on TF.start(). I've dug into the code and found that the call to cvSetImageROI fails.

What's going on?

Can you post the whole code (or a short example) as a zip that shows the problem? It's a bit hard to tell whether everything is allocated properly from the pieces you've posted here. Usually ROI problems come from trying to use an image before it has been allocated.

take care,

Here is the whole project.

my first thought is to check on: TF.img

Is it ever allocated? I don't see it. I see where img (at the testApp level) is allocated, but I think you are missing a second allocation.

hope that helps!

zach, thanks for the idea!
I've modified the threadedFinder.h constructor to look like this:

		img.allocate(640, 480);  

but the problem is still there. I have tried it without lock/unlock too.

The problem is ofxCvImage::allocate(). It uses OpenGL, which makes it impossible to use in a multithreaded program.
As far as I know it is possible to use OpenGL in a multithreaded environment; it's the design of ofxOpenCv that prevents me from multithreading here. That's a pity.
I'll try to find a solution.

> ofxCvImage::allocate(). It uses OpenGL, which makes it impossible to use in a multithreaded program.
> As far as I know it is possible to…

you can turn that off before you allocate the image, with something like:

		img.setUseTexture(false);  
		img.allocate(640, 480);  

this disables the texture and all opengl aspects of ofxCvImage.

hope that helps,

I had a lot of problems with multi-threading until I used ofxRuiThread. The ofxThread works fine, but ofxRuiThread builds on top of that framework and makes multi-threading painless. I have my CV on its own thread and it is helping a lot.

Anyone have any luck with this?

I tried to build an app running ofxCvHaarFinder against videoGrabber and while detecting faces my fps dropped from 60 to about 8.

I've tried using ofxCvGrayScaleImage.setTexture(false) but it didn't help me out.

Ultimately I was hoping to run a few instances of ofxCvHaarFinder to detect different blob types.

I put all the CV stuff on its own thread and use a boolean value surrounding the analysis part of it, in your case the haar classifier. When I check for the data from the analysis, I first check the boolean value to see whether the CV is in the middle of processing it or not. I know this is a hack, but it works great; the locking mechanism in threading doesn't work so well. You also might want to check out OpenMP for more multi-threading stuff. It is something I need to look into as well as a possible alternative; I've heard that it works pretty well.
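The boolean-guard pattern described above can be sketched outside of OF with plain std::thread and std::atomic (all names here are illustrative; nothing below is from ofxThread or ofxRuiThread):

```cpp
#include <atomic>
#include <chrono>
#include <thread>
#include <vector>

// Worker that runs an "analysis" step (stand-in for the haar classifier)
// and raises a flag only while the results are being rewritten.
struct AnalysisThread {
    std::atomic<bool> busy{false};    // true while results are inconsistent
    std::atomic<bool> running{true};
    std::vector<int> results;         // pretend blob/face data
    std::thread worker;

    void start() {
        worker = std::thread([this] {
            while (running) {
                busy = true;              // main thread must not read now
                results.assign(4, 42);    // the expensive CV step would go here
                busy = false;             // results are consistent again
                std::this_thread::sleep_for(std::chrono::milliseconds(10));
            }
        });
    }
    void stop() { running = false; worker.join(); }
};
```

The main thread would then do `if (!tf.busy) { /* read tf.results */ }`. As noted above, this is a hack: the worker can still flip the flag between the check and the read, which is why a mutex (or a double buffer) is the robust version.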

Thanks Ken,

Trying ofxRuiThread now, and it looks like my threads run once, but I am not sure how to keep them running. The class has this comment in it, but I am not sure how to implement it, as the example isn't clear to me.

	//this should be used in conjunction with updateOnce()  
	//to make sure your threads are synched with the main thread  
	//(check the example)  
	virtual void waitToFinish(); //waits till updateThread() is finished  
	virtual void blockLock(); //blocks the app until a lock is successful  
	virtual bool tryLock(); //tries to lock and returns true if successful  

Seems I got it going. One weird thing is that I had to hardcode ofxCvImage to use bUseTexture = false;

Attached is my xcode project in case it helps anybody (missing haarcascade_frontalface_default.xml due to file size)

edit: updated version lives here

How is your FPS now with the multi-threading?

it's weird - ofToString(ofGetFrameRate(), 1) is reporting a higher fps than the app is set to, but I can tell the source video doesn't slow to a crawl as it did before

It also depends on which haars I use

Another thing I can tell is that I can keep adding HaarFinders to it without it slowing to a crawl


I am working off the MultiThreadedHaarFinder example,
but I modified it to run a contourFinder. I have the contourFinder inside the SimpleThread.h file, but when I try to run it I get a bad access error.

When I comment out the contour finder part, the app runs fine.

My SimpleThread.h file looks like this:

#pragma once  
#include "ofMain.h"  
#include "ofxRuiThread.h"  
#include "ofxOpenCv.h"  
#include "ofxCvBrightnessContrast.h"  
  
class SimpleThread : public ofxRuiThread{  
public:  
	bool isReady;  
	int camWidth;  
	int camHeight;  
	unsigned char * videoPixels;  
	ofxCvColorImage		colorImg, colorImgDistorted;  
	ofxCvGrayscaleImage 	grayImg, grayBg, grayDiff;  
	ofxCvBrightnessContrast brightnessContrast;  
	ofxCvContourFinder 	contourFinder;  
	int softBlur;  
	int softBrightness, softContrast;  
	bool bLearnBakground;  
	int threshold;  
	bool flipVerti, flipHori;  
	int minBlobArea;  
  
	SimpleThread(){  
		isReady = false;  
		videoPixels = NULL;  
		bLearnBakground = true;  
		flipVerti = flipHori = false;  
		camWidth = 400;  
		camHeight = 300;  
		softBlur = 1;  
		minBlobArea = 1000;  
		threshold = 20;  
		softBrightness = 30;  
		softContrast = 30;  
		colorImg.allocate(camWidth, camHeight);  
		colorImgDistorted.allocate(camWidth, camHeight);  
		grayImg.allocate(camWidth, camHeight);  
		grayBg.allocate(camWidth, camHeight);  
		grayDiff.allocate(camWidth, camHeight);  
	}  
  
	void updateThread(){  
		if (videoPixels == NULL) return;   // no frame delivered yet  
		isReady = false;  
		colorImgDistorted.setFromPixels(videoPixels, camWidth, camHeight);  
		colorImg = colorImgDistorted;  
		colorImg.undistort(	0.052606, -0.121024,  
						   -0.000382, 0.001259,  
						   333.119415, 333.770691,  
						   203.354645, 148.207092);  
		colorImg.mirror(false, false);  
		grayImg = colorImg;  
		if (softBlur > 0) grayImg.blur(softBlur);  
		brightnessContrast.setBrightnessAndContrast(grayImg, softBrightness, softContrast);  
		if (bLearnBakground == true){  
			grayBg = grayImg; // the = sign copies the pixels from grayImg into grayBg (operator overloading)  
			bLearnBakground = false;  
		}  
		grayDiff.absDiff(grayBg, grayImg);  
		contourFinder.findContours(grayDiff, minBlobArea, (camWidth*camHeight)/2, 10, false);	// find holes  
		isReady = true;  
	}  
};  

I wonder if it is the problem I was having with the textures - setTexture(false) wasn't working for me, so I changed bUseTexture = false; in ofxCvImage directly.

hey, thanks!

That stopped the error.
I still have to test if the contours are actually being found, but I guess I am good now.

Thanks a lot.

Do you know why we need to make this hack?

So I tried it out.

It works fine. I can tell my main loop to display

		thread.colorImg.draw(0, 300, 160, 120);  
		thread.grayImg.draw(0, 420, 160, 120);  
		thread.grayDiff.draw(160, 300, 160, 120);  
		thread.grayBg.draw(160, 420, 160, 120);  

But when I tell my thread to take a new background image, it works for a split second and then I get another bad access error.

Here is my project.
It uses ofxQTKitVideoGrabber, but this can easily be changed to ofVideoGrabber.

any ideas?

You have a few addons that I don't, so I wasn't able to compile your project.

I am not sure why this hack is needed. When my project was crashing, I just noticed in the debugger that bUseTexture was set to true even though I had set it to false before. Maybe there is a bug in the setter, or something else is setting it back?
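One guess (just a hypothesis, not verified against the ofxOpenCv source): any code path that re-creates the image internally, such as an assignment or a resize, could hand back a freshly constructed object whose flag is back at the default true, which would look exactly like "something is setting it back". A toy stand-in (not the real ofxCvImage) showing the effect:

```cpp
// Toy stand-in for ofxCvImage (NOT the real class). The point: any
// operation that returns a freshly constructed image restores the
// default flag, silently undoing an earlier setUseTexture(false).
struct ToyCvImage {
    bool bUseTexture = true;                 // default, as in ofxCvImage
    void setUseTexture(bool b) { bUseTexture = b; }
};

// Stand-in for an operation that rebuilds the image internally.
ToyCvImage rebuilt(const ToyCvImage&) {
    return ToyCvImage();                     // fresh object: flag is true again
}
```

So `img.setUseTexture(false); img = rebuilt(img);` ends with bUseTexture true again, and hardcoding the default to false sidesteps every such path at once.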

Sorry I can’t help more - I am only about a month into playing with OF/C++


I forgot to reset my ROI.
After adding cvResetImageROI(colorImg.getCvImage()); everything worked.
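That fits what the first post ran into: with OpenCV's legacy C API, once cvSetImageROI is called on an IplImage (the haar finder does this internally while scanning), every later operation is scoped to that sub-rectangle until cvResetImageROI is called. A stdlib-only model of that scoping, with made-up names rather than OpenCV's API:

```cpp
// Minimal model of OpenCV's image ROI: once set, every operation sees
// only the sub-rectangle until it is explicitly reset (cvResetImageROI).
struct Roi { int x = 0, y = 0, w = 0, h = 0; bool active = false; };

struct Image {
    int width = 640, height = 480;
    Roi roi;

    void setRoi(int x, int y, int w, int h) {
        roi.x = x; roi.y = y; roi.w = w; roi.h = h;
        roi.active = true;
    }
    void resetRoi() { roi = Roi(); }

    // Like cvGetSize: reports the ROI size while a ROI is active,
    // which is why a forgotten reset makes later code misbehave.
    int effectiveWidth()  const { return roi.active ? roi.w : width; }
    int effectiveHeight() const { return roi.active ? roi.h : height; }
};
```

So any code that sets an ROI and bails out (or crashes) before resetting it leaves the image in a state where the next setFromPixels/copy sees the wrong size, which is a classic source of bad-access errors.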