Can't find ofxCvBlobTracker.h

Hi all. I have searched high and low on the forums, wiki, and website for ofxCvBlobTracker.h.

I have found ofCvBlobTracker.h again and again, and I re-downloaded the 10.06 release to see if it was included there, but somehow I am missing something. Could someone suggest a link to a current version of ofxCvBlobTracker.h, ofxCvBlobTracker.cpp, and ofxCvTrackedBlob.h?

Thanks very much!

I’ve never used those addons, but I found this on Google Code:

http://code.google.com/p/mea3-10am333-2010/source/browse/Projekt/Code/-mea3-project/-ExtraAddons-/ofxOpenCvBlobTracking/?r=304

Excellent. Thank you very much, mst1228. I didn’t think to look on Google Code.

I have successfully incorporated the ofxCvBlob addons with the stefanix code to achieve a working blob-tracker. I have been working on this for many hours and finally have it right.

For those of you who may, like me, be trying to get blob tracking working in the current version of oF, here are some tips:

  1. Download the updated stefanix code posted by olliepalmer here: http://forum.openframeworks.cc/t/ofcvblobtracker/717/20.* I had to change some things in this code, so copy and paste the code I have posted below into the testApp.h and testApp.cpp files.

  2. Go to the Google Code link and create each of the ofxCvBlob files: copy and paste the code, save the files, move them into your addons folder, then link them to your project: http://code.google.com/p/mea3-10am333-2010/source/browse/Projekt/Code/-mea3-project/-ExtraAddons-/ofxOpenCvBlobTracking/?r=555

  3. Update the olliepalmer testApp.h and testApp.cpp files with the code I have posted below.

  4. Bonus tip for the dazed and confused: once the app is running, the tracking info can be viewed under the Xcode menu Run > Console.

*I apologize if this is the most backwards way of making something simple happen, but it took me forever to get this to work.

Now on to configuring the code for my application…

The working testApp.h code is here:

  
  
#include "ofMain.h"  
#include "ofxCvMain.h"  
  
#define _USE_LIVE_VIDEO	//	comment this to use a movie  
  
class testApp : public ofSimpleApp, public ofxCvBlobListener {  
  
  public:  
  
    int cwidth;  
    int cheight;  
	  
#ifdef _USE_LIVE_VIDEO  
	ofVideoGrabber 		vidGrabber;  
#else  
	ofVideoPlayer 		vidPlayer;  
#endif  
	  
    ofxCvColorImage		colorImg;  
    ofxCvGrayscaleImage  grayImg;  
    ofxCvGrayscaleImage  bgImg;  
    ofxCvContourFinder	contourFinder;  
    ofxCvBlobTracker		blobTracker;  
      
	int threshold;  
	bool bLearnBakground;  
	      
    void setup();  
    void update();  
    void draw();  
  
    void keyPressed( int key );  
    void mouseMoved( int x, int y );  
    void mouseDragged( int x, int y, int button );  
    void mousePressed( int x, int y, int button );  
    void mouseReleased();  
      
    void blobOn( int x, int y, int id, int order );  
    void blobMoved( int x, int y, int id, int order );      
    void blobOff( int x, int y, int id, int order );      
          
};  
  

The testApp.cpp code is here:

  
  
#include "testApp.h"  
   
  
//--------------------------------------------------  
void testApp::setup() {  
	  
	ofSetFrameRate( 60 );  
    cwidth = 320;		// video capture width (adjust to your camera/movie)  
    cheight = 240;		// video capture height  
	threshold = 60;  
	  
	bLearnBakground = true;  
	  
#ifdef _USE_LIVE_VIDEO	// use the live camera (defined in testApp.h); otherwise load a movie  
	vidGrabber.setVerbose(true);  
	vidGrabber.initGrabber(cwidth,cheight);  
#else  
	vidPlayer.loadMovie("Movie28.mov");  
	vidPlayer.play();  
#endif  
  
	colorImg.allocate(cwidth,cheight);  
	grayImg.allocate(cwidth,cheight);  
	bgImg.allocate(cwidth,cheight);  
	  
    blobTracker.setListener( this );  
}  
  
//--------------------------------------------------  
void testApp::update() {  
	ofBackground( 36,37,41 );  
  
	bool bNewFrame = false;  
	  
#ifdef _USE_LIVE_VIDEO  
	vidGrabber.grabFrame();  
	bNewFrame = vidGrabber.isFrameNew();  
#else  
	vidPlayer.idleMovie();  
	bNewFrame = vidPlayer.isFrameNew();  
#endif  
	  
	if(bNewFrame) {  
#ifdef _USE_LIVE_VIDEO  
		colorImg.setFromPixels(vidGrabber.getPixels(), cwidth, cheight);  
#else  
		colorImg.setFromPixels(vidPlayer.getPixels(), cwidth, cheight);  
#endif  
		  
        colorImg.mirror( false, true );          
        grayImg = colorImg;  
       
        if( bLearnBakground ) {  
            bgImg = grayImg;  
            bLearnBakground = false;  
        }      
  
        grayImg.absDiff( bgImg );  
        grayImg.blur( 11 );  
        grayImg.threshold( threshold );  
  
        // findContours( img, minArea, maxArea, nConsidered, findHoles )  
        contourFinder.findContours( grayImg, 10, 20000, 10, false );  
        blobTracker.trackBlobs( contourFinder.blobs );  
    }  
      
}  
  
//--------------------------------------------------  
void testApp::draw() {  
	ofSetColor( 0xffffff );  
      
    colorImg.draw(20,20);  
    grayImg.draw(20+cwidth+20+cwidth+20,20);  
	  
	ofSetColor(255, 0, 131);  
	bgImg.draw(20+cwidth+20, 20);  
    blobTracker.draw(20+cwidth+20+cwidth+20,20 );  
	  
	ofSetColor(255,0,131);  
    ofDrawBitmapString( "[space] to learn background\n[+]/[-] to adjust threshold",   
                        20,280 );  
	  
}  
  
//--------------------------------------------------  
void testApp::keyPressed( int key ) {  
    if( key == ' ' ) {  
        bLearnBakground = true;  
    } else if( key == '-' ) {  
        threshold = MAX( 0, threshold-1 );  
    } else if( key == '+' || key == '=' ) {  
        threshold = MIN( 255, threshold+1 );  
    }  
}  
  
  
//--------------------------------------------------  
void testApp::mouseMoved( int x, int y ) {}	  
void testApp::mouseDragged( int x, int y, int button ) {}  
void testApp::mousePressed( int x, int y, int button ) {}  
void testApp::mouseReleased() {}  
  
  
/*  
 *  
 *	blob section  
 *	  
 *	from here on in it's blobs  
 *	thanks to stefanix and the opencv library :)  
 *	  
 */  
  
//--------------------------------------------------  
void testApp::blobOn( int x, int y, int id, int order ) {  
    cout << "blobOn() - id:" << id << " order:" << order << endl;  
}  
  
void testApp::blobMoved( int x, int y, int id, int order) {  
    cout << "blobMoved() - id:" << id << " order:" << order << endl;  
      
    // full access to blob object ( get a reference)  
    ofxCvTrackedBlob blob = blobTracker.getById( id );  
    cout << "area: " << blob.area << endl;  
}  
  
void testApp::blobOff( int x, int y, int id, int order ) {  
    cout << "blobOff() - id:" << id << " order:" << order << endl;  
}  
  

And suddenly… after loads of glorious tracking information in my console, I quit Xcode, re-opened each copy of my working project, and now I have zero tracking information displaying in the console. It simply says:

[Session started at 2011-07-15 19:48:23 -0400.]

(unavailable) device[0] DV Video
(unavailable) device[1] DVCPRO HD (1080i50)
(unavailable) device[2] DVCPRO HD (1080i60)
(unavailable) device[3] DVCPRO HD (720p25/50)
(unavailable) device[4] DVCPRO HD (720p60)
(unavailable) device[5] Google Camera Adapter 0
(unavailable) device[6] Google Camera Adapter 1
(unavailable) device[7] IIDC FireWire Video
device[8] USB Video Class Video - Built-in iSight

OF_WARNING: No camera settings to load

Feeling a bit sad after having thought I had my tracking info working.

Any advice?

…ok more than a bit sad. Kinda ??? :-\ :’( and >:(

Did you select the right video device? In your case, number eight ( vidGrabber.setDeviceID(8) ).

Strange though, I thought the built-in iSight was always the default (device 0).
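
For anyone following along, here is a rough sketch of what that device selection could look like in testApp::setup(), using the vidGrabber member from the code above (listDevices() just prints the available capture devices to the console; the index 8 comes from the device list quoted earlier):

#ifdef _USE_LIVE_VIDEO  
	vidGrabber.setVerbose(true);  
	vidGrabber.listDevices();                  // print the available capture devices and their indices  
	vidGrabber.setDeviceID(8);                 // pick device[8], the built-in iSight, before initGrabber()  
	vidGrabber.initGrabber(cwidth, cheight);  
#endif  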

Thanks underdoeg. The camera is showing the video; it's the tracking info in the console that has disappeared.
This stuff:
blobMoved() - id:13 order:7
area: 4485
blobMoved() - id:8 order:4
area: 415

But! It miraculously started producing tracking data again after another hour of tinkering today. The next step is to figure out how to harness the tracking info to move some animations left and right. I'm not sure yet how to link my animated image sequence classes to the data from the moving blobs; I'm working on that now.
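
Not part of the code above, but as a rough sketch of one way to hook the tracking data up to an animation: keep the latest blob x in a member variable inside blobMoved(), map it to screen space, and read it when drawing. The animX member and the imageSequence object below are hypothetical placeholders for your own classes:

// hypothetical addition to testApp.h:  
//     float animX;    // screen-space x position driven by the tracked blob  
  
// extend blobMoved() to map the blob's camera-space x (0..cwidth) to the window width:  
void testApp::blobMoved( int x, int y, int id, int order ) {  
    cout << "blobMoved() - id:" << id << " order:" << order << endl;  
    animX = ofMap( x, 0, cwidth, 0, ofGetWidth(), true );  
}  
  
// then in testApp::draw(), position the animation with it, e.g.:  
//     imageSequence.draw( animX, 300 );    // imageSequence = your own image-sequence class  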

:)