Best way to implement simple motion tracking

I think I’ve got an easy question. I want to use OpenCV to tell me when something moves in the camera feed. I don’t need to know where it moved, since nothing will be displayed (I’m just queuing up videos). I played around with the OpenCV example and I can do it with contourFinder.nBlobs, triggering whenever the blob count changes, but I don’t think that’s really what I want.

I think the most reliable way would be to capture an image and continually compare it to the latest frame pixel by pixel (with a little leeway for noise), then replace the stored image whenever it changes. I’m not sure exactly how to do that, though, or whether there is a better way. Any advice would be appreciated.
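The pixel-by-pixel idea above can be sketched in plain C++ without any OpenCV at all; the function name, buffer layout (8-bit greyscale, one byte per pixel), and the `leeway` parameter are all hypothetical, just to show the comparison logic:

```cpp
#include <cstdlib>
#include <vector>

// Count pixels that differ by more than 'leeway' between two greyscale
// frames of the same size. This is a plain-C++ sketch of the
// pixel-by-pixel comparison idea, not an openFrameworks/OpenCV API.
int countChangedPixels(const std::vector<unsigned char>& prev,
                       const std::vector<unsigned char>& curr,
                       int leeway) {
    int changed = 0;
    for (std::size_t i = 0; i < prev.size() && i < curr.size(); ++i) {
        int diff = static_cast<int>(curr[i]) - static_cast<int>(prev[i]);
        if (std::abs(diff) > leeway) {
            ++changed;  // this pixel moved beyond the noise leeway
        }
    }
    return changed;
}
```

You would call this once per frame and treat a non-zero (or above-some-count) result as motion, then copy the current frame into the stored one.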



hi Clay,

if you’re using the openCvExample, then you’ve already got it right there. when the image is constant, the background subtraction leaves nothing, so there are no blobs. but when there’s motion, blobs appear in the difference image.

so to check for motion, just test if nBlobs > 0.
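In terms of the openCvExample’s own update loop, that test looks roughly like this (a non-runnable sketch; variable names follow the example, and the `findContours` min/max area values are just placeholders):

```cpp
// inside update(), after grabbing a new camera frame into grayImage:
grayDiff.absDiff(grayBg, grayImage);   // subtract the stored background
grayDiff.threshold(threshold);         // binarize to suppress camera noise
contourFinder.findContours(grayDiff, 20, (340 * 240) / 3, 10, false);

bool motion = (contourFinder.nBlobs > 0);  // any blob means something moved
```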

Oh right, I can just keep recalibrating the camera when there’s movement. Awesome, thanks!

If you do want to simply compare the current frame to the previous one and check against a threshold, you can do:

greyCurDiff.absDiff(greyPrev, greyNow);	// greyCurDiff = absolute difference between previous and current frame  
cvThreshold(greyCurDiff.getCvImage(), greyCurDiff.getCvImage(), diffThreshold, 255, CV_THRESH_TOZERO); // drop anything below diffThreshold to zero (compensates for camera noise)  
int numPixelsChanged = greyCurDiff.countNonZeroInRegion(0, 0, width, height); // how many pixels changed by more than diffThreshold  
greyPrev = greyNow; // save the current frame for the next loop
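Once you have numPixelsChanged, you still need a rule for deciding that motion actually happened. A common choice is to require some fraction of the frame to change; the helper name and the 1% default below are purely illustrative:

```cpp
// Hypothetical helper: treat the frame as "moving" when more than
// minFraction of its pixels changed beyond the noise threshold.
bool motionDetected(int numPixelsChanged, int width, int height,
                    float minFraction = 0.01f) {
    int minPixels = static_cast<int>(width * height * minFraction);
    return numPixelsChanged > minPixels;
}
```

Tuning minFraction (and diffThreshold above) against your actual camera is the main work; a noisy webcam will need higher values than a clean capture source.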