Hi All!
I’ve been having a problem with what appears to be a lag or maybe buffering in ofVideoGrabber and would really appreciate a bit of help.
THE BACKGROUND:
We have an app where the relationship between what's shown on screen and what the camera records at any one moment is really important: I need to know that when I'm showing frame x on screen at time t, the grab at time t shows exactly frame x. Because of that, ofxQtVideoSaver isn't much use to us, since it occasionally drops frames.
HOW WE ARE GRABBING
We are trying to grab quickly while keeping that relationship between screen and camera intact, so each frame we push a heap copy of the grabber's pixels into a vector of unsigned char pointers, and then write them all out to files at the end. Like this:

// each frame, after drawing to the screen:
unsigned char * someLocalPixels = new unsigned char[camWidth * camHeight * 3];
memcpy(someLocalPixels, pixels, camWidth * camHeight * 3); // pixels comes from the grabber's getPixels()
imgPixels.push_back(someLocalPixels);

// then at the end, save every stored frame to a jpg:
for (int i = 0; i < imgPixels.size(); i++)
{
    cameraCapture.setFromPixels(imgPixels[i], camWidth, camHeight, OF_IMAGE_COLOR, true);
    cameraCapture.saveImage(whichAnalysis+"_"+ofToString(i-latency)+"_"+ofToString(i)+".jpg");
}
THE PROBLEM
The problem is that when we look at the images saved out, there are about three frames at the beginning which have clearly been recorded BEFORE we started showing things to the screen.
HELP? :’(
Can anyone tell me whether ofVideoGrabber buffers a few frames? I.e., when we call getPixels() on the grabber object, are those definitely the pixels from the camera image captured this frame, or could they be from a couple of frames earlier?
I really hope that’s clear enough to understand.
We are using oF 007 and running OS X 10.7 Lion on a MacBook Pro (i7, 2.2 GHz).
thanks everyone.