ofxQTKitVideoGrabber: QTKit & CoreVideo + OF

Hello OF board,

This is a follow-up to my post about some artifacts I found when working with certain video cameras here (http://forum.openframeworks.cc/t/video-aliasing-artifiacts/3458/0).

I’ve put together an implementation of ofVideoGrabber that runs off of the QTKit and Core Video libraries. There is no difference in use from the standard video grabber, so it should be easy to add to existing projects.

It requires OS X 10.5+. You'll have to add the QTKit and CoreVideo frameworks to your project for it to compile.

A few other perks come with using this: it's GPU enabled and multi-threaded, and it supports more QuickTime capture codecs, such as HDV over FireWire.
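For example, here's a rough sketch of the drop-in usage — the same calls you'd make on the stock ofVideoGrabber (the testApp fragments below are just for illustration):

#include "ofMain.h"
#include "ofxQTKitVideoGrabber.h"

ofxQTKitVideoGrabber grabber;		// e.g. a member of testApp

void testApp::setup(){
	grabber.setDeviceID(0);			// optional, defaults to device 0
	grabber.initGrabber(640, 480);
}

void testApp::update(){
	grabber.update();				// grabs the latest frame if one is available
}

void testApp::draw(){
	grabber.draw(20, 20);			// draws the internal texture
	// grabber.getPixels() gives you the RGB pixels if you need them
}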

best,

ofxQTKitVideoGrabber.zip

Jim, thanks so much for posting this, and thanks Zach for telling me about this thread.
I’m using the Canopus ADVC-110 capture device and was terribly annoyed by the blocky artifacts from Apple’s DV compression. ofxQTKitVideoGrabber produces much sharper results.

Here’s a comparison:

-golan

Thanks obviousjim for posting this.

It helped me get access to my USB webcam's controls via this great UVC class:

http://forum.openframeworks.cc/t/uvc-camera-control-in-mac-os-x/3917/1

thx,
stephan.

Hi,

Do you think this video grabber can be put in a separate thread?
My first try throws a bad access error.

cheers,
stephan.

I would be interested too, but AFAIK this pushes a texture to the graphics card and makes the texture ID available,
so it must stay in the OpenGL thread, or more precisely in the OpenGL context.
That's why you get the bad access error.
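To sketch what I mean (rough and untested, assuming the ofxThread addon): keep the grabber and its texture on the main/GL thread, and only hand a copy of the pixels to a worker thread for the heavy processing.

#include "ofMain.h"
#include "ofxThread.h"
#include <cstring>

// rough sketch: a worker thread that only ever touches pixel buffers, never OpenGL
class PixelWorker : public ofxThread {
  public:
	unsigned char* buffer;
	int w, h;
	bool hasWork;

	PixelWorker(int width, int height){
		w = width; h = height;
		buffer = new unsigned char[w*h*3];
		hasWork = false;
	}
	~PixelWorker(){
		delete[] buffer;
	}

	// called from the main thread, right after grabber.update()
	void pushPixels(unsigned char* pixels){
		lock();
		memcpy(buffer, pixels, w*h*3);
		hasWork = true;
		unlock();
	}

	// runs on the worker thread -- no texture / OpenGL calls in here
	void threadedFunction(){
		while(isThreadRunning()){
			lock();
			if(hasWork){
				// ...analysis on buffer (blob tracking etc.)...
				hasWork = false;
			}
			unlock();
			ofSleepMillis(1);
		}
	}
};

You'd start it once with worker.startThread(true, false), keep calling grabber.update() and worker.pushPixels(grabber.getPixels()) on the main thread, and leave all texture/draw calls where they are.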

I don't have an answer to this question.
But I have a workaround: I am running the vidGrabber in a separate app and sending my blob tracking results + blob image over ofxNetwork to the main app.

it’s discussed here:
http://forum.openframeworks.cc/t/sending-images-over-network-sendrawbytes/603/0
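Roughly what that looks like (a sketch along the lines of that thread, assuming ofxTCPClient / ofxTCPServer from ofxNetwork; the port is made up, and in practice you have to deal with partial reads and frame boundaries, which is what the linked thread is about):

#include "ofMain.h"
#include "ofxNetwork.h"

// ---- grabber app (sketch): send each new frame as raw RGB bytes ----
ofxTCPClient client;

void setupSender(){
	client.setup("127.0.0.1", 11999);		// hypothetical port
}

void sendFrame(ofxQTKitVideoGrabber& vidGrabber, int camWidth, int camHeight){
	if(vidGrabber.isFrameNew()){
		client.sendRawBytes((const char*)vidGrabber.getPixels(), camWidth*camHeight*3);
	}
}

// ---- main app (sketch): receive the bytes and upload them to a texture ----
ofxTCPServer server;
unsigned char* incomingPixels;				// allocated to camWidth*camHeight*3 in setup

void receiveFrame(ofTexture& videoTexture, int camWidth, int camHeight){
	if(server.getNumClients() > 0){
		int n = server.receiveRawBytes(0, (char*)incomingPixels, camWidth*camHeight*3);
		if(n == camWidth*camHeight*3){
			videoTexture.loadData(incomingPixels, camWidth, camHeight, GL_RGB);
		}
	}
}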

still it would be great to have the QTKitVideoGrabber run in a separate thread.

stephan.

I think I got it.

Basically I duplicated what I saw in ofVideoGrabber,
and it seems to work.

s.

ofxQTKitVideoGrabber.h

  
#ifndef _OFXPH_QTKITVIDEOGRABBER
#define _OFXPH_QTKITVIDEOGRABBER
  
#include "ofMain.h"  
  
class ofxQTKitVideoGrabber {  
  public:  
	ofxQTKitVideoGrabber();  
	~ofxQTKitVideoGrabber();  
     
	void			initGrabber(int w, int h);  
	void			grabFrame();  
	bool			isFrameNew();  
	void			update();  
  
	void 			listDevices();  
	void			close();  
	unsigned char 	* getPixels();  
	ofTexture &		getTextureReference();  
	void 			setUseTexture(bool bUse);  
	void 			setVerbose(bool bTalkToMe);  
	void			setDeviceID(int deviceID);  
	void			setDesiredFrameRate(int framerate){ ofLog(OF_LOG_WARNING, "ofxQTKitVideoGrabber -- Cannot specify framerate.");  };  
	void			videoSettings() { ofLog(OF_LOG_WARNING, "ofxQTKitVideoGrabber -- No video settings available.");  };  
	void 			draw(float x, float y, float w, float h);  
	void 			draw(float x, float y);  
	  
	float 			getHeight();  
	float 			getWidth();  
	  
  protected:  
	bool					bUseTexture;  
	  
	bool confirmInit();  
	bool isInited;  
	int deviceID;  
	void* grabber;  
};  
  
#endif  

ofxQTKitVideoGrabber.mm

  
/*  
 *  ofxQTKitVideoGrabber.cpp  
 *  
 *  Created by James George on 3/9/10.  
 *    
 *  
 */  
  
#include "ofxQTKitVideoGrabber.h"  
#import "Cocoa/Cocoa.h"  
#import "QTKit/QTKit.h"  
  
static inline void argb_to_rgb(unsigned char* src, unsigned char* dst, int numPix)  
{  
	for(int i = 0; i < numPix; i++){  
		memcpy(dst, src+1, 3);  
		src+=4;  
		dst+=3;  
	}	  
}  
  
@interface QTKitVideoGrabber : QTCaptureVideoPreviewOutput  
{  
    QTCaptureSession *session;  
	QTCaptureDeviceInput *videoDeviceInput;  
	NSInteger width, height;  
	  
	CVImageBufferRef cvFrame;  
	ofTexture* texture;  
	unsigned char* pixels;	  
  
	BOOL isRunning;  
	BOOL hasNewFrame;  
	BOOL isFrameNew;  
	  
	BOOL verbose;  
	  
	BOOL bUseTexture;  
}  
  
@property(nonatomic, readonly) NSInteger height;  
@property(nonatomic, readonly) NSInteger width;  
@property(nonatomic, retain) QTCaptureSession* session;  
@property(nonatomic, retain) QTCaptureDeviceInput* videoDeviceInput;  
@property(nonatomic, readonly) BOOL isRunning;  
@property(readonly) unsigned char* pixels;  
@property(readonly) ofTexture* texture;  
@property(readonly) BOOL isFrameNew;  
@property(nonatomic, readwrite) BOOL verbose;  
  
- (id) initWithWidth:(NSInteger)width   
			  height:(NSInteger)height   
			  device:(NSInteger)deviceID;  
  
- (void) outputVideoFrame:(CVImageBufferRef)videoFrame   
		 withSampleBuffer:(QTSampleBuffer *)sampleBuffer   
		   fromConnection:(QTCaptureConnection *)connection;  
  
- (void) update;  
  
- (void) stop;  
  
- (void) listDevices;  
  
  
@end  
  
  
@implementation QTKitVideoGrabber  
@synthesize width, height;  
@synthesize session;  
@synthesize videoDeviceInput;  
@synthesize pixels;  
@synthesize texture;  
@synthesize isFrameNew;  
@synthesize verbose;  
  
- (id) initWithWidth:(NSInteger)_width height:(NSInteger)_height device:(NSInteger)deviceID  
{  
	if(self = [super init]){  
		//configure self  
		width = _width;  
		height = _height;		  
		[self setPixelBufferAttributes: [NSDictionary dictionaryWithObjectsAndKeys:   
										 [NSNumber numberWithInt: kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey,  
										 [NSNumber numberWithInt:width], kCVPixelBufferWidthKey,   
										 [NSNumber numberWithInt:height], kCVPixelBufferHeightKey,   
										 [NSNumber numberWithBool:YES], kCVPixelBufferOpenGLCompatibilityKey,  
										nil]];	  
		  
		//instance variables  
		cvFrame = NULL;  
		hasNewFrame = false;
		bUseTexture = YES;	//NOTE: default to allocating a texture (this ivar isn't wired up to the C++ setUseTexture())
		if (bUseTexture){
			texture = new ofTexture();
			texture->allocate(_width, _height, GL_RGB);
		}
		pixels = (unsigned char*)calloc(sizeof(char), _width*_height*3);  
		  
		//set up device  
		NSArray* videoDevices = [[QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeVideo]   
						 arrayByAddingObjectsFromArray:[QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeMuxed]];  
		  
		if(verbose) ofLog(OF_LOG_VERBOSE, "ofxQTKitVideoGrabber -- Device List:  %s", [[videoDevices description] cString]);  
			  
		NSError *error = nil;  
		BOOL success;  
		  
		//start the session  
		self.session = [[QTCaptureSession alloc] init];  
		success = [self.session addOutput:self error:&error];  
		if( !success ){  
			ofLog(OF_LOG_ERROR, "ofxQTKitVideoGrabber - ERROR - Error adding output");  
			return nil;  
		}  
  
		// Try to open the new device  
		if(deviceID >= videoDevices.count){  
			ofLog(OF_LOG_ERROR, "ofxQTKitVideoGrabber - ERROR - Error selected a nonexistent device");  
			deviceID = videoDevices.count - 1;  
		}  
		  
		QTCaptureDevice* selectedVideoDevice = [videoDevices objectAtIndex:deviceID];  
		success = [selectedVideoDevice open:&error];  
		if (selectedVideoDevice == nil || !success) {  
			ofLog(OF_LOG_ERROR, "ofxQTKitVideoGrabber - ERROR - Selected device not opened");  
			return nil;  
		}  
		else {   
			if(verbose) ofLog(OF_LOG_VERBOSE, "ofxQTKitVideoGrabber -- Attached camera %s", [[selectedVideoDevice description] cString]);  
			  
			// Add the selected device to the session  
			videoDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:selectedVideoDevice];  
			success = [session addInput:videoDeviceInput error:&error];  
			if(!success) ofLog(OF_LOG_ERROR, "ofxQTKitVideoGrabber - ERROR - Error adding device to session");	  
  
			//start the session  
			[session startRunning];  
		}  
	}  
	return self;  
}  
  
//Frame from the camera  
//this tends to be fired on a different thread, so keep the work really minimal  
- (void) outputVideoFrame:(CVImageBufferRef)videoFrame   
		 withSampleBuffer:(QTSampleBuffer *)sampleBuffer   
		   fromConnection:(QTCaptureConnection *)connection  
{  
	CVImageBufferRef toRelease = cvFrame;  
	CVBufferRetain(videoFrame);  
	@synchronized(self){  
		cvFrame = videoFrame;  
		hasNewFrame = YES;  
	}	  
	if(toRelease != NULL){  
		CVBufferRelease(toRelease);  
	}  
}  
  
- (void) update  
{  
	@synchronized(self){  
		if(hasNewFrame){  
			CVPixelBufferLockBaseAddress(cvFrame, 0);  
			unsigned char* src = (unsigned char*)CVPixelBufferGetBaseAddress(cvFrame);
			  
			//I wish this weren't necessary, but  
			//in my tests the only performant & reliabile  
			//pixel format for QTCapture is k32ARGBPixelFormat,   
			//to my knowledge there is only RGBA format  
			//available to gl textures  
			  
			//convert pixels from ARGB to RGB			  
			argb_to_rgb(src, pixels, width*height);  
			if (bUseTexture){
				texture->loadData(pixels, width, height, GL_RGB);
			}
			CVPixelBufferUnlockBaseAddress(cvFrame, 0);  
			hasNewFrame = NO;  
			isFrameNew = YES;  
		}  
		else{  
			isFrameNew = NO;  
		}  
	}	  
}  
  
- (void) stop  
{  
	if(self.isRunning){  
		[self.session stopRunning];  
	}	  
	  
	self.session = nil;  
	  
	free(pixels);  
	delete texture;  
}  
  
  
- (BOOL) isRunning  
{  
	return self.session && self.session.isRunning;  
}  
  
- (void) listDevices  
{  
	NSLog(@"ofxQTKitVideoGrabber devices %@",   
		  [[QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeVideo]   
				arrayByAddingObjectsFromArray:[QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeMuxed]]);  
	  
}  
  
@end  
  
  
ofxQTKitVideoGrabber::ofxQTKitVideoGrabber()  
{  
	deviceID = 0;  
	grabber = NULL;  
	isInited = false;  
	bUseTexture				= true;  
}  
  
ofxQTKitVideoGrabber::~ofxQTKitVideoGrabber()  
{  
	if(isInited){  
		close();  
	}  
}  
  
void ofxQTKitVideoGrabber::setDeviceID(int _deviceID)  
{  
	deviceID = _deviceID;  
	if(isInited){  
		//reinit if we are running...  
		//should be able to hot swap, but this is easier for now.  
		int width  = ((QTKitVideoGrabber*)grabber).width;  
		int height = ((QTKitVideoGrabber*)grabber).height;  
		  
		close();  
		  
		initGrabber(width, height);  
	}  
}  
  
void ofxQTKitVideoGrabber::initGrabber(int w, int h)  
{  
	NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];  
	grabber = [[QTKitVideoGrabber alloc] initWithWidth:w height:h device:deviceID];  
	  
	isInited = (grabber != nil);  
	  
	[pool release];	  
}  
  
void ofxQTKitVideoGrabber::update(){   
	grabFrame();   
}  
  
void ofxQTKitVideoGrabber::grabFrame()  
{  
	if(confirmInit()){  
		[(QTKitVideoGrabber*)grabber update];  
	}  
}  
  
bool ofxQTKitVideoGrabber::isFrameNew()  
{  
	return isInited && [(QTKitVideoGrabber*)grabber isFrameNew];  
}  
  
void ofxQTKitVideoGrabber::listDevices()  
{  
	if(confirmInit()){  
		[(QTKitVideoGrabber*)grabber listDevices];  
	}  
}  
  
void ofxQTKitVideoGrabber::close()  
{  
	  
	[(QTKitVideoGrabber*)grabber stop];  
	[(QTKitVideoGrabber*)grabber release];  
	isInited = false;	  
}  
  
unsigned char* ofxQTKitVideoGrabber::getPixels()  
{  
	if(confirmInit()){  
		return [(QTKitVideoGrabber*)grabber pixels];  
	}  
	return NULL;  
}  
  
//------------------------------------  
void ofxQTKitVideoGrabber::setUseTexture(bool bUse){  
	bUseTexture = bUse;  
}  
  
ofTexture &	ofxQTKitVideoGrabber::getTextureReference()  
{  
	if(confirmInit()){  
		return *[(QTKitVideoGrabber*)grabber texture];  
	}  
}  
  
void ofxQTKitVideoGrabber::setVerbose(bool bTalkToMe)  
{  
	if(confirmInit()){  
		((QTKitVideoGrabber*)grabber).verbose = bTalkToMe;  
	}  
}  
  
void ofxQTKitVideoGrabber::draw(float x, float y, float w, float h)  
{  
	if(confirmInit()){  
		if (bUseTexture){  
		[(QTKitVideoGrabber*)grabber texture]->draw(x, y, w, h);  
		}  
	}  
}  
  
void ofxQTKitVideoGrabber::draw(float x, float y)  
{  
	if(confirmInit()){  
		if (bUseTexture){  
			[(QTKitVideoGrabber*)grabber texture]->draw(x, y);  
		}  
	}  
}  
  
float ofxQTKitVideoGrabber::getHeight()  
{  
	if(confirmInit()){  
		return (float)((QTKitVideoGrabber*)grabber).height;  
	}  
	return 0;  
}  
  
float ofxQTKitVideoGrabber::getWidth()  
{  
	if(confirmInit()){  
		return (float)((QTKitVideoGrabber*)grabber).width;  
	}  
	return 0;  
	  
}  
		    
bool ofxQTKitVideoGrabber::confirmInit()  
{  
	if(!isInited){  
		ofLog(OF_LOG_ERROR, "ofxQTKitVideoGrabber -- ERROR -- Calling method on non intialized video grabber");  
	}  
	return isInited;  
}  
  

Hi, I have been trying to change the video grabber example to work with the QTKit video grabber.

I am not experienced with openFrameworks and am only guessing, so apologies in advance for a stupid question, but here is my code:

  
#ifndef _TEST_APP  
#define _TEST_APP  
  
  
#include "ofMain.h"  
  
class testApp : public ofBaseApp{  
	  
	public:  
		  
		void setup();  
		void update();  
		void draw();  
		  
		void keyPressed(int key);  
		void keyReleased(int key);  
		void mouseMoved(int x, int y );  
		void mouseDragged(int x, int y, int button);  
		void mousePressed(int x, int y, int button);  
		void mouseReleased(int x, int y, int button);  
		void windowResized(int w, int h);  
		  
		ofVideoGrabber 		vidGrabber;  
		unsigned char * 	videoInverted;  
		ofTexture			videoTexture;  
		int 				camWidth;  
		int 				camHeight;  
};  

And then

  
#include "testApp.h"  
#include "ofxQTKitVideoGrabber.h"  
  
  
  
  
  
//--------------------------------------------------------------  
void testApp::setup(){	   
	  
	camWidth 		= 320;	// try to grab at this size.   
	camHeight 		= 240;  
  
	ofxQTKitVideoGrabber().setVerbose(true);  
	ofxQTKitVideoGrabber().initGrabber(camWidth,camHeight);  
	  
	videoInverted 	= new unsigned char[camWidth*camHeight*3];  
	videoTexture.allocate(camWidth,camHeight, GL_RGB);  
	  
}  
  
  
//--------------------------------------------------------------  
void testApp::update(){  
	  
	ofBackground(100,100,100);  
	  
	ofxQTKitVideoGrabber().grabFrame();  
	  
	if (ofxQTKitVideoGrabber().isFrameNew()){  
		int totalPixels = camWidth*camHeight*3;  
		unsigned char * pixels = vidGrabber.getPixels();  
		for (int i = 0; i < totalPixels; i++){  
			videoInverted[i] = 255 - pixels[i];  
		}  
		videoTexture.loadData(videoInverted, camWidth,camHeight, GL_RGB);  
	}  
  
}  
  
//--------------------------------------------------------------  
void testApp::draw(){  
	ofSetColor(0xffffff);  
	ofxQTKitVideoGrabber().draw(20,20);  
	videoTexture.draw(20+camWidth,20,camWidth,camHeight);  
}  
  
  
//--------------------------------------------------------------  
void testApp::keyPressed  (int key){   
	  
	// in fullscreen mode, on a pc at least, the   
	// first time video settings the come up  
	// they come up *under* the fullscreen window  
	// use alt-tab to navigate to the settings  
	// window. we are working on a fix for this...  
	  
	if (key == 's' || key == 'S'){  
		ofxQTKitVideoGrabber().videoSettings();  
	}  
	  
	  
}  
  

If anyone can point me in the right direction it would be great.

Cheers

Here is what works for me on OS X 10.6 with OF 0.61:

www.lozano-hemmer.com/ss/opencvExample3qtkit+focus.zip
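(Not the exact contents of that zip, but the main difference from the code posted above: keep a single ofxQTKitVideoGrabber as a member of testApp and call everything on that member — writing ofxQTKitVideoGrabber().initGrabber(...) and so on constructs a brand-new, uninitialized grabber each time.)

// testApp.h (sketch): declare one grabber as a member, instead of ofVideoGrabber
ofxQTKitVideoGrabber	vidGrabber;

// testApp.cpp (sketch)
void testApp::setup(){
	vidGrabber.setVerbose(true);
	vidGrabber.initGrabber(camWidth, camHeight);
}
// ...and likewise call vidGrabber.grabFrame(), isFrameNew(), getPixels(), and draw() on that same member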

good luck.

Thanks a lot for your reply, but the link does not seem to work.

Cheers

Thanks so much, the link came to life eventually. It was easier than I thought. I managed to get input from an HDV source, but the latency was huge. Does anyone know how to reduce the latency?

Cheers

Hello

So I’m trying to use the ofxQTKitVideoGrabber and ofxQTKitVideoRecorder to record some videos with synchronized audio.
I started by simply running the movieGrabberExample and substituting ofxQTKitVideoGrabber for the ofVideoGrabber. It compiles fine, but when I try to initialize the video device I get the following error:

ofxQTKitVideoGrabber – ERROR – Calling method on non intialized video Recorder

I can see where the error is being reported from: the ofxQTKitVideoGrabber::confirmInit() function.

I'm compiling with the Mac OS X 10.5 SDK,
running OS X 10.5.8,
Xcode 3,
and added the CoreVideo and QTKit frameworks to the project.

What am I missing?

Any Ideas?

Any help would be great

Mauro

Howdy obviousjim and fellow OF hackers: thanks for all the great work on these addons.

I extended ofxQTKitVideoGrabber to allow sync’d video & audio recording.

More info and code at topic,6390.msg30477.html

Oh wow, thank you!
I was having the horrible blocky artifacts with an ADVC-55.
This just made my day. Thanks!