Hi all, I’m working on an iOS app that uses the camera for video, and I am trying to manually lock the exposure and the white balance. I could easily do so (see Apple’s AVCam example) if I could reach the AVCaptureDevice inside AVFoundationVideoGrabber, but I see no way of doing so.
I could subclass ofiPhoneVideoGrabber and expose some custom methods, instantiating an ofiPhoneVideoGrabber myself, but then I would lose all the ofVideoGrabber goodness… what’s the best way to do this elegantly?
There’s even a method already implemented on AVFoundationVideoGrabber “lockExposureAndFocus”, but again, it seems to be unreachable…
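For reference, this is what the lock would look like if the AVCaptureDevice were reachable. A minimal sketch following Apple’s AVCam sample, assuming `device` is the AVCaptureDevice that the grabber owns internally:

```objc
#import <AVFoundation/AVFoundation.h>

// Lock exposure and white balance at their current values.
// 'device' is assumed to be the AVCaptureDevice used by the grabber;
// getting hold of it is exactly the problem discussed in this thread.
static void lockExposureAndWhiteBalance(AVCaptureDevice *device) {
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        if ([device isExposureModeSupported:AVCaptureExposureModeLocked]) {
            device.exposureMode = AVCaptureExposureModeLocked;
        }
        if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked]) {
            device.whiteBalanceMode = AVCaptureWhiteBalanceModeLocked;
        }
        [device unlockForConfiguration];
    } else {
        NSLog(@"lockForConfiguration failed: %@", error);
    }
}
```

Note that `lockForConfiguration:` must succeed before changing either mode, and the capability checks matter because not all devices support locked exposure or white balance.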
That would allow you to get at the video grabber to call the lock method, but it needs a bunch more work. I hope to dive into it soon, but in the meantime it may be another lead…
Hi all,
I know this is an old thread, but it covers roughly what I want to do:
For a specific project, I would like to extend AVFoundationVideoGrabber to read QR codes through AVMetadataObject, i.e. to get hold of AVMetadataMachineReadableCodeObject instances.
So I would have to extend AVFoundationVideoGrabber with a delegate method like:
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection{
}
(https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVMetadataObject_Class/index.html#//apple_ref/occ/cl/AVMetadataObject)
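For context, here is roughly what the metadata path looks like in plain AVFoundation. A hedged sketch, assuming `session` is the AVCaptureSession owned by AVFoundationVideoGrabber and `self` adopts AVCaptureMetadataOutputObjectsDelegate (neither is exposed by the grabber as it stands):

```objc
#import <AVFoundation/AVFoundation.h>

// 1) Wire an AVCaptureMetadataOutput into the existing session.
AVCaptureMetadataOutput *metadataOutput = [[AVCaptureMetadataOutput alloc] init];
if ([session canAddOutput:metadataOutput]) {
    [session addOutput:metadataOutput];
    [metadataOutput setMetadataObjectsDelegate:self
                                         queue:dispatch_get_main_queue()];
    // The QR type can only be requested after the output is attached
    // to the session (availableMetadataObjectTypes is empty before that).
    metadataOutput.metadataObjectTypes = @[AVMetadataObjectTypeQRCode];
}

// 2) The delegate callback: read out the decoded QR payloads.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection {
    for (AVMetadataObject *object in metadataObjects) {
        if ([object isKindOfClass:[AVMetadataMachineReadableCodeObject class]]) {
            NSString *payload =
                [(AVMetadataMachineReadableCodeObject *)object stringValue];
            NSLog(@"QR code: %@", payload);
        }
    }
}
```

The difficulty is not the AVFoundation side, it is getting at the session and registering the delegate through the openFrameworks wrappers, as discussed below.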
But I don’t see how to do this with all the wrapper layers above AVFoundationVideoGrabber.
So my question would be:
Is there a design pattern for overriding some core classes in just one specific project? Maybe some sort of cascading file system or similar?
If not, what is the best way to implement a new function in AVFoundationVideoGrabber so that I can reach it directly? Or do I have to expose such a function in all the layers above it (ofxiOSVideoGrabber, ofBaseVideoGrabber)? What’s the correct OF way to do something this specific only for iOS?
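One possible route, in case it helps others: recent openFrameworks versions let you downcast the generic backend of an ofVideoGrabber to the platform-specific grabber. A sketch, assuming a version of OF where `ofVideoGrabber::getGrabber<T>()` is available; `lockExposureAndFocus()` is the method mentioned earlier in this thread and would first have to be made public on the iOS grabber:

```cpp
#include "ofMain.h"
#include "ofxiOS.h"

ofVideoGrabber grabber;
grabber.setup(640, 480);

// Downcast the generic ofBaseVideoGrabber backend to the iOS-specific class.
// Returns an empty shared_ptr if the backend is a different type.
auto iosGrabber = grabber.getGrabber<ofxiOSVideoGrabber>();
if (iosGrabber) {
    // Hypothetical: requires exposing this method as public on
    // ofxiOSVideoGrabber (and forwarding it to AVFoundationVideoGrabber).
    iosGrabber->lockExposureAndFocus();
}
```

This keeps all the ofVideoGrabber goodness while still reaching the iOS-only functionality; the remaining work is forwarding the call from ofxiOSVideoGrabber down into AVFoundationVideoGrabber.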