overlaying the iPhone's camera

did you notice any significant framerate drop when you did this?

I was playing around with this exact same thing and my framerate went through the floor when I made these changes.

thanks for sharing!

we actually implemented something like this in the most recent imagePicker:


showing any sort of UI element will destroy your framerate on a pre-3GS phone. Usually when I use something like this I set my framerate super low (or to 1) and then call ofSetFrameRate(30) when the picture is actually taken.
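A minimal sketch of that trick, assuming an openFrameworks testApp and a hypothetical pictureTaken() callback that you wire up yourself when the photo comes back:

    // testApp.mm (Objective-C++) -- names are made up, just illustrating the idea
    void testApp::setup() {
        ofSetFrameRate(1);       // keep the GL draw loop nearly idle while the camera UI is up
        camera.openCamera();     // ofxiPhoneImagePicker instance
    }

    void testApp::pictureTaken() {   // call this from wherever you detect the new photo
        ofSetFrameRate(30);          // restore the normal draw loop
    }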

in the imagePicker:

I have an OverlayView that calls the takePicture: function when it gets tapped, and that's where you could set the framerate back to 30. If you want to programmatically take photos, probably the best thing to do would be to create another timer that calls a function in your openFrameworks app, and is killed when you tap the overlay (or kills itself, depending on its functionality). Totally non-intuitive, but running the openGL draw loop just murders the framerate.
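If you go the programmatic route, the timer idea could look roughly like this (the _shutterTimer ivar and the 3-second interval are made up for illustration; takePicture: is the method mentioned above):

    // hypothetical: fire takePicture: every few seconds until the overlay is tapped
    _shutterTimer = [NSTimer scheduledTimerWithTimeInterval:3.0
                                                     target:self
                                                   selector:@selector(takePicture:)
                                                   userInfo:nil
                                                    repeats:YES];

    // in the overlay's tap handler:
    [_shutterTimer invalidate];
    _shutterTimer = nil;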

hope that helps :confused:

@jonbro: i was testing on a 3GS, it wasn’t that bad

@zack: i’ll try the new code! next thing i want to try is to somehow work on the pixels of the preview without actually taking the photo. I have the feeling that will actually kill the performance :slight_smile:

cool :slight_smile:

easy enough!
just call camera.takePicture();

also i think jonbro was working on a way to grab the pixels straight from the preview without asking the phone to take a photo… in some thread around here

yeah, I will push my code soon; it is pretty much what you have here. The problem is that it is not actually grabbing the camera pixels, just the contents of the last frame… so if you want a clean frame, you need to turn off all your opengl first.

unfortunately, it doesn’t really run as fast as I was expecting it to… I am actually not sure it runs any faster than actually taking the pictures. Maybe one of you with a 3GS can tell me though.

I’ll experiment a bit.

I know that the “standard way” people have been using, with apple’s blessing, to access the camera’s preview is UIGetScreenImage, but that’s not exactly the best way if you want to work on the pixels in real time… well, we’ll see what we can do :slight_smile:


I have been using this for a few days and I have a few questions.

  1. Saving images doesn’t work. I did some googling and found this:


Does it help? It seems that the UIImageWriteToSavedPhotosAlbum call should reference self and not nil.
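For reference, the save call with a completion target usually looks something like this (the callback selector name is fixed by UIKit; where you put the method is up to your delegate class):

    // pass self and a callback selector instead of nil so errors get reported
    UIImageWriteToSavedPhotosAlbum(image, self,
        @selector(image:didFinishSavingWithError:contextInfo:), nil);

    // UIKit calls this back when the save finishes
    - (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error
        contextInfo:(void *)contextInfo {
        if (error) NSLog(@"saving failed: %@", error);
    }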

  2. I have been using the library instead of the camera, and the pictures that come back are always upside down. I am able to call getImageOrientation() to check the orientation, but I can’t find a way to invoke scaleAndRotateImage() to rotate the image when necessary. Any ideas?

  3. I have downloaded memo’s git repository that contains a compiled library. Are there instructions on how to compile that library file? I was thinking of making the change in 1. and trying it out myself, but I am an Xcode and openFrameworks newbie.

This has greatly simplified iPhone development for me. I got my app up and almost perfect in a few days. Thanks for the help.

I have been using the ofxiPhoneImagePicker and found that after taking three or four photos in succession the example app would crash. I was able to trace the problem back to the line in the openCamera function where the UIImagePickerController is initialized again; this is there to refresh the camera so that it doesn’t show the “retake” and “use” buttons when you open the camera again.

I tried setting _imagePicker.sourceType to NULL and then setting it back to UIImagePickerControllerSourceTypeCamera, and it worked: no more crashing and no need to re-init. Setting the source type to the library and then back to the camera also worked.


    - (bool) openCamera {
        if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
            //[_imagePicker init]; // needed this to refresh the camera
            _imagePicker.sourceType = NULL; // Lars: resetting the source type refreshes the camera instead
            _imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
            [[[UIApplication sharedApplication] keyWindow] addSubview:_imagePicker.view];
            return true;
        }
        return false;
    }

i never paid close attention to the size of the photos that the iPhone’s camera takes (it turns out to be 1536 x 2048).

however, i did notice that when i load an image into oF using ofxiPhoneImagePicker and set the max dimensions to 480, the image gets scaled to 360 x 480 - not 320 x 480, which is the size of the screen (in other words, never use the screen width to access pixels from the image ;-)).

which leads me to wonder why the photos aren’t taken at the same aspect ratio as the screen. any ideas?
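For what it's worth, the 360 x 480 number falls straight out of the scaling math: the picker shrinks the longer side to the max dimension and keeps the sensor's aspect ratio, which is 3:4, while the 320 x 480 screen is 2:3, so the two can never line up. In plain C terms (just illustrating the arithmetic):

    int w = 1536, h = 2048, maxDim = 480;
    float scale = (float)maxDim / h;   // 0.234375 -- scale the longer side down to 480
    int newW = (int)(w * scale);       // 1536 * 0.234375 = 360
    int newH = (int)(h * scale);       // 2048 * 0.234375 = 480
    // 360:480 is 3:4 (the sensor ratio); 320:480 is 2:3 (the screen), so the widths differ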