I want to save a screen image on iPhone.
The image file is created, but it is just a black screen.
Do you know what I need? Some initialization or something?

iOS 4.2
Xcode 3.2.4
openFrameworks 0062


ofImage Img;  


Img.grabScreen( 0, 0, ofGetWidth(), ofGetHeight() );  
Img.saveImage( ofxiPhoneGetDocumentsDirectory() + "test.png" );	  


Thanks for your reply. I checked, but it's the same as before…

Hmm, that’s strange. ofxiPhoneScreenGrab uses glReadPixels to grab the screen. Are you using a modified version of OF? Maybe you could post your code for others to look at?

Hi there,

Yes, ofxiPhoneScreenGrab() works for me.
Thanks astellato :slight_smile:

For the next step, I want to:

  • save screen data inside the .app (not in the photo album),
  • play with the saved screen pixels.

I tried ofxFBOTexture.getPixel(), ofImage.grabScreen(), ofTexture.loadScreenData(),
but none of them seems to work.

So I wrote a small function following astellato's advice,
named ofxiPhoneGrabScreenPix, which returns the raw screen pixels.

In ofxiPhoneExtras.h:

unsigned char* ofxiPhoneGrabScreenPix(id delegate);  


unsigned char* ofxiPhoneGrabScreenPix(id delegate) {
	int width = ofGetWidth();
	int height = ofGetHeight();
	NSInteger myDataLength = width * height * 4;
	unsigned char *buffer = (unsigned char *) malloc(myDataLength);
	unsigned char *bufferFlipped = (unsigned char *) malloc(myDataLength);
	glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
	// glReadPixels returns rows bottom-up; flip vertically so row 0 is the top
	for(int y = 0; y < height; y++) {
		for(int x = 0; x < width * 4; x++) {
			bufferFlipped[(height - 1 - y) * width * 4 + x] = buffer[y * 4 * width + x];
		}
	}
	free(buffer);	// free the original buffer; the caller owns bufferFlipped
	return bufferFlipped;
}

Then, in your testApp:

unsigned char * pix = ofxiPhoneGrabScreenPix(ofxiPhoneGetAppDelegate());
img.setFromPixels(pix, ofGetWidth(), ofGetHeight(), OF_IMAGE_COLOR_ALPHA); // copies the pixels
free(pix); // setFromPixels made a copy, so free the malloc'd buffer to avoid a leak

I know this is hacky (because I changed core oF),
so I'd appreciate any smarter workaround :slight_smile:

Here is a sample project.

I'm using the 007 iPhone + oF static lib grabbed from here.
Thank you Apex!


Your code works well on openFrameworks 0062, thanks for your sample.

It's better with a small modification, though, because I needed to tweak it for my setup.


anyway, thanks again!


Does anyone have a working example in 007? I'm using the code from the first post, but nothing happens: no error message and no image at all.


- rS

Did you try astellato's suggestion in the second post? I'm using 007 and it works for me: the screen is grabbed and saved to the photo album. FWIW, the code from the first post did not work for me in terms of saving to the photo album; I'm guessing it saves it elsewhere, if at all.

Not sure where to put that piece of code, any hints?

Never mind, I found my answer here.

It looks like ofImage.grabScreen() fails on the iPhone when you haven't allocated the image first.

myImage.allocate(ofGetWidth(), ofGetHeight(), OF_IMAGE_COLOR_ALPHA); // allocate before grabbing
myImage.grabScreen(0, 0, ofGetWidth(), ofGetHeight());
ofSetColor(255, 255, 255); // draw untinted
myImage.draw(0, 0);

Also, note that the image should be allocated using OF_IMAGE_COLOR_ALPHA.

Not sure if this is still a problem in 007.

This method does not work with anti-aliasing (multisampling) turned on.

I’ve been battling this, too. There are a few GL functions that aren’t supported in GLES with multisampling / antialiasing, namely glReadPixels. I’ve found a couple of leads on binding an additional framebuffer without multisampling, but haven’t had any success yet – only black frames.

If anyone has a solution let me know!
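For what it's worth, the standard route on iOS is Apple's GL_APPLE_framebuffer_multisample extension: glReadPixels can't read from a multisampled framebuffer directly, but you can resolve it into a plain single-sample FBO first and read from that. A sketch under the assumption that you've already created both FBOs (msaaFramebuffer and resolveFramebuffer are placeholder names; on ES1 the bind calls take the OES suffix):

```cpp
// resolve the multisampled FBO into the single-sample one
glBindFramebuffer(GL_READ_FRAMEBUFFER_APPLE, msaaFramebuffer);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER_APPLE, resolveFramebuffer);
glResolveMultisampleFramebufferAPPLE();

// glReadPixels is legal on the resolved (non-multisampled) framebuffer
glBindFramebuffer(GL_FRAMEBUFFER, resolveFramebuffer);
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
```

This is only a fragment; it has to run with an EAGL context current, after rendering but before presenting the renderbuffer.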


I also desperately need to take a snapshot of a multisampled renderer. After hours of searching the net for solutions, I give up.

Anyone lucky so far?

Here's what I tried:

I'm having the exact same issue, but I'm not worried about antialiasing. I have the app set up to use the retina display, but when I use this:

        ofxiPhoneAppDelegate * delegate = ofxiPhoneGetAppDelegate();    

it had been taking an image of just 1/4 of the entire screen for some reason. My quick fix was to go into the ofxiPhoneScreenGrab method and multiply the width/height ints by 2, and it seems to come out correctly now… but there must be a more elegant way than messing with the core.

Lines I modified in ofxiPhoneScreenGrab:

void ofxiPhoneScreenGrab(id delegate) {  
	CGRect rect = [[UIScreen mainScreen] bounds];  
	int width = rect.size.width*2; //CHA-CHA-CHA-CHANGESSS  
	int height =  rect.size.height*2; //CHA-CHA-CHA-CHANGESSS  
	NSInteger myDataLength = width * height * 4;  

Also, does anyone know what decides the orientation of the taken screenshot? All of mine keep coming out portrait, but I'm trying to get them to be landscape.

Are you on a retina screen? Maybe the height and width need to be x2 since you're on retina?

@Seth - right, my device does have a retina display, but I don't understand why it thinks it has a smaller window than it actually has. My GL setup is (iOSWindow, 960, 480, OF_FULLSCREEN), but I don't follow why it needs to be doubled in that screengrab method.

What happens if you put this in your main()?

ofAppiPhoneWindow * iOSWindow = new ofAppiPhoneWindow();
ofSetupOpenGL(iOSWindow, 480, 320, OF_FULLSCREEN);
ofRunApp(new testApp);

Edit: laserpilot, this is how I solved the resolution issue… note my comments. @JRW

Oops, I responded too quickly and didn't realize you also want to adjust the orientation…

void ofxiPhoneScreenGrab(id delegate) {
	CGRect rect = [[UIScreen mainScreen] bounds];
	// JRW - retina fix: UIScreen bounds are in points, so scale them up to pixels
	if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
		float f_scale = [[UIScreen mainScreen] scale];
		rect.size.width  *= f_scale;
		rect.size.height *= f_scale;
	}
	int width  = rect.size.width;
	int height = rect.size.height;
	// … rest of the function unchanged

Why not do it like this?

void ofxiPhoneScreenGrab(id delegate) {
	int width = ofGetWidth();
	int height = ofGetHeight();
	// … rest of the function unchanged

@jasonwalters, your solution works for me. Thanks a lot!