Image processing at Retina density

I’m using a Mac with a Retina screen. I’m trying to process a bunch of images using OpenGL shaders and save them.

I create a 100x100 window, load my image, run it through the OpenGL shader, and save it to disk using:

ofPixels img;
img.grabScreen(0, 0, ofGetWidth(), ofGetHeight());
ofSaveImage(img, "screenshot-" + to_string(frame) + ".png");

The output is a 100x100 px image, as expected. However, if I take a screenshot of the window manually, I get a 200x200 px image, which I understand is due to the Retina screen. Is there a way to get the upscaled 200x200 px data directly through openFrameworks code?
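One code-level workaround (my own suggestion, not confirmed by the thread): render the shader pass into an ofFbo allocated at twice the window size and read that back, so the saved image is 200x200 regardless of what the window framebuffer holds. `shader` and `tex` below are stand-ins for your own shader and source texture:

```cpp
// Sketch: draw into a 2x-sized offscreen FBO and save that,
// instead of grabbing the (1x) window framebuffer.
ofFbo fbo;
fbo.allocate(ofGetWidth() * 2, ofGetHeight() * 2, GL_RGBA);

fbo.begin();
ofClear(0, 0, 0, 255);
shader.begin();
tex.draw(0, 0, fbo.getWidth(), fbo.getHeight()); // draw at 2x scale
shader.end();
fbo.end();

ofPixels pix;
fbo.readToPixels(pix); // pull the 200x200 pixels back from the GPU
ofSaveImage(pix, "screenshot-" + to_string(frame) + ".png");
```

This keeps the on-screen window at 100x100 while the saved file is 200x200, and it works the same on non-Retina machines.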

Sounds like you have to set “High Resolution Capable” to YES in the app’s Info.plist file, listed in Xcode.
With that set, 100x100 px in OF is 100x100 px in your screenshot.
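For reference, the underlying Info.plist entry for “High Resolution Capable” (the key Xcode shows under that name) looks like this:

```xml
<key>NSHighResolutionCapable</key>
<true/>
```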
