Using an RGB fbo works fine on Desktop, but it breaks on iOS.
RGBA works fine on both, but I want to save the image as a video frame, and iOS expects RGB or ARGB data, not RGBA. It also needs to be fast, since I am saving frames as real-time video.
Ideally I would avoid the alpha channel and transfer 25% less data. Is that possible on iOS / GLES?
And if a shader outputs to an RGB fbo, do you still write a vec4 to gl_FragColor?
My question was not very clear…
This example works as long as I use RGBA, but with RGB the ofPixels come back pure black, even though the fbo looks fine when displayed.
It seems like readToPixels() produces black pixels when using RGB on iOS (no issue on desktop). Can anyone confirm this or suggest a workaround? If I could avoid the alpha channel I would save megabytes of RAM.
// fbo, pix and tex are members (ofFbo, ofPixels, ofTexture)
ofFboSettings s;
s.internalformat = GL_RGB; // >>> works if GL_RGBA
s.width = 400;
s.height = 400;
fbo.allocate(s);

pix.allocate(400, 400, OF_PIXELS_RGB); // >>> OF_PIXELS_RGBA
tex.allocate(400, 400, GL_RGB); // >>> GL_RGBA

fbo.begin();
ofDrawCircle(ofRandom(400), ofRandom(400), 10);
fbo.end();

fbo.readToPixels(pix); // pure black on iOS when everything above is RGB

// useless juggling, just for testing the issue
tex.loadData(pix.getData(), 400, 400, GL_RGB); // >>> GL_RGBA
tex.draw(0, 0);

ofDrawCircle(200, 200, 100);
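One workaround I am considering (just a sketch, plain C++ with std::vector standing in for ofPixels): keep the fbo and readToPixels in RGBA, which seems reliable on iOS, and strip the alpha on the CPU before handing the frame to the video encoder. I believe pix.setImageType(OF_IMAGE_COLOR) does this conversion in openFrameworks too, but here is the idea spelled out:

```cpp
#include <vector>
#include <cstddef>

// Drop the alpha channel from a tightly packed RGBA buffer,
// producing the tightly packed RGB buffer the encoder would consume.
std::vector<unsigned char> rgbaToRgb(const std::vector<unsigned char>& rgba) {
    std::vector<unsigned char> rgb;
    rgb.reserve(rgba.size() / 4 * 3);
    for (std::size_t i = 0; i + 3 < rgba.size(); i += 4) {
        rgb.push_back(rgba[i]);     // R
        rgb.push_back(rgba[i + 1]); // G
        rgb.push_back(rgba[i + 2]); // B
        // rgba[i + 3] (alpha) is discarded
    }
    return rgb;
}
```

This costs an extra CPU pass per frame, so it does not save the GPU-side transfer, only the RAM and encoder-side bandwidth.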