I’m jumping back and forth between the openFrameworks shaders tutorial and the GLSL wiki in order to get started with shaders. I’m running into some trouble with ofTexture, though - specifically with the texture getting flipped vertically. I’m trying to grab the screen (post-shader) and save it to an ofTexture, which then gets fed into the shader on the next draw() call. (For context, it’s just Conway’s Game of Life run on the GPU, so the ofTexture contains the current step of the simulation.)
I’ve stripped my code down and found that the issue comes from calling loadScreenData(...) after bind(). So, if I have an ofTexture tex and an ofMesh quad that are set up like this:
img.loadImage("cat.jpg"); // This is just a test image
tex = img.getTextureReference();
int w = ofGetWidth();
int h = ofGetHeight();
// Mesh that covers the screen
quad.setMode(OF_PRIMITIVE_TRIANGLES);
quad.addVertex(ofVec2f(0, 0));
quad.addVertex(ofVec2f(w, 0));
quad.addVertex(ofVec2f(w, h));
quad.addVertex(ofVec2f(0, h));
ofIndexType indices[6] = {0, 1, 2, 2, 0, 3};
quad.addIndices(indices, 6);
quad.addTexCoord(ofVec2f(0, 0));
quad.addTexCoord(ofVec2f(w, 0));
quad.addTexCoord(ofVec2f(w, h));
quad.addTexCoord(ofVec2f(0, h));
And I have a draw() that looks like this:
tex.bind();
quad.draw();
tex.unbind();
tex.loadScreenData(0, 0, tex.getWidth(), tex.getHeight());
I end up with the texture being flipped vertically on every frame. I’m guessing there’s a mismatch between the coordinate systems used when I draw quad and when I call loadScreenData(...)?
System info: Windows 7 (x64), Code::Blocks 12.11, openFrameworks 0.8.0, OpenGL supported up through 4.3