Texture Map a Rectangle

void ofApp::setup(){
  img.loadImage("texture.png"); // img is an ofImage member declared in ofApp.h; any image in bin/data
}

void ofApp::draw(){
  ofBackground(50, 50, 50);

  img.bind();               // bind the image's texture...
  ofRect(0, 0, 100, 100);   // ...but the rectangle stays untextured
  img.unbind();

  img.draw(100, 0, 100, 100); // the image itself draws fine here
}

The image appears where I call img.draw(), but nothing shows up in the drawn rectangle. I feel like I’ve tried every combination of ofEnableArbTex(), ofEnableNormalizedTexCoords(), and every other texture function I could find… so what am I doing wrong?


I couldn’t get it to work, but I think that you can only bind a texture to a mesh, and not a primitive.

ofRect has no texture coordinates; it’s just a rectangle. You can use ofMesh instead, like this:

ofMesh mesh;
mesh.setMode(OF_PRIMITIVE_TRIANGLE_STRIP);

mesh.addVertex( ofPoint(x,y) );
mesh.addTexCoord( myTexture.getCoordFromPercent(0,0) );

mesh.addVertex( ofPoint(x+w,y) );
mesh.addTexCoord( myTexture.getCoordFromPercent(1,0) );

mesh.addVertex( ofPoint(x,y+h) );
mesh.addTexCoord( myTexture.getCoordFromPercent(0,1) );

mesh.addVertex( ofPoint(x+w,y+h) );
mesh.addTexCoord( myTexture.getCoordFromPercent(1,1) );

myTexture.bind();
mesh.draw();
myTexture.unbind();


note the texture coordinates are different depending on whether you are using a power-of-two texture (normalized 0–1 coordinates) or a rectangular (ARB) texture (pixel coordinates). I usually use getCoordFromPercent to help; you pass in a percentage and it returns the right kind of coordinates.

hope this helps,

Does ofShader not need texture coordinates? In the past I was able to use


and got the result I expected, so I imagined that bind() worked exactly the same.



if you pass the texture to the shader (via a uniform) you can skip the bind / unbind; without that, I think opengl won’t know which texture is active / etc.
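Something along these lines should work (a sketch, not tested here; shader, img, and the uniform name tex0 are illustrative, and the fragment shader is assumed to declare a matching sampler uniform):

```cpp
// inside ofApp::draw(), assuming an ofShader "shader" and an ofImage "img"
// were loaded in setup():
shader.begin();
// hand the texture to the shader on texture unit 0; the fragment shader
// samples it via "uniform sampler2DRect tex0;" (or sampler2D for non-ARB
// textures). in older OF versions use img.getTextureReference() instead.
shader.setUniformTexture("tex0", img.getTexture(), 0);
ofRect(0, 0, 100, 100);
shader.end();
```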

Oh, yeah – I meant applying the shader’s output to vertices (like the ofRect). Every time I’ve used a shader, I’ve just drawn an ofRect after shader.begin() and everything worked as expected, so I assumed ofShader has something clever built in so I don’t have to supply texture coordinates manually?

oh – well, a fragment shader affects everything that gets drawn; don’t think about it like a texture. fragment shaders run on every single pixel as it’s heading towards the screen. texturing is one of the things opengl does during rasterization, and it can happen either by specifying texture coordinates (as per above) or in a shader by passing the texture through as a uniform. if neither of those is done, there’s no way to see an image across geometry.

if you do it in the shader, then you can computationally manipulate the pixels from the texture, for example to perform a blur or some other operation. on the other hand, fragment shaders, if they are turned on, run across every single thing that is drawn, even if it has no texture coordinates, etc. I hope that makes sense / helps…

Huh! That’s true, though for some reason that didn’t click.

Thanks so much Zach; I appreciate you walking me through this :)