How to correctly pass a texture to a vertex shader

So, I've been playing around with the vboMeshDrawInstanceExample from the “gl” folder.

I'm a complete noob when it comes to vertex shaders.

I'm looking to extrude the Z value based on a texture that I pass in.

If you look at the code:

// extrude our primitive along the z coordinate, based on the current pixelDepth.
vPos.z = (vPos.z - 5.0) * pixelDepth * 500.0 + t2.r * 5000.0;
// this will pull our vertices apart by eight times their original coordinate values,
// then move them by a snoise value which is the same for every primitive.
// simplex noise makes things look a little more organic.
vPos.x = vPos.x * 8.0;
vPos.y = vPos.y * 8.0 + snoise(vec2(1.0 - instanceX, instanceY)) * 0.0;
// this will distribute our boxes in space,

You can see that the snoise function is being used to add the different Z values to the boxes.
I want the same effect, but using a texture that I input (so I can drive it from the Kinect or other sensors).

Is it possible? Is there a correct way to map a texture onto the individual instances?

Hi @Julian_Puppo,

I believe you may have missed it:
On line 33, the depth texture is declared in the vertex shader:

uniform	sampler2D tex0;			// we use this to sample depth data for our boxes.

And on line 104 in ofApp.cpp::draw():

	mShdInstanced->setUniformTexture("tex0", mTexDepth, 0);

This is how you send a texture to a shader.
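For completeness, here is a minimal sketch of how the vertex shader can then read that texture per instance. This is not the example's exact code: the 128 × 128 grid size and the way instanceX/instanceY are derived from gl_InstanceID are assumptions based on how the example lays out its boxes.

```glsl
// vertex shader sketch (GLSL): sample the bound depth texture per instance.
uniform sampler2D tex0;   // depth texture bound via setUniformTexture("tex0", ...)

void main()
{
    // assumed: instances form a 128 x 128 grid, addressed by gl_InstanceID
    float instanceX = mod(float(gl_InstanceID), 128.0) / 128.0;
    float instanceY = floor(float(gl_InstanceID) / 128.0) / 128.0;

    // sample the depth texture at this instance's normalized position
    float pixelDepth = texture2D(tex0, vec2(instanceX, instanceY)).r;

    // ... then use pixelDepth to displace the vertex along z ...
}
```

Note that with a regular sampler2D the coordinates are normalized (0.0–1.0); with a sampler2DRect they would be in pixels instead.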

Hope this helps,



Sorry for not posting the full code; I was already trying what you suggested, without success.

This is the OF part where the shader is processed:

// we don't care about alpha blending in this example, and by default alpha blending is on in openFrameworks > 0.8.0
// so we de-activate it for now.
ofDisableAlphaBlending();
ofBackgroundGradient(ofColor(18,33,54), ofColor(18,22,28));
// bind the shader
mShdInstanced->begin();
// give the shader access to our textures
mShdInstanced->setUniformTexture("tex0", mTexDepth, 0);
mShdInstanced->setUniformTexture("tex1", spoutReceiver.getTexture(), 1);
mShdInstanced->setUniform2f("resolution", ofGetWidth(), ofGetHeight());
// feed the shader a normalized float value that changes over time, to animate things a little
mShdInstanced->setUniform1f("timeValue", (ofGetElapsedTimeMillis() % 30000) / 30000.0f);
// we only want to see triangles facing the camera.
// let's draw 128 * 128 == 16384 boxes!
mVboBox.drawInstanced(OF_MESH_FILL, 128*128);
mShdInstanced->end();
ofDrawBitmapString("Use mouse to move camera.\nPress 'f' to toggle fullscreen;\nSPACEBAR to reload shader.", 10, 20);

I have experience working with samplers, but in fragment shaders.

This is how I do it in a fragment shader:

vec4 t1 =  texture2DRect(textura1, gl_FragCoord.xy);

But as seems obvious, the way to correctly map the texture's X and Y coordinates in a vertex shader differs from how it's done in a fragment shader.

I'm not getting any errors, BUT it seems like “t2”, the vec4 where I store the sampled texture value, doesn't have any information in it, as if everything were 0.
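One difference worth spelling out: gl_FragCoord does not exist in a vertex shader, so the pixel coordinates that texture2DRect expects have to be built by hand. A sketch of what that could look like, where the per-instance grid position and the 128 × 128 texture size are assumptions:

```glsl
// vertex shader sketch (GLSL): texture2DRect takes PIXEL coordinates,
// so build them from the instance index instead of gl_FragCoord.
uniform sampler2DRect tex1;   // assumed to be a 128x128 rect texture

void main()
{
    // hypothetical per-instance grid position, 0..127 on each axis
    float instanceX = mod(float(gl_InstanceID), 128.0);
    float instanceY = floor(float(gl_InstanceID) / 128.0);

    // sample at pixel coordinates, not normalized ones
    vec4 t2 = texture2DRect(tex1, vec2(instanceX, instanceY));
    // ... use t2.r to displace the vertex ...
}
```

If a rect texture is sampled with normalized 0.0–1.0 coordinates by mistake, everything lands in the first pixel, which can look exactly like “t2 is all zeros”.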

Hi @Julian_Puppo,

Ah, did you try using textures of the same size?
Maybe you could try mapping textures that are the same size to see if it helps.

I reckon your problem could also come from the fact that sampling with texture2DRect has to be done relative to the size of the texture (it takes pixel coordinates, not normalized ones).

In the vboMeshDrawInstanceExample, the depth image size is 128 × 128.
Therefore there is a mod to make sure the texture boundaries are respected.
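Something along these lines in the vertex shader (a sketch, where 128.0 is the assumed depth-texture size and instanceX/instanceY are the per-instance pixel coordinates):

```glsl
// wrap the per-instance coordinates so they stay inside the 128x128 depth texture
float texX = mod(instanceX, 128.0);
float texY = mod(instanceY, 128.0);
vec4 depthSample = texture2DRect(tex0, vec2(texX, texY));
```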

Hope this helps, although I'm not sure where your problem comes from.