Getting int32 into shader

Hi.

I have an array of 32-bit integers that I want to use to draw objects (one object per value).

I basically do it like in the textureBufferInstancedExample.
Instead of using GL_RGBA32F as the texture format I use GL_RGBA8 and reassemble the 32-bit value from the RGBA components.

Here is the vertex shader I use:

#version 150

uniform mat4 modelViewProjectionMatrix;
in vec4 position;
uniform samplerBuffer tex;
out vec4 color;

void main(){
    int id = gl_InstanceID;
    
    vec4 val = texelFetch(tex, id);
    
    int r = int(val.x * 255.0);
    int g = int(val.y * 255.0);
    int b = int(val.z * 255.0);
    int a = int(val.w * 255.0);

    // reassemble the 32-bit value from the four bytes
    float x = float((a<<24) + (b<<16) + (g<<8) + r);
    
    vec4 vPos = position;
    vPos.x += x;
    
    color = vec4(1.0, 1.0, 0.5, 1.0);
    gl_Position = modelViewProjectionMatrix * vPos;
}

That looks too complicated.

I tried to use GL_LUMINANCE32UI_EXT, but then I don't get any values:

vec4 val = texelFetch(tex, id);

int r = val.x;

How would I get those 32-bit integers into the shader effectively?

Thanks for any advice.
cheers
inx

GL_LUMINANCE doesn't exist anymore in OpenGL 3+. GL_R32I for int32_t or GL_R32UI for uint32_t should work.

Ah, ok. That explains why it would not work with GL_LUMINANCE.

Using GL_R32UI gives me:

[ error ] ofGLUtils: ofGetGLFormatFromInternal(): unknown internal format 33334, returning GL_RGBA
[ error ] ofGLUtils: ofGetGlTypeFromInternal(): unknown internal format 33334, returning GL_UNSIGNED_BYTE

So I added the format in ofGetGLFormatFromInternal so that it returns GL_RED for GL_R32UI, and made ofGetGlTypeFromInternal return GL_UNSIGNED_INT for GL_R32UI.

I'm unsure how to get the value out of the texture…
Reading it like this:

vec4 val = texelFetch(tex, id); // <-- gives me vector of floats
uint r = uint(val.x); // <-- need to scale?? - val.x is 0

…gives me a value of 0

I haven't tried to use integer textures in OpenGL, but from what I found by googling a bit it seems you don't need to scale the values. How are you loading the data?

Also, I've fixed the problem with the format in master.

Nice. :smile:

Allocating the buffer and texture like this (distValues is the vector of uint32_t values I want to draw with):

// instance variables
ofBufferObject pixelBufferDist;
ofTexture texDist;
vector<uint32_t> distValues; // one 32-bit value per object

// later, in setup
pixelBufferDist.allocate();
pixelBufferDist.bind(GL_TEXTURE_BUFFER);
pixelBufferDist.setData(distValues, GL_STREAM_DRAW);

texDist.allocateAsBufferTexture(pixelBufferDist, GL_R32UI);
                
// set texture
shader.begin();
shader.setUniformTexture("texDist", texDist, 0);    
shader.end();

Updating the data like this:

pixelBufferDist.updateData(0, distValues);

I think the problem is that you need to use usamplerBuffer instead of samplerBuffer for the buffer texture in the shader.
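Something like this should work as the vertex shader (just an untested sketch, keeping the uniform and attribute names from your first shader). The important part is that texelFetch on a usamplerBuffer returns a uvec4, so no scaling is needed:

#version 150

uniform mat4 modelViewProjectionMatrix;
uniform usamplerBuffer tex; // usamplerBuffer for GL_R32UI (isamplerBuffer for GL_R32I)
in vec4 position;
out vec4 color;

void main(){
    // reading an integer buffer texture returns a uvec4, no scaling or byte packing needed
    uint x = texelFetch(tex, gl_InstanceID).r;

    vec4 vPos = position;
    vPos.x += float(x);

    color = vec4(1.0, 1.0, 0.5, 1.0);
    gl_Position = modelViewProjectionMatrix * vPos;
}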

This was an issue on OSX and Windows. It is fixed and should not be a problem anymore with current master.
Thanks arturo.